Attempting to Download Substructures for Coronal P56 Mouse Atlas

Hello,

I am a grad student and I have been struggling to use the Reference Space API correctly with the Coronal Atlas of the P56 Mouse Nissl Stains dataset. I have been working from the 'reference_space.ipynb' example notebook to try to download only substructures of the atlas (e.g. Isocortex or Primary Motor Cortex; link to file: http://alleninstitute.github.io/AllenSDK/_static/examples/nb/reference_space.html).

The reference files seem to be structured so that all three atlases for Atlas ID 602630314 (3D coronal average template), 1 (Coronal), and 2 (Sagittal) are grouped into the same reference space (indicated by 1 in this case). So far I have only been able to download 10x and 100x downsampled reference atlases for the 3D coronal atlas. My first question is whether there is a way to get the reference atlases in the reference space at full resolution? There seem to be only 10, 25, 50, and 100 micron options, all of which are downsampled, while the original images are roughly 8K x 11.4K pixels.

From the API documentation, it seems that I need to know the name of the nrrd file corresponding to the atlases for the Atlas ID 1 dataset (potentially something like P56_Mouse_annotation_XX.nrrd, where 'XX' might be the micron resolution). I have found a document about the 3D atlas release (http://download.alleninstitute.org/informatics-archive/october-2014/mouse_expression/Accessing_October_2014_expression_data.pdf) but have had no luck importing the file into an IPython environment. I am able to download all of the Nissl images and full atlas images in JPG form, but I am interested in only a subsection of the atlases (and, ideally, an easy way to navigate between different structures in Python). My next best alternative would be to use the atlases as a stencil for the intended ROIs, but hopefully there is a better solution!

Thank you for your time and help in this matter. I look forward to your response!

Sincerely,
TJ LaGrow

Hi TJ,
To answer your first question, the highest resolution version of the reference space and annotation that exists is 10 um. You are correct that the original images are higher resolution (in x and y), but the template brain was built from the deformed, registered image data, which exists only in "voxel" space at 10 x 10 x 10 um resolution or lower.

A potential source of confusion is the fact that we have two older P56 mouse brain atlases that were drawn in 2D, plus the newer 3D atlas. Atlas ID 602630314 refers to the 3D atlas, which you can computationally section in any plane. Atlas IDs 1 and 2 are also based on the P56 mouse brain, but were drawn in 2D in the coronal and sagittal planes, respectively. This page has additional information about our atlas drawings and ontologies: Atlas Drawings and Ontologies - Allen Brain Atlas API.

You can load the 3D atlas into Python and access the annotated structures with the following code snippet:

from allensdk.core.mouse_connectivity_cache import MouseConnectivityCache

# downloaded files will be cached relative to this manifest
manifest_file = '.../connectivity/mouse_connectivity_manifest.json'
resolution = 10
mcc = MouseConnectivityCache(manifest_file=manifest_file, resolution=resolution)
reference_space = mcc.get_reference_space()

Retrieve a single section from the reference space (this method wants an axis and a position in microns: axis 0 = coronal, 1 = axial, 2 = sagittal):

slice_image = reference_space.get_slice_image(0, 7500)

View a coronal section from the reference space:

import matplotlib.pyplot as plt
plt.imshow(slice_image)

View a structure mask for primary visual cortex (VISp):

visp_mask, _ = mcc.get_structure_mask(385)
plt.imshow(visp_mask[:, :, 125].T, interpolation='none')

The code above is taken from a Jupyter notebook that may be useful for you: https://github.com/AllenInstitute/SfN_2017_Connectivity/blob/master/Connectivity_workshop_solutions.ipynb.
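Since you specifically asked about substructures such as Isocortex or Primary Motor Cortex, here is a rough sketch of how you could look structures up by name and build masks for them. This is not from the notebook, and the exact ontology names I use below are assumptions on my part, so please verify them against the structure tree:

from allensdk.core.mouse_connectivity_cache import MouseConnectivityCache
import matplotlib.pyplot as plt

mcc = MouseConnectivityCache(manifest_file='.../connectivity/mouse_connectivity_manifest.json', resolution=10)
structure_tree = mcc.get_structure_tree()

# look up structures by their full ontology names (names assumed; check the ontology)
isocortex, motor = structure_tree.get_structures_by_name(['Isocortex', 'Primary motor area'])

# get_structure_mask returns a volume that is 1 inside the structure (and all of its
# descendants) and 0 elsewhere, in the same voxel space as the reference space above
isocortex_mask, _ = mcc.get_structure_mask(isocortex['id'])
motor_mask, _ = mcc.get_structure_mask(motor['id'])

# coronal slice through the motor mask at 7500 um (index 750 at 10 um resolution)
plt.imshow(motor_mask[750, :, :], interpolation='none')
plt.show()

Because the masks live in the same voxel space as the template and annotation volumes, you can use them to index directly into those volumes and pull out just the subregion you care about.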

If you want to work with the older, 2D atlas, you can download it using this code (this will take a little while to download if you need 10 um resolution):

from allensdk.api.queries.mouse_connectivity_api import MouseConnectivityApi

mcapi = MouseConnectivityApi()
reference_space = mcapi.download_annotation_volume(ccf_version='annotation/ccf_2016', resolution=10, file_name='.../connectivity/annotation_2016.nrrd')

The code for the download_annotation_volume method is here in case you need to refer to it: https://github.com/AllenInstitute/AllenSDK/blob/master/allensdk/api/queries/reference_space_api.py
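If you go this route and still want only a substructure, a minimal sketch would be to read the saved nrrd back in and select the voxels whose annotation IDs fall under that structure. This assumes the pynrrd and numpy packages are installed, and the 315 ID is my recollection of the Isocortex structure ID, so please double-check it against the ontology:

import numpy as np
import nrrd
from allensdk.core.mouse_connectivity_cache import MouseConnectivityCache

# the annotation volume stores a structure ID in every voxel
annotation, header = nrrd.read('.../connectivity/annotation_2016.nrrd')

# 315 is (I believe) the Isocortex ID; gather it together with all of its descendants
mcc = MouseConnectivityCache(manifest_file='.../connectivity/mouse_connectivity_manifest.json')
structure_tree = mcc.get_structure_tree()
isocortex_ids = structure_tree.descendant_ids([315])[0]

# mask the voxels whose annotation ID falls anywhere under Isocortex
isocortex_mask = np.isin(annotation, isocortex_ids)
print(isocortex_mask.sum(), 'Isocortex voxels')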

Hopefully this will get you started, but feel free to follow up with more questions. It may help to give a little more detail about how you are trying to use the downloaded atlas so I can think about the most efficient way for you to get the data you need.

Good luck!
Jennifer


Another approach is to access mouse reference atlas structure annotation meshes by downloading the structure_meshes (.OBJ files) here:
http://download.alleninstitute.org/informatics-archive/current-release/mouse_ccf/annotation/ccf_2017/structure_meshes/

These kinds of files can even be used for creative applications like 3D printing your own CCF brain!
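If you want to take a quick look at one of these meshes in Python, here is a minimal sketch, assuming the trimesh package and assuming the files are named by structure ID (so 997.obj, the root structure, should approximate the whole-brain surface; please confirm against the directory listing):

import trimesh

# 997 is the root (whole brain) structure in the adult mouse ontology,
# so 997.obj should approximate the brain surface
mesh = trimesh.load('997.obj')
print(len(mesh.vertices), 'vertices,', len(mesh.faces), 'faces')
mesh.show()

trimesh is just one option here; any OBJ reader (or a 3D tool like Blender or MeshLab) will open these files.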

Hi, Jennifer. What you described in your last reply led me to this thread while searching for some basic information. I am looking for meshes that represent the cortical surface and the head surface for the 3D mouse atlas. I don't need the detailed contents of the atlas, just surfaces for now. We want to provide a scalp recording electrode array that is registered to the atlas so that our collaborators can build volume conductor models that bridge the gap between the scalp-recorded signals and the atlas. I see the OBJ files, so I suppose I could download and assemble all of those, but that is a lot more detail than we need at this stage. Any suggestions?

I have a related question. I am searching for some basic information on the reference atlases. I would like data acquisition and image metadata for the Nissl-stained sections uploaded for P7, P28, and P56. What was the section thickness (I have read 20 microns in some places and 25 in others)? What is the separation between sections? And are all data downsampled to 1 pixel per micron? I was hoping to use the data as an independent sample using Cavalieri estimation.

A related question is whether it is possible to read out volume information from the 3D data viewer.
Thanks.

Is it possible to download the raw images for atlases 602630314, 1, and 2 as well?

I tried using this code, but the images and the annotation shapes are completely different.


import matplotlib.pyplot as plt
from matplotlib.image import imread

from allensdk.api.queries.image_download_api import ImageDownloadApi

image_api = ImageDownloadApi()

# first attempt: download an atlas image directly
atlas_image_id = 1
downsample = 1
annotation = False
image_api.download_atlas_image(atlas_image_id, annotation=annotation, downsample=downsample)

# second attempt: query for the atlas images belonging to atlas 1 (Coronal)
# atlas_image_id = 112282603
# downsample = 6
# file_path = '112282603_nissl.jpg'
atlas_id = 1
atlas_images = image_api.atlas_image_query(atlas_id)

atlas_image_id = atlas_images[0]["id"]
# atlas_image_id = 100960392
# atlas_image_id = 100960520
downsample = 1
file_path = f'{atlas_image_id}.jpg'
annotation = False

image_api.download_atlas_image(atlas_image_id, file_path, annotation=annotation, downsample=downsample)

# read the downloaded jpg back in and display it
def verify_image(file_path, figsize=(18, 22)):
    image = imread(file_path)
    fig, ax = plt.subplots(figsize=figsize)
    ax.imshow(image)
    return image

image = verify_image(file_path)

print(f"xyz: {image.shape[0:2]}, {len(atlas_images)}")