Allen Human Reference Atlas - 3D, 2020 (new!)

ALLEN HUMAN REFERENCE ATLAS - 3D, 2020

Version 1.0.0

The Allen Human Reference Atlas - 3D is a parcellation of the adult human brain in 3D, labeling every voxel with one of 141 brain structures. The parcellations were drawn by Song-Lin Ding and adapted from his prior 2D atlas of the adult human brain.

These parcellations were drawn on the MRI reference brain volume “ICBM 2009b Nonlinear Symmetric”, a non-linear average of the MNI152 database of 152 normal brain images. The iterative averaging procedure yields a volume that combines high spatial resolution with a high signal-to-noise ratio and is not subject to the particularities of any single brain. To obtain the reference volume, please refer to the McConnell Brain Imaging Centre website for download and terms of use (BIC - The McConnell Brain Imaging Centre: ICBM 152 N Lin 2009); Copyright (C) 1993–2004 Louis Collins, McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University.


A 2D sagittal cross-section overlaid on MNI space, and 3D medial and lateral renderings of the new human parcellation volume, visualized using the ITK-SNAP application.

USAGE

Example python scripts have been included to demonstrate:

  • how to download structure information and ontology from the Allen Brain API
  • use of the annotation volume in context of a hierarchical ontology
  • creation of ITK-SNAP compatible files to visualize the parcellation in 2D and 3D
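As a head start before opening the bundled scripts, the first bullet can be sketched with nothing beyond the standard library. The sketch below targets ontology graph 16 (the graph this atlas uses, as discussed later in this thread); the exact RMA query options are an assumption based on the public Allen Brain API query syntax, not a copy of the bundled examples.

```python
import json
import urllib.request

# RMA query for all structures in ontology graph 16 (the graph used by
# this atlas, per the discussion later in this thread).
API_URL = (
    "http://api.brain-map.org/api/v2/data/query.json"
    "?criteria=model::Structure"
    ",rma::criteria,[graph_id$eq16]"
    ",rma::options[num_rows$eqall]"
)

def parse_structures(payload):
    """Turn an API response dict into an id -> structure-record lookup."""
    if not payload.get("success"):
        raise RuntimeError("Allen Brain API query failed")
    return {s["id"]: s for s in payload["msg"]}

def fetch_structures(url=API_URL):
    """Download all structure records (id, acronym, name, parent, ...)."""
    with urllib.request.urlopen(url) as response:
        return parse_structures(json.load(response))
```

With the id-to-record lookup in hand, each record's parent structure field lets you walk the hierarchical ontology from the second bullet; see the bundled examples for the authoritative version.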

TERMS OF USE

These materials are provided under the Creative Commons Attribution 4.0 International (CC BY 4.0) license as of Sept. 1, 2022, available at https://creativecommons.org/licenses/by/4.0/.

CITATION

Citation of these materials should conform to the Allen Institute Citation Policy.

Allen Human Reference Atlas - 3D, 2020 Citation information:

Resource Name: Allen Human Reference Atlas – 3D, 2020
Version: 1.0.0
Research Resource Identifier (RRID): RRID:SCR_017764
Copyright notice: © 2019 Allen Institute for Brain Science
Dataset citation: Song‐Lin Ding, Joshua J. Royall, Susan M. Sunkin, Benjamin A.C. Facer, Phil Lesnar, Amy Bernard, Lydia Ng, Ed S. Lein (2020). “Allen Human Reference Atlas – 3D, 2020," RRID:SCR_017764, version 1.0.0.
Available from: http://download.alleninstitute.org/informatics-archive/allen_human_reference_atlas_3d_2020/version_1/

The anatomic structure ontology adapted for use in the 3D atlas was based on the 2016 version of the 2D atlas, published in final form in the Journal of Comparative Neurology (see citation below).

Publication Citation for structure ontology: Ding, S.L., Royall, J.J., […], Lein, E.S. Comprehensive cellular‐resolution atlas of the adult human brain. Journal of Comparative Neurology, Volume 524, Issue 16, pages 3127–3481, 1 November 2016, DOI 10.1002/cne.24080.

SUPPORT

These materials are provided as-is, without direct support. Community discussion around this resource is available at https://community.brain-map.org/.

ACKNOWLEDGEMENTS

Creation of the 3D parcellation volume was supported by the Allen Institute for Brain Science, and by the National Institute of Mental Health under Award Number 1U01MH114812-01 (PI: Ed Lein, Allen Institute for Brain Science).

The 2D atlas and anatomic structural ontology upon which it was based was supported by the Allen Institute for Brain Science, and by the National Institute of Mental Health under Award Number RC2MH089921 (PIs: Ed Lein & Michael Hawrylycz, Allen Institute for Brain Science).


That’s awesome!

Could you share a link to the example Python scripts, please? I don’t seem to be able to find them on the API and SDK support pages.

Thank you,
Federico

Hi Federico, thanks for your interest!
The scripts are in the download directory at the moment - check the “examples” folder:

http://download.alleninstitute.org/informatics-archive/allen_human_reference_atlas_3d_2020/version_1/

Hi Carol. my name is Gaston Zanitti. I am a PhD student/researcher working for the “Institut national de recherche en sciences et technologies du numérique” (INRIA) of France.

This work looks very interesting! I would like to use this information in a project that we are developing, but I would like to know if there is a way to relate the voxels with the refinements of the regions that the ontology offers. As far as I could see in the examples, it’s only possible to relate 140 regions per hemisphere.

Is there any way to distinguish the sub-regions of the ontology? Or did you make some other work in this direction?

Thank you very much!

Hi Gaston,

Hope you are safe and well, and thank you for your question!

This release represents a very deep anatomical dive relative to the resolution of this template volume (0.5 mm/voxel). Further delineation is in theory possible, but the returns would be starkly diminishing. Averaging artifacts, modest intrinsic anatomical contrast, and a lack of directly registered supporting data are among the reasons our natural inclination to delineate deeper was tempered. Even reaching our current totals was a challenge, after an original survey suggested a maximum of roughly 100 structures.

To meaningfully relate sub-regions of the ontology not present in the new 3D atlas, we suggest using the interactive adult human reference atlas in parallel. It carries all the classic features of a high-resolution atlas (and more) while employing the same ontology: http://atlas.brain-map.org/atlas?atlas=138322605#atlas=138322605. If there is a particular area, region, or analysis you had in mind that requires consultation, please let us know.

Josh

Are there files for the other hemisphere as well?

Best,
Jonah

A double-sided version of the annotation is now available for download.

http://download.alleninstitute.org/informatics-archive/allen_human_reference_atlas_3d_2020/version_1/


Hi,
is there a specific reason why the structure IDs refer to the developing brain atlas ontology (graph id 16) and not to the adult human ontology (graph id 10)? Is there an easy liftover from graph 16 to graph 10 ids, or can we just match acronyms/names?
Thanks!

Hi @gdagstn,

The BrainSpan reference atlases are a cohesive set of references across multiple developmental timepoints, including adult, built using multi-modal data under the guidance of Drs. Song-Lin Ding and Gulgun Sengul. This is the same annotation framework that Dr. Song-Lin Ding carried over to the 3D atlas. See the whitepaper for details.

The “adult human ontology” is associated with the Allen Human Brain Atlas (a project which preceded the BrainSpan project). The “adult human ontology” was developed by Dr. Angela Guillozet-Bongaarts to be a practical neuroanatomical guide for sampling and was not intended to be viewed as a definitive reference atlas.

Is there an easy liftover from graph 16 to graph 10 ids, or can we just match acronyms/names?

In cases like this, I feel the best practice is to abide by each anatomist’s intent.
Both ontologies were annotated on the same set of Nissl sections, which supports this kind of analysis, so it is possible to compare the SVGs to see what overlaps. Note that you are likely to find 1-to-N relationships, and also structures from one ontology only partially overlapping structures of the other.

  • “Human, 34 years, Cortex - Gyral” 138322605
  • “Human, 34 years, Cortex - Mod. Brodmann” 265297126
  • “Human Brain Atlas Guide” 265297125
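Given those caveats, an acronym-level comparison is at best a first pass. The helper below is a hypothetical sketch (not an official mapping): it pairs structure records from two ontology graphs by shared acronym, so the 1-to-N cases become visible for manual review against the atlas viewers above.

```python
from collections import defaultdict

def acronym_overlap(graph_a, graph_b):
    """Pair structure records from two ontology graphs by shared acronym.

    Each input is a list of plain dicts as returned by the Allen Brain
    API, with at least an 'acronym' key. Returns a dict mapping each
    acronym present in BOTH graphs to (records_from_a, records_from_b),
    so 1-to-N matches show up as lists longer than one.
    """
    by_acr_a = defaultdict(list)
    for s in graph_a:
        by_acr_a[s["acronym"]].append(s)
    by_acr_b = defaultdict(list)
    for s in graph_b:
        by_acr_b[s["acronym"]].append(s)
    shared = set(by_acr_a) & set(by_acr_b)
    return {acr: (by_acr_a[acr], by_acr_b[acr]) for acr in shared}
```

Acronym equality does not guarantee anatomical equivalence, so any match this produces still needs checking against the Nissl annotations.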

Hello,
Thank you so much for sharing this.
The python script is beautiful!

Do you think that a Nifti version (similar to the Gyral) but for the Brodmann segmentation is possible?
Even at a lower resolution?
Honestly, it would be amazing for the scientific community!

Clément

Hello,

Does anyone know where to get the anatomical names for each region? When I download the STL files, they have names like:
oldb0_226
oldb0_227
oldBrain_centre
oldSphere_003.



I am looking for a list that maps the names in the download listed above with the proper anatomical names for the brain region.

Thanks,
Reuben

@rhk12

http://download.alleninstitute.org/informatics-archive/allen_human_reference_atlas_3d_2020/version_1/examples/voxel_count/

Take a look at the voxel_count example, which shows how to connect to the API to get each structure’s acronym and name. Or skip straight to the voxel_count.csv file for a spreadsheet version.


@lydian thank you. One more follow-up question. I think your response was pertaining to the NIfTI data? I was actually asking about the ITK-SNAP data with the surfaces already segmented out and exported. During export step 5 here (Human Brain Atlas Mesh Files - #2 by nileg), each region is exported as a different file, which can be loaded into Blender. When this happens, the shapes lose their names. I can manually compare ITK-SNAP and the Blender file to determine the region; I was just checking whether this list already existed. Thanks,

@rhk12

I see - you will need to join a couple of data files together.

This label description file has some of the information you need.

The first column is the label id, which should correspond to the number in your filename (plus leading zeros). The last column is “structure_acronym - database_id”. You can use the acronym or database id to index into the voxel_count.csv file to get the full name.
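That join can be scripted. The sketch below assumes the usual ITK-SNAP label-description layout (an integer index, some color/visibility fields, then a quoted label) and the “acronym - database_id” label convention described above; adjust the regex if your file differs.

```python
import re

# ITK-SNAP label description line: index, other fields, quoted label at end.
LABEL_RE = re.compile(r'^\s*(\d+)\s+.*"(.+)"\s*$')

def parse_label_file(path):
    """Map label index -> (acronym, database_id).

    Assumes each quoted label has the form 'acronym - database_id',
    as in the label description file discussed above.
    """
    mapping = {}
    with open(path) as f:
        for line in f:
            # Skip comment lines and blanks.
            if line.lstrip().startswith("#") or not line.strip():
                continue
            m = LABEL_RE.match(line)
            if not m:
                continue
            idx, label = int(m.group(1)), m.group(2)
            acronym, _, db_id = label.rpartition(" - ")
            mapping[idx] = (acronym, db_id)
    return mapping
```

The returned mapping gives, for e.g. index 226, the acronym and database id to look up in voxel_count.csv for the full structure name.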


Hello,

Thanks for uploading the atlas files! I was interested in trying a FreeSurfer recon-all-like stream with the human brain atlas. I was looking around the FreeSurfer wiki (the CorticalParcellation and SubcorticalSegmentation pages), but wasn’t sure how to “substitute” the default cortical parcellation and/or volumetric segmentation (aparc+aseg.mgz or other .mgz outputs) with the human reference atlas (which is shared in an ITK-SNAP-compatible format).

Has anyone reading this thread been able to implement this atlas into their Freesurfer workflow?

Thanks in advance!

Hi Carol, hi Josh, thank you for providing this atlas.
I have been working with the AHBA dataset and my analysis builds upon the ontology that is provided.

I was hoping to relate my results to this 3D atlas, which I thought was based on the same ontology, but to my surprise there are some differences. For example, the hippocampus is divided along the long axis (head/body/tail) in the 3D atlas nomenclature, whereas in the AHBA ontology it divides along the short axis into fields CA1-2-3-4. It seems not to be simply a question of “big regions” versus “smaller regions”; here there is no consistent overlap. Or is it that the resolution of the template would not allow for subfield delineation along the short axis?
I face a similar problem elsewhere, for example with the regions of the cerebellum.

I’m wondering if I shouldn’t rerun my analyses using a standard parcellation (HCPMMP or Schaefer maybe), instead of the ontology labels.

Before I go down that route, do you know of a way that could reconcile the 3D brain atlas labels and the AHBA ontology at a given level of granularity?

Thanks for your feedback,
Chloe

Hi Chloe,

If I understand, you’re working with the AHBA ontology, and you’re trying to map those brain structures to the 3D atlas ontology.

We do not currently have a mapping table between those ontologies. We’re working hard to improve the human brain atlas and ontology overall, and I’ll pass along the feedback to those teams to relate their new work back to these versions.

(If I misunderstood your request, please let me know!)

Tyler

Hi Tyler,
Thank you for your answer!
Quick check: I think both ontologies have the CA1-2-3-4 division of the hippocampus, unlike what appears in the 3D atlas. So I think the problem is slightly different. I guess I was wondering if there was an atlas in MNI space with an exact correspondence to the ontology, but I suppose the closest is already this one: Allen Human Reference Atlas - 3D, 2020 (new!)

For now, I’ve reaggregated the microarray samples to parcels of an atlas instead of aggregating with respect to the ontology.

Thanks!
Chloe

Hi Chloe,

As a semi-related comment, you might also find the MRI data from the Allen Human Brain Atlas useful. There are no direct links to the ontology, but you can also download the specific coordinates of each tissue block from the microarray data (MNI coordinates are included in the “SampleAnnot.csv” file that comes with the zip download for each donor). There are ~150 total samples from the CA regions of hippocampus across six donors.
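If it helps, pulling those coordinates out of SampleAnnot.csv takes only a few lines of standard-library Python. The column names below (structure_acronym, mni_x, mni_y, mni_z) are assumptions about the AHBA download format; check your file’s header before relying on them.

```python
import csv

def mni_coords_for(path, acronym_prefix="CA"):
    """Collect (x, y, z) MNI coordinates for samples whose structure
    acronym starts with the given prefix (e.g. hippocampal CA fields).

    NOTE: the column names used here are assumed from the AHBA
    SampleAnnot.csv layout; verify them against your own download.
    """
    coords = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["structure_acronym"].startswith(acronym_prefix):
                coords.append((float(row["mni_x"]),
                               float(row["mni_y"]),
                               float(row["mni_z"])))
    return coords
```

Run once per donor’s SampleAnnot.csv and concatenate the results to gather all CA-field samples across the six donors.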

Best,
Jeremy