Abstract

Advances in three‐dimensional electron microscopy (EM) have facilitated the collection of image stacks with a field‐of‐view that is large enough to cover a significant percentage of anatomical subdivisions at nano‐resolution. When coupled with enhanced staining protocols, such techniques produce data that can be mined to establish the morphologies of all organelles across hundreds of whole cells in their in situ environments. Although instrument throughputs are approaching terabytes of data per day, image segmentation and analysis remain significant bottlenecks in achieving quantitative descriptions of whole cell organellomes. Here we describe computational workflows that achieve the automatic segmentation of organelles from regions of the central nervous system by applying supervised machine learning algorithms to slices of serial block‐face scanning EM (SBEM) datasets. We also demonstrate that our workflows can be parallelized on supercomputing resources, resulting in a dramatic reduction of their run times. These methods significantly expedite the development of anatomical models at the subcellular scale and facilitate the study of how these models may be perturbed following pathological insults.

Grant Funding Source: Supported by NIGMS under award numbers P41 GM103412‐25 and P41 GM103426, NINDS under award number 1R01NS07531, and NIDA under award number 5T32DA007315‐10.
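The abstract does not name the specific classifier, features, or parallelization framework used. The sketch below is therefore only a minimal, hypothetical illustration of the general idea it describes: train a supervised per-pixel classifier on an annotated slice, then segment the remaining slices independently in parallel workers. The random-forest classifier, Gaussian-blur features, synthetic data, and ProcessPoolExecutor are assumptions for illustration, not the authors' pipeline.

```python
# Hypothetical sketch: per-slice supervised segmentation of an EM stack,
# parallelized across worker processes. All tool choices are illustrative.
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier


def slice_features(img):
    """Simple per-pixel features for one 2D slice: raw intensity plus two Gaussian blurs."""
    feats = [img,
             ndimage.gaussian_filter(img, 1.0),
             ndimage.gaussian_filter(img, 3.0)]
    return np.stack(feats, axis=-1).reshape(-1, len(feats))


def segment_slice(args):
    """Predict an organelle/background label for every pixel of one slice."""
    clf, img = args
    labels = clf.predict(slice_features(img))
    return labels.reshape(img.shape)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stack = rng.random((8, 64, 64)).astype(np.float32)    # stand-in for an SBEM sub-volume
    train_img = stack[0]
    train_labels = (train_img > 0.5).astype(np.uint8)     # stand-in for manual annotations

    # Train the supervised classifier on the annotated slice.
    clf = RandomForestClassifier(n_estimators=50)
    clf.fit(slice_features(train_img), train_labels.ravel())

    # Each slice is independent, so prediction parallelizes trivially across
    # local processes (or, at larger scale, across nodes of a cluster).
    with ProcessPoolExecutor(max_workers=4) as pool:
        masks = list(pool.map(segment_slice, [(clf, img) for img in stack]))

    segmentation = np.stack(masks)                        # (slices, y, x) label volume
    print(segmentation.shape, segmentation.dtype)
```

Because the slices are processed independently, the same pattern scales from a workstation pool to a supercomputing job scheduler by distributing slice indices across nodes; the speedup is roughly linear in the number of workers until I/O becomes the bottleneck.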
