Abstract

In recent decades stereology-based studies have played a significant role in understanding brain aging and in developing novel drug discovery strategies for the treatment of neurological disease and mental illness. A major obstacle to further progress across a wide range of neuroscience sub-disciplines remains the lack of high-throughput technology for stereology analyses. Though founded on methodologically unbiased principles, commercially available stereology systems still rely on well-trained humans to manually count hundreds of cells within each region of interest (ROI). Even for a simple study with 10 control and 10 treated animals, cell counts typically require over a month of tedious labor and incur high costs. Furthermore, these studies are prone to errors and poor reproducibility due to human factors such as subjectivity, variable training, recognition bias, and fatigue. Here we propose a combination of deep neural networks and stereology to automatically segment and estimate the total number of immunostained neurons on tissue sections. Our three-step approach consists of (1) creating extended-depth-of-field (EDF) images from z-stacks of images (disector stacks); (2) applying an adaptive segmentation algorithm (ASA) to label stained cells in the EDF images (i.e., create masks) for training a convolutional neural network (CNN); and (3) using the trained CNN model to automatically segment and count the total number of cells in test disector stacks with the optical fractionator method. The automated stereology approach shows less than 2% error and over 5× greater efficiency compared with counts by a trained human, without the subjectivity, tedium, and poor precision associated with conventional stereology.
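To make steps (1) and (3) concrete, the sketch below illustrates two pieces of such a pipeline: collapsing a disector z-stack into an EDF image via focus stacking, and scaling the raw disector counts with the optical fractionator. The function names (`edf_from_zstack`, `optical_fractionator`), the variance-of-Laplacian focus measure, and the OpenCV/NumPy implementation are illustrative assumptions only; the abstract does not specify the authors' actual algorithms or code.

```python
import numpy as np
import cv2  # opencv-python; an assumed library choice, not specified by the paper


def edf_from_zstack(zstack):
    """Collapse a z-stack (disector stack) into one extended-depth-of-field
    (EDF) image by keeping, for each pixel, the value from the sharpest plane.

    zstack: array of shape (Z, H, W), grayscale focal planes.
    Sharpness is scored with a smoothed squared Laplacian, a commonly used
    focus measure (an assumption; the paper does not name its EDF method).
    """
    zstack = np.asarray(zstack, dtype=np.float32)
    focus = np.empty_like(zstack)
    for i, plane in enumerate(zstack):
        lap = cv2.Laplacian(plane, cv2.CV_32F, ksize=3)
        focus[i] = cv2.GaussianBlur(lap * lap, (9, 9), 0)  # local focus energy
    best = np.argmax(focus, axis=0)        # index of sharpest plane per pixel
    rows, cols = np.indices(best.shape)
    return zstack[best, rows, cols]


def optical_fractionator(total_q, ssf, asf, tsf):
    """Standard optical fractionator estimate of total cell number:
        N = sum(Q-) * (1/ssf) * (1/asf) * (1/tsf)
    where sum(Q-) is the number of cells counted in the disectors, and
    ssf, asf, tsf are the section, area, and thickness sampling fractions
    fixed by the study's sampling design."""
    return total_q * (1.0 / ssf) * (1.0 / asf) * (1.0 / tsf)
```

For example, counting a total of 250 cells with ssf = 1/10, asf = 1/25, and tsf = 1/2 yields an estimate of 250 × 10 × 25 × 2 = 125,000 cells. In the proposed approach, the per-disector counts would come from the CNN's segmentations rather than from a human rater; the sampling fractions come from the study design, not from the images.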
