Abstract

Confocal microscopy is a three-dimensional (3D) imaging modality, but the specimen thickness that can be imaged is limited by depth-dependent signal attenuation. Both software and hardware methods have been used to correct the attenuation in reconstructed images, but previous methods do not increase the image signal-to-noise ratio (SNR) with conventional specimen preparation and imaging. We present a practical two-view method that increases the overall imaging depth, corrects signal attenuation and improves the SNR. This is achieved by a combination of slightly modified but conventional specimen preparation, image registration, montage synthesis and signal reconstruction methods. The specimen is mounted in a symmetrical manner between a pair of cover slips, rather than between a slide and a cover slip. It is imaged sequentially from both sides to generate two 3D image stacks from perspectives separated by approximately 180 degrees with respect to the optical axis. An automated image registration algorithm performs a precise 3D alignment, and a model-based minimum mean squared error algorithm synthesizes a montage, combining the content of both 3D views. Experiments with images of individual neurones contrasted with a space-filling fluorescent dye in thick brain tissue slices produced precise 3D montages that are corrected for depth-dependent signal attenuation. The SNR of the reconstructed image is maximized by the method, and it is significantly higher than in the single views after applying our attenuation model. We also compare our method with simpler two-view reconstruction methods and quantify the SNR improvement. The reconstructed images are a more faithful qualitative visualization of the specimen's structure and are quantitatively more accurate, providing a more rigorous basis for automated image analysis.
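
The sketch below illustrates the core idea of the two-view fusion described in the abstract; it is not the authors' implementation. It assumes a simple exponential attenuation model with a known coefficient alpha, two views already registered into a common frame, and independent noise of equal variance in the raw stacks, so that the minimum mean squared error combination of the two attenuation-corrected views reduces to inverse-variance weighting. The function and parameter names (correct_attenuation, fuse_two_views, alpha, sigma2) are hypothetical.

import numpy as np

def correct_attenuation(stack, alpha):
    """Undo exponential depth attenuation for a stack indexed (z, y, x).

    Assumes the observed intensity follows I_obs = I * exp(-alpha * z),
    with z increasing away from the imaged surface.
    """
    z = np.arange(stack.shape[0]).reshape(-1, 1, 1)
    return stack * np.exp(alpha * z)

def fuse_two_views(view_a, view_b, alpha, sigma2=1.0):
    """MMSE-style fusion of two opposing, registered views of one volume.

    view_b is assumed to be already flipped and registered into the frame
    of view_a, so its attenuation increases from the opposite face of the
    specimen.
    """
    nz = view_a.shape[0]
    z = np.arange(nz).reshape(-1, 1, 1)

    # Attenuation-corrected estimates of the underlying signal from each view.
    est_a = correct_attenuation(view_a, alpha)
    est_b = view_b * np.exp(alpha * (nz - 1 - z))

    # Correcting attenuation amplifies the noise by the same exponential gain,
    # so the per-voxel variance of each corrected estimate grows with depth
    # from that view's imaging surface.
    var_a = sigma2 * np.exp(2 * alpha * z)
    var_b = sigma2 * np.exp(2 * alpha * (nz - 1 - z))

    # Inverse-variance weighting: the linear unbiased combination with
    # minimum mean squared error under the stated noise assumptions.
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    return (w_a * est_a + w_b * est_b) / (w_a + w_b)

With this weighting, voxels near either cover slip are dominated by the view imaged from that side, while voxels at mid-depth receive roughly equal contributions from both views, which is where the SNR gain over either single attenuation-corrected view is largest.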
