Abstract
Confocal fluorescent microscopy is a major tool for investigating the molecular orchestration of biomedical samples. The quality of image acquisition depends critically on tissue quality and thickness, the type and concentration of the antibodies used, and the microscope parameters. Due to these factors, intra-sample and inter-sample variability inevitably arises, and the segmentation and quantification of targeted proteins can become challenging. Image processing techniques therefore need to address this acquisition variability to minimize the risk of biases originating from changes in signal intensity, background noise, and parameterization. Here, we introduce PaFSe, a parameter-free segmentation algorithm for 3D fluorescent images. The algorithm is based on our established PRAQA approach, which evaluates the dispersion of pixel intensities within several neighborhoods, allowing for a statistical assessment of whether individual subfields of an image should be considered positive signal or background. PaFSe extends PRAQA with a fully automatic estimate of the segmentation parameters, and thereby provides a completely parameter-free and robust segmentation algorithm. By comparing PaFSe with Ilastik on synthetic examples, we show that our method achieves performance comparable to a supervised approach in low-to-moderate noise environments, without the need for tedious training. Furthermore, we validate the efficiency of PaFSe by segmenting and quantifying the abundance of hyperphosphorylated Tau protein in post-mortem human brain samples from Alzheimer’s disease patients and age-matched controls, where we obtain quantification values highly correlated with manual neuropathological segmentation. PaFSe is a parameter-free, fast, and adaptive approach for robust segmentation and quantification of protein abundance from complex 3D fluorescent images and is freely available at https://doi.org/10.17881/j20h-pa27.
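To make the idea of dispersion-based, tile-wise signal/background classification concrete, the sketch below illustrates one possible realization in Python. It is a minimal example written for this summary, not the published PaFSe/PRAQA implementation: the function name, tile size, background-fraction heuristic, and the use of a one-sided z-test are assumptions chosen for illustration only.

```python
# Illustrative sketch (assumed, not the published PaFSe/PRAQA code):
# classify non-overlapping tiles of a 2D image slice as "signal" or
# "background" based on the dispersion of their pixel intensities.
import numpy as np
from scipy.stats import norm


def classify_tiles(image, tile=16, alpha=0.01, bg_fraction=0.2):
    """Return {(y, x): is_signal} for each tile of a 2D intensity image."""
    h, w = image.shape
    positions, stds = [], []

    # Compute the intensity dispersion (standard deviation) of every tile.
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            patch = image[y:y + tile, x:x + tile]
            positions.append((y, x))
            stds.append(patch.std())
    stds = np.asarray(stds)

    # Assumption of this sketch: the least-dispersed tiles are mostly
    # background, so use them to estimate the background dispersion model.
    n_bg = max(1, int(bg_fraction * len(stds)))
    bg = np.sort(stds)[:n_bg]
    mu, sigma = bg.mean(), bg.std() + 1e-12

    # A tile is called "positive signal" if its dispersion is improbably
    # large under the background model (one-sided z-test at level alpha).
    z_crit = norm.ppf(1.0 - alpha)
    return {pos: (s - mu) / sigma > z_crit for pos, s in zip(positions, stds)}
```

In a 3D setting, such a test could be applied slice by slice or with cubic neighborhoods; the point of the sketch is only to show how a statistical test on local dispersion can separate textured signal regions from flat background without a global intensity threshold.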