Abstract

Background
Determining the stereotaxic coordinates of a target structure with an accuracy comparable to its size is a challenge when imaging the rat brain through cranial windows with confocal (multiphoton) microscopy in vivo. Methods based on estimating linear displacements from cerebral vessel intersections are most often used for this purpose, but their accuracy can be improved.

New method
A new method has been developed for converting the pixel coordinates of points of interest in a two-photon microscopy image into stereotaxic coordinates using quadratic approximation with L2 regularization. Several methods for converting pixel coordinates into stereotaxic ones were compared; the study aims to select the method that keeps the conversion error within an a priori specified threshold.

Results
A method is proposed for determining the stereotaxic coordinates of each pixel in an image acquired by laser scanning in two-photon and/or confocal mode with an accuracy of several tens of microns.

Comparison with existing method(s)
It is shown that the error of the most common method, which computes the coordinates of points of interest as displacements relative to selected vessel intersections, can be reduced by using the quadratic approximation with L2 regularization. The proposed method improves the accuracy of locating points of interest by 10–30 µm.

Conclusions
The proposed approach will be useful in research that requires precise positioning of microelectrodes, sensors, etc., for implantation into specified brain structures or into groups of neurons identified by functional mapping.
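The abstract does not give the fitting procedure itself, but the named technique, a quadratic map from pixel to stereotaxic coordinates with L2 (ridge) regularization, can be sketched as follows. All function names and the regularization weight are illustrative assumptions; calibration landmarks (e.g. vessel intersections) with known stereotaxic coordinates are assumed to be available.

```python
import numpy as np

def quadratic_features(uv):
    """Build the quadratic design matrix for pixel coordinates.
    uv: (N, 2) array of pixel coordinates -> (N, 6) matrix [1, u, v, uv, u^2, v^2]."""
    uv = np.atleast_2d(uv)
    u, v = uv[:, 0], uv[:, 1]
    return np.column_stack([np.ones_like(u), u, v, u * v, u**2, v**2])

def fit_quadratic_ridge(uv_px, xy_st, lam=1e-3):
    """Fit a quadratic pixel->stereotaxic map with L2 regularization.
    uv_px: (N, 2) pixel coordinates of calibration landmarks.
    xy_st: (N, 2) known stereotaxic coordinates of the same landmarks.
    Returns a (6, 2) coefficient matrix W minimizing ||A W - Y||^2 + lam ||W||^2."""
    A = quadratic_features(uv_px)
    # Regularized normal equations: (A^T A + lam I) W = A^T Y
    W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ xy_st)
    return W

def to_stereotaxic(uv_px, W):
    """Convert pixel coordinates to stereotaxic coordinates with a fitted map."""
    return quadratic_features(uv_px) @ W
```

The L2 term stabilizes the fit when the landmarks are few or nearly collinear, which is the situation that makes purely displacement-based conversion error-prone.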
