Abstract
Background
Investigation of neural circuit functioning often requires statistical interpretation of events in subthreshold electrophysiological recordings. This problem is non-trivial because recordings may have moderate levels of structured noise and events may have distinct kinetics. In addition, novel experimental designs that combine optical and electrophysiological methods will depend upon statistical tools that combine multimodal data.

New method
We present a Bayesian approach for inferring the timing, strength, and kinetics of post-synaptic currents (PSCs) from voltage-clamp electrophysiological recordings on a per-event basis. The simple generative model for a single voltage-clamp recording flexibly extends to include additional structure to enable experiments designed to probe synaptic connectivity.

Results
We validate the approach on simulated and real data. We also demonstrate that extensions of the basic PSC detection algorithm can handle recordings contaminated with optically evoked currents, and we simulate a scenario in which calcium imaging observations, available for a subset of neurons, can be fused with electrophysiological data to achieve higher temporal resolution.

Comparison with existing methods
We apply this approach to simulated and real ground truth data to demonstrate its higher sensitivity in detecting small signal-to-noise events and its increased robustness to noise compared to standard methods for detecting PSCs.

Conclusions
The new Bayesian event analysis approach for electrophysiological recordings should allow for better estimation of physiological parameters under more variable conditions and help support new experimental designs for circuit mapping.
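To make the generative setting concrete, the sketch below simulates a voltage-clamp trace as a sum of PSC-like events plus Gaussian noise. The difference-of-exponentials kernel and all parameter values (`tau_rise`, `tau_decay`, event times, noise level) are illustrative assumptions, not the paper's actual model, which may differ in its waveform parameterization and noise structure.

```python
import numpy as np

def psc_kernel(t, onset, amplitude, tau_rise, tau_decay):
    """Difference-of-exponentials PSC waveform, zero before onset.

    This is a common parametric form for post-synaptic currents;
    it stands in here for whatever kernel the inference model assumes.
    """
    dt = np.maximum(t - onset, 0.0)  # clamp so the kernel is causal
    return amplitude * (np.exp(-dt / tau_decay) - np.exp(-dt / tau_rise))

def simulate_trace(duration_s=1.0, fs=20_000, events=None, noise_sd=2.0, seed=0):
    """Synthetic trace: superposition of PSC events plus i.i.d. Gaussian noise.

    `events` is a list of (onset_s, amplitude_pA, tau_rise_s, tau_decay_s);
    negative amplitudes model inward currents. All defaults are hypothetical.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(int(duration_s * fs)) / fs
    trace = np.zeros_like(t)
    if events is None:
        events = [(0.2, -30.0, 0.001, 0.010),   # large event
                  (0.5, -12.0, 0.001, 0.010)]   # small, low-SNR event
    for onset, amp, tau_r, tau_d in events:
        trace += psc_kernel(t, onset, amp, tau_r, tau_d)
    return t, trace + rng.normal(0.0, noise_sd, size=t.shape)

t, trace = simulate_trace()
```

A Bayesian detector would place a prior over the number, times, amplitudes, and time constants of such events and compute a posterior given the noisy trace, which is what allows per-event uncertainty estimates rather than a single thresholded detection.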