Abstract
Brains are optimized for processing ethologically relevant sensory signals. However, few studies have characterized the neural coding mechanisms that underlie the transformation from natural sensory information to behavior. Here, we focus on acoustic communication in Drosophila melanogaster and use computational modeling to link natural courtship song, neuronal codes, and female behavioral responses to song. We show that melanogaster females are sensitive to long timescale song structure (on the order of tens of seconds). From intracellular recordings, we generate models that recapitulate neural responses to acoustic stimuli. We link these neural codes with female behavior by generating model neural responses to natural courtship song. Using a simple decoder, we predict female behavioral responses to the same song stimuli with high accuracy. Our modeling approach reveals how long timescale song features are represented by the Drosophila brain and how neural representations can be decoded to generate behavioral selectivity for acoustic communication signals.
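The pipeline the abstract describes, a model that maps acoustic stimuli to neural responses, followed by a simple decoder that maps those responses to behavior, can be sketched in miniature. Everything below is a hypothetical illustration, not the authors' actual model: the biphasic temporal filter, the noise level, and the linear least-squares decoder are all invented stand-ins for the paper's fitted components.

```python
import numpy as np

# Hypothetical two-stage sketch: (1) a linear temporal filter turns a
# stimulus into a model neural response; (2) a simple linear decoder
# maps that response onto a behavioral readout. All parameters are
# illustrative, not taken from the paper.

rng = np.random.default_rng(0)

# --- Stage 1: model neural response = stimulus * temporal filter ---
n_samples = 2000
stimulus = rng.standard_normal(n_samples)            # stand-in for a song feature
t = np.arange(50)
filt = np.exp(-t / 10.0) - 0.5 * np.exp(-t / 20.0)   # invented biphasic filter
neural = np.convolve(stimulus, filt, mode="full")[:n_samples]

# --- Stage 2: simple decoder fit by least squares ---
# Synthetic "behavior" is a noisy scaled copy of the neural response,
# so a linear decoder should recover the scaling and predict well.
true_gain = 0.8
behavior = true_gain * neural + 0.1 * rng.standard_normal(n_samples)

X = neural[:, None]
w, *_ = np.linalg.lstsq(X, behavior, rcond=None)
prediction = (X @ w).ravel()

# Prediction accuracy as the correlation between predicted and actual behavior
r = np.corrcoef(prediction, behavior)[0, 1]
```

In this toy setting the decoder weight converges to the generative gain and the predicted behavior correlates strongly with the synthetic ground truth; the paper's contribution is showing that an analogously simple decoder, applied to model responses to natural courtship song, predicts real female behavior with high accuracy.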