Abstract

Many animals detect other individuals effortlessly. In Drosophila, previous studies have examined sensory processing during social interactions using simple blobs as visual stimuli; however, whether and how flies extract higher-order features from conspecifics to guide behavior remains elusive. Reasoning that such feature extraction should be reflected in sensorimotor relations, we developed unbiased machine learning tools for quantifying natural behavior, which may prove broadly useful, and applied them to study interacting pairs. By transforming motor patterns into female-centered reference frames, we established circling, in which heading and traveling directions intersect, as a unique pattern of social interaction during courtship. We found circling to be highly visual, with males exhibiting view-tuned motor patterns. Notably, males select specific wing and leg actions based on the positions and motions of the female's head and tail. Using system identification, we derived visuomotor transformation functions indicating history-dependent action selection, with distance predicting action initiation and angular position predicting wing choices and locomotion directions. Integrating vision with somatosensation further strengthens these sensorimotor relations. Composed essentially of orchestrated wing and leg maneuvers that are more variable in the light, circling induces mutually synchronized responses in conspecifics that are stronger than those elicited by wing extension alone. Finally, we found that actions depend on integrating spatiotemporally structured features with goals. Altogether, we identified a series of sensorimotor relations during circling, implying that during courtship flies detect complex, spatiotemporally structured features of conspecifics; these findings lay the foundation for a mechanistic understanding of conspecific recognition in Drosophila.

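As an illustration only, and not the authors' code, the sketch below shows one standard way to express a male's position in a female-centered reference frame, i.e., as a distance and an angular position measured from the female's heading, the kind of transformation the abstract refers to. All function and variable names here are assumptions made for this example.

```python
# Minimal sketch (assumed names, not the paper's implementation):
# transform a male's arena position into a female-centered reference frame.
import numpy as np

def to_female_centered(male_xy, female_xy, female_heading):
    """Express the male's position relative to the female's body axis.

    male_xy, female_xy : length-2 arrays of arena coordinates (e.g., mm)
    female_heading     : female heading angle in radians (0 = arena x-axis)

    Returns (distance, angular_position), where angular_position is the
    male's bearing measured from the female's heading direction
    (positive = on her left, negative = on her right).
    """
    rel = np.asarray(male_xy, dtype=float) - np.asarray(female_xy, dtype=float)
    # Rotate the world frame so the female's heading points along +x.
    c, s = np.cos(-female_heading), np.sin(-female_heading)
    rel_female = np.array([c * rel[0] - s * rel[1],
                           s * rel[0] + c * rel[1]])
    distance = np.hypot(*rel_female)
    angular_position = np.arctan2(rel_female[1], rel_female[0])
    return distance, angular_position

# Example: female at the origin heading along +y, male 2 mm to her right.
d, theta = to_female_centered([2.0, 0.0], [0.0, 0.0], np.pi / 2)
print(d, np.degrees(theta))  # ~2.0 mm, ~-90 deg (male on the female's right)
```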