Mosquitoes rely on the integration of multiple sensory cues, including olfactory, visual, and thermal stimuli, to detect, identify, and locate their hosts [1-4]. Although increasingly more is known about the role of chemosensory behaviors in mediating mosquito-host interactions [1], the role of visual cues is comparatively less studied [3], and how olfactory and visual information are integrated in the mosquito brain remains unknown. In the present study, we used a tethered-flight light-emitting diode (LED) arena, which allowed quantitative control over the stimuli, and a control theoretic model to show that CO2 modulates mosquito steering responses toward vertical bars. To gain insight into the neural basis of this olfactory and visual coupling, we conducted two-photon microscopy experiments in a new GCaMP6s-expressing mosquito line. Imaging revealed that neuropil regions within the lobula exhibited strong responses to objects, such as a bar, but showed little response to large-field motion. Approximately 20% of the lobula neuropil regions we imaged were modulated when CO2 preceded the presentation of a moving bar. By contrast, responses in the antennal (olfactory) lobe were not modulated by visual stimuli presented before or after an olfactory stimulus. Together, our results suggest that this asymmetric coupling between the two sensory systems enhances steering responses toward discrete objects.