Abstract
Synthesizing perceivable artificial neural inputs independent of typical sensory channels remains a fundamental challenge in the development of next-generation brain-machine interfaces. Establishing a minimally invasive, wirelessly effective, and miniaturized platform with long-term stability is crucial for creating a clinically meaningful interface capable of mediating artificial perceptual feedback. In this study, we demonstrate a miniaturized, fully implantable wireless transcranial optogenetic encoder designed to generate artificial perceptions through digitized optogenetic manipulation of large cortical ensembles. This platform enables the spatiotemporal orchestration of large-scale cortical activity for remote perception genesis via real-time wireless communication and control, with device performance optimized through simulation-guided analysis of light and heat propagation during operation. Cue discrimination during operant learning demonstrates the wireless genesis of artificial percepts sensed by mice; analyses of discrimination performance based on spatial distance across large cortical networks and on sequential cue order reveal principles that adhere to general perceptual rules. These conceptual and technical advancements expand our understanding of artificial neural syntax and its perception by the brain, guiding the evolution of next-generation brain-machine communication.
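The abstract notes that device performance was optimized through simulation of light and heat propagation, but does not describe the modeling approach. The following is a minimal illustrative sketch, not the authors' method, of the kind of thermal analysis such optimization might involve: a 2D finite-difference (FTCS) diffusion model of tissue heating around an implanted light source. All parameter values, the grid setup, and the point-source heating term are hypothetical placeholders.

```python
# Illustrative sketch only: the actual simulation framework is not described
# in the abstract. This assumes a simple 2D finite-difference (FTCS) model of
# heat diffusion in tissue around an implanted light source; every numerical
# value below is a hypothetical placeholder, not a reported parameter.
import numpy as np

nx, ny = 100, 100          # grid points
dx = 1e-4                  # grid spacing [m] (0.1 mm)
alpha = 1.4e-7             # assumed thermal diffusivity of tissue [m^2/s]
dt = 0.2 * dx**2 / alpha   # time step within the FTCS stability limit (dt < dx^2 / 4*alpha)
steps = 2000

T = np.full((nx, ny), 37.0)        # baseline tissue temperature [deg C]
source = np.zeros_like(T)
source[nx // 2, ny // 2] = 50.0    # hypothetical heating rate at the emitter site [K/s]

for _ in range(steps):
    # Discrete 2D Laplacian on interior points; boundaries held at baseline
    lap = np.zeros_like(T)
    lap[1:-1, 1:-1] = (
        T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:] + T[1:-1, :-2]
        - 4.0 * T[1:-1, 1:-1]
    ) / dx**2
    T += dt * (alpha * lap + source)

print(f"Peak temperature rise after {steps * dt:.1f} s: {T.max() - 37.0:.2f} K")
```

In practice, such a model would be used to bound the optical duty cycle and emitter power so that the predicted temperature rise in tissue stays within accepted safety margins; the specific criteria used in this work are described in the main text, not the abstract.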