Abstract
Neural information at different scales exhibits spatial representations, and the corresponding features are believed to be conducive to neural encoding. However, multiscale feature fusion has rarely been investigated in existing neural decoding studies. In this study, we present a multiscale neural information feature fusion framework and integrate these features to decode spatial routes from multichannel recordings. We design a goal-directed spatial cognition experiment in which pigeons perform a route selection task. Multichannel neural activity, including spike and local field potential (LFP) recordings from the hippocampus, is recorded and analyzed. Multiscale features, including spike firing rate features, LFP time-frequency energy features, and functional network connectivity features, are extracted for spatial route decoding. Finally, we fuse the multiscale features to solve the neural decoding problem, and the results indicate that feature fusion significantly improves decoding performance. Ten-fold cross-validation shows that fusing the multiscale features improves decoding performance by an average of at least 0.04–0.11 over any individual feature set alone. The proposed framework demonstrates the feasibility of route decoding based on multiscale features and provides an effective way to solve neural information decoding problems.
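The sketch below is not from the paper; it is a minimal illustration of the kind of feature-level fusion and ten-fold cross-validated route decoding described in the abstract, assuming the three feature sets (spike firing rates, LFP time-frequency energy, functional connectivity) have already been extracted as per-trial NumPy arrays, and using a generic scikit-learn classifier in place of whatever decoder the authors actually employed.

```python
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical pre-extracted feature matrices, one row per trial:
#   spike_rate - spike firing rate features
#   lfp_energy - LFP time-frequency energy features
#   fc_conn    - functional network connectivity features
# y holds the chosen route label for each trial (placeholder data here).
rng = np.random.default_rng(0)
n_trials = 120
spike_rate = rng.normal(size=(n_trials, 16))
lfp_energy = rng.normal(size=(n_trials, 32))
fc_conn = rng.normal(size=(n_trials, 64))
y = rng.integers(0, 2, size=n_trials)  # two candidate routes

# Feature-level fusion: concatenate the multiscale feature sets per trial.
fused = np.hstack([spike_rate, lfp_energy, fc_conn])

# Decode the route with a standard classifier and 10-fold cross-validation,
# comparing each individual feature set against the fused representation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

for name, X in [("spike rate", spike_rate), ("LFP energy", lfp_energy),
                ("connectivity", fc_conn), ("fused", fused)]:
    acc = cross_val_score(clf, X, y, cv=cv).mean()
    print(f"{name:>12s}: mean 10-fold accuracy = {acc:.3f}")
```

With real features, comparing the fused row against the individual rows reproduces the kind of evaluation reported in the abstract, where fusion yields the higher cross-validated decoding accuracy.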