Abstract

Neural information at different scales exhibits spatial representations, and the corresponding features are believed to be conducive to neural encoding. However, multiscale feature fusion has rarely been investigated in existing neural decoding studies. In this study, a multiscale neural information feature fusion framework is presented, and we integrate these features to decode spatial routes from multichannel recordings. We designed a goal-directed spatial cognitive experiment in which pigeons needed to perform a route selection task. Multichannel neural activities in the hippocampus, including spike and local field potential (LFP) recordings, were recorded and analyzed. Multiscale neural information features, including spike firing rate features, LFP time-frequency energy features, and functional network connectivity features, were extracted for spatial route decoding. Finally, we fused the multiscale features to solve the neural decoding problem, and the results indicate that the feature fusion operation significantly improves decoding performance. Ten-fold cross-validation shows that fusing multiscale features improves decoding performance by at least 0.04–0.11 on average compared with using any individual feature set alone. The proposed framework demonstrates the feasibility of route decoding based on multiscale features, providing an effective way to solve neural information decoding problems.
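The fusion-and-decoding pipeline described above can be sketched in code. This is a minimal illustration only, not the authors' implementation: it assumes fusion by simple concatenation of the per-trial feature vectors, substitutes a toy nearest-centroid classifier for whatever decoder the study used, and runs on synthetic data; all function names and data shapes here are hypothetical.

```python
import random

def fuse(spike_feats, lfp_feats, conn_feats):
    """Fuse the three multiscale feature sets by concatenating per-trial vectors."""
    return [s + l + c for s, l, c in zip(spike_feats, lfp_feats, conn_feats)]

def nearest_centroid_cv(X, y, k=10):
    """k-fold cross-validated accuracy of a toy nearest-centroid route decoder."""
    idx = list(range(len(X)))
    random.Random(0).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]          # k roughly equal folds
    correct = 0
    for f, test in enumerate(folds):
        train = [i for g, fold in enumerate(folds) if g != f for i in fold]
        # per-route centroids estimated from the training trials
        cents = {}
        for lab in set(y):
            rows = [X[i] for i in train if y[i] == lab]
            cents[lab] = [sum(col) / len(rows) for col in zip(*rows)]
        for i in test:
            # predict the route whose centroid is nearest in squared distance
            pred = min(cents, key=lambda lab: sum((a - b) ** 2
                                                  for a, b in zip(X[i], cents[lab])))
            correct += pred == y[i]
    return correct / len(X)

# synthetic two-route example: 40 trials, three feature sets per trial
rng = random.Random(1)
y = [t % 2 for t in range(40)]                                # route labels
spk = [[rng.gauss(lab, 1) for _ in range(4)] for lab in y]    # spike firing rates
lfp = [[rng.gauss(lab, 1) for _ in range(6)] for lab in y]    # LFP time-frequency energy
net = [[rng.gauss(lab, 1) for _ in range(3)] for lab in y]    # network connectivity
X = fuse(spk, lfp, net)
acc = nearest_centroid_cv(X, y, k=10)
```

Decoding accuracy can then be compared between the fused vectors `X` and each individual feature set (`spk`, `lfp`, `net`) under the same cross-validation splits, which mirrors the comparison reported in the abstract.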
