Abstract

Human beings perceive the physical world by integrating various sensory inputs, information about their own motor system, and their knowledge. To process sensory information, the human brain employs a hierarchical, parallel distributed processing mechanism. Sensor fusion technology aims to emulate this processing mechanism and is intended for advanced sensing systems that cannot be constructed with unimodal sensory information processing. Our study of sensor fusion aims to develop a hierarchical sensory-motor fusion mechanism for achieving intentional sensing: the concept that sensing is performed with a perceptual goal (an intention of sensing) and that sensing behaviors must be oriented toward achieving that goal. In this paper, we propose a hierarchical sensory-motor fusion model based on neural networks for intentional sensing, together with an iterative inversion method that exploits multi-layer neural networks to solve the ill-posed inverse problem. We applied the hierarchical sensory-motor fusion model to a three-dimensional object recognition system and a vision-based robot arm control system, and demonstrated the effectiveness of the proposed model through computer simulations. We confirmed that the model accepts and propagates intentions, tightly couples recognition and action, and can perform various tasks without rebuilding or retraining the sensing system.
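
The abstract does not spell out the iterative inversion procedure, but a common realization of neural-network inversion fixes the trained weights and repeatedly propagates the output error back to the input layer, descending in input space until the network's output matches a desired target. The following is a minimal sketch under that assumption; the two-layer tanh network, its random weights, and the function names (`forward`, `invert`) are illustrative stand-ins, not the paper's actual model.

```python
import numpy as np

# Weights of a small two-layer forward model y = f(x).
# In practice these would come from prior training; random values
# here are placeholders so the sketch runs stand-alone.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 3)), np.zeros(8)
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)

def forward(x):
    h = np.tanh(W1 @ x + b1)           # hidden layer
    return np.tanh(W2 @ h + b2), h     # output layer

def invert(y_target, x0, lr=0.1, steps=500):
    """Iteratively adjust the input x so that f(x) approaches y_target,
    keeping the network weights fixed (inversion of the forward model)."""
    x = x0.copy()
    for _ in range(steps):
        y, h = forward(x)
        e = y - y_target                         # output error
        dh = (W2.T @ (e * (1 - y**2))) * (1 - h**2)
        dx = W1.T @ dh                           # error propagated to the input
        x -= lr * dx                             # gradient step in input space
    return x

# The initial guess x0 selects one solution among the many inputs that
# could map to y_goal, which is how the ill-posedness is resolved here.
x0 = rng.normal(size=3)
y_goal = np.array([0.3, -0.2])
x_solution = invert(y_goal, x0)
print(forward(x_solution)[0])                    # should be close to y_goal
```

Because the inverse problem is ill-posed, the recovered input depends on the starting point and the step size; the paper's contribution lies in embedding this kind of inversion within the hierarchical sensory-motor fusion model so that intentions at higher layers constrain the solutions found at lower layers.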
