Abstract

All physical systems must reliably extract information from their noisy and partially observable environment and build an internal representation of space to orient their behaviour. Precise egomotion estimation is important for keeping external (i.e., environmental) and internal (i.e., proprioceptive) cues coherent. The constructed representation subsequently defines the space of possible actions. Due to the multimodal nature of the incoming streams of sensory information, egomotion estimation is a challenging sensor fusion problem. In this paper we present a distributed, cortically inspired processing scheme for sensor fusion which, given various sensory inputs and simple relations defining inter-sensory dependencies, relaxes into a solution that provides a plausible interpretation of the perceived environment. The proposed model has been implemented for egomotion estimation on an autonomous mobile robot. We demonstrate that the model provides a precise estimate of both robot position and orientation.
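The abstract does not specify the model's equations, so the following is only a minimal Python sketch of the general relaxation idea it alludes to: several redundant egomotion readings are iteratively pulled toward mutual agreement until they settle on a coherent fused estimate. The sensor names, numeric values, weighting, and update rule are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def relax_fusion(estimates, weights, n_iters=200, step=0.1):
    """Toy relaxation-based fusion (illustrative only, not the paper's model).

    estimates: (n_sensors, 3) noisy motion increments (dx, dy, dtheta)
    weights:   (n_sensors,) confidence in each sensor's own reading
    Returns the fused (3,) estimate after relaxation.
    """
    x = estimates.copy()
    for _ in range(n_iters):
        consensus = np.average(x, axis=0, weights=weights)
        # Each node is attracted to its own measurement and to the current
        # consensus; the "inter-sensory dependency" here is simple agreement.
        x += step * (weights[:, None] * (estimates - x) + (consensus - x))
    return np.average(x, axis=0, weights=weights)

# Hypothetical readings for one time step: wheel odometry, inertial, visual.
readings = np.array([
    [0.10, 0.02, 0.05],
    [0.12, 0.00, 0.04],
    [0.09, 0.01, 0.06],
])
confidence = np.array([1.0, 0.5, 0.8])

fused = relax_fusion(readings, confidence)
print("fused egomotion increment (dx, dy, dtheta):", fused)
```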
