Abstract
In this paper, we consider the problem of robotic motion tracking and following with neuromorphic vision sensors. We formulate the problem in a leader-follower paradigm: the objective of the follower robot is to perform real-time motion segmentation of the scene and follow the leader robot. Motion segmentation with a neuromorphic vision sensor mounted on a mobile robot is challenging because the platform's own movement (ego-motion) also generates events. Current tracking approaches either do not perform well under sensor ego-motion or require a priori knowledge of the object being tracked. To address these limitations, we designed an algorithm that clusters the space-time events produced by the neuromorphic sensor and then classifies the resulting clusters. The clustering is based on a distance transformation of the existing event sets. After clustering, a binary class label is assigned to each cluster: (1) background or (2) moving object. The classifier uses the event rate of each cluster to determine the proper class label. The proposed technique forms an important module for the creation of collectively intelligent multi-pedal robots that utilize neuromorphic vision sensors. The utility and robustness of our algorithm are demonstrated in a real-time online system implemented on two hexapod robots.
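To make the cluster-then-classify pipeline concrete, the sketch below illustrates one plausible organization of such a scheme in Python: incoming space-time events are assigned to the nearest existing cluster by a distance criterion, and each cluster is then labelled from its event rate. This is a minimal sketch under stated assumptions, not the authors' implementation; all identifiers and the thresholds MAX_DIST and RATE_THRESHOLD are illustrative choices not taken from the paper.

    # Sketch only: distance-based clustering of (x, y, t) events and
    # event-rate classification of clusters. Thresholds are assumptions.
    import numpy as np

    MAX_DIST = 15.0         # assumed spatial radius (pixels) for joining a cluster
    RATE_THRESHOLD = 500.0  # assumed event-rate cutoff (events/s) for "moving object"

    class Cluster:
        def __init__(self, x, y, t):
            self.centroid = np.array([x, y], dtype=float)
            self.first_t = t
            self.last_t = t
            self.count = 1

        def add(self, x, y, t):
            # incremental mean keeps the cluster centroid up to date
            self.count += 1
            self.centroid += (np.array([x, y]) - self.centroid) / self.count
            self.last_t = t

        def event_rate(self):
            dt = max(self.last_t - self.first_t, 1e-6)
            return self.count / dt

    def cluster_events(events):
        """events: iterable of (x, y, t) tuples ordered by timestamp t (seconds)."""
        clusters = []
        for x, y, t in events:
            if clusters:
                # assign the event to the nearest existing cluster, if close enough
                dists = [np.linalg.norm(c.centroid - (x, y)) for c in clusters]
                i = int(np.argmin(dists))
                if dists[i] < MAX_DIST:
                    clusters[i].add(x, y, t)
                    continue
            clusters.append(Cluster(x, y, t))
        return clusters

    def classify(clusters):
        # heuristic assumed for illustration: an independently moving object
        # yields a denser, higher-rate cluster than ego-motion background clusters
        return ["moving object" if c.event_rate() > RATE_THRESHOLD else "background"
                for c in clusters]

In practice the classification direction and thresholds would be tuned to the sensor and platform; the sketch only shows how event rate can serve as the discriminating statistic, as described in the abstract.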