Abstract
Moving a sensor through its environment creates characteristic time variations of the sensor's readings, often referred to as flow cues. We analyze the acoustic flow field generated by a sonar sensor, capable of imaging the full frontal hemisphere, mounted on a mobile platform. We show how the cues derived from this acoustic flow field can be used directly in a layered control strategy that enables a robotic platform to perform a set of motion primitives, such as obstacle avoidance, corridor following, and negotiating corners and T-junctions. The programmable nature of the sonar's spatial sampling pattern allows efficient support of the varying information requirements of the different motion primitives. The proposed control strategy is first validated in a simulated environment and subsequently transferred to a real mobile robot. We present simulated and experimental results on the controller's performance while executing the different motion primitives. The results further show that the proposed control strategy can easily integrate minimal steering commands given by a user (e.g., for an electric wheelchair) or by a high-level navigation module (e.g., for autonomous simultaneous localization and mapping (SLAM) applications).
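
To make the layered idea concrete, the following minimal Python sketch shows how flow-derived proximity cues might gate and blend an external steering command (from a user or a SLAM module) with a reactive avoidance layer. All names, gains, and the blending rule are illustrative assumptions, not the paper's actual controller.

    # Hypothetical sketch of a layered control strategy driven by acoustic
    # flow cues. FlowCues, the gains, and the blending rule are illustrative
    # assumptions, not the authors' implementation.
    from dataclasses import dataclass

    @dataclass
    class FlowCues:
        left_proximity: float   # flow-derived nearness of obstacles on the left (0..1)
        right_proximity: float  # flow-derived nearness of obstacles on the right (0..1)
        front_proximity: float  # flow-derived nearness straight ahead (0..1)

    def avoidance_layer(cues: FlowCues, k_turn: float = 1.0) -> float:
        """Reactive layer: steer away from the side with the stronger flow cue."""
        return k_turn * (cues.left_proximity - cues.right_proximity)

    def layered_control(cues: FlowCues, user_steer: float = 0.0,
                        v_max: float = 0.5) -> tuple[float, float]:
        """Blend a minimal external steering command with the reactive layer.

        Returns (forward_velocity, turn_rate). The avoidance layer dominates
        as frontal proximity grows, so external commands are overridden near
        obstacles and passed through in free space.
        """
        w = cues.front_proximity  # 0 = free space, 1 = imminent obstacle
        turn = (1.0 - w) * user_steer + w * avoidance_layer(cues)
        speed = v_max * (1.0 - cues.front_proximity)  # slow down when blocked
        return speed, turn

    # Example: corridor following with a slight user bias to the left.
    speed, turn = layered_control(FlowCues(0.6, 0.2, 0.1), user_steer=-0.1)

Under this sketch, each motion primitive would only need the subset of cues it consumes, which is where a programmable spatial sampling pattern could save measurement effort.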