Abstract

Emulating the highly resource-efficient processing of visual motion information in the brains of flying insects, a bio-inspired controller for collision avoidance and navigation was implemented on a novel, integrated System-on-Chip-based hardware module. This module controls the visually guided navigation behavior of the stick insect-like hexapod robot HECTOR. By extracting nearness information from visual motion with highly parallelized bio-inspired algorithms running in dynamically reconfigurable logic, HECTOR is able to navigate to predefined goal positions without colliding with obstacles. The system drastically outperforms CPU- and graphics-card-based implementations in speed and resource efficiency, making it suitable also for fast-moving robots such as flying drones.
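
The abstract does not name the motion model, but the standard insect-inspired building block for extracting motion from luminance signals is the correlation-type elementary motion detector (EMD). Below is a minimal NumPy sketch of such a correlator array; the first-order low-pass delay filter, its constant `alpha`, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lowpass(signal, alpha=0.1):
    # First-order temporal low-pass filter: serves as the delay line
    # in the correlator (alpha is an illustrative filter constant).
    out = np.zeros_like(signal, dtype=float)
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + alpha * (signal[t] - out[t - 1])
    return out

def emd_responses(luminance, alpha=0.1):
    # Correlation-type EMD array over a 1D row of photoreceptors.
    # luminance: shape (T, N) -- T time steps, N photoreceptor signals.
    # Returns shape (T, N - 1): signed motion signal between neighbors.
    delayed = np.apply_along_axis(lowpass, 0, luminance, alpha)
    # Mirror-symmetric correlator: delayed left arm times direct right
    # arm, minus delayed right arm times direct left arm.
    return delayed[:, :-1] * luminance[:, 1:] - delayed[:, 1:] * luminance[:, :-1]
```

During pure translation, the magnitude of such motion responses scales inversely with object distance (motion parallax), which is what allows relative nearness to be estimated from visual motion in the first place.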

Highlights

  • A prerequisite for autonomous behavior in mobile robots is the ability to navigate in cluttered terrain without colliding with obstacles

  • Nowadays, autonomous mobile robots rely on active sensors or computationally expensive algorithms (e.g., Lucas-Kanade optic flow computation [2]; see the sketch after this list) to acquire and process relevant environmental information

  • Insects—despite their relatively small body size—show a remarkable behavioral performance when navigating in cluttered environments with minimal energy and computational expenditure
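
As a point of comparison for the conventional, computation-heavy approach mentioned in the second highlight, here is a minimal sparse Lucas-Kanade example using OpenCV's pyramidal implementation. The file names are placeholders, and the cited work [2] may use a different variant of the algorithm.

```python
import cv2
import numpy as np

# Two consecutive grayscale frames from a robot camera
# ("frame0.png"/"frame1.png" are placeholder file names).
prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

# Pick corner features to track, then run pyramidal Lucas-Kanade.
pts = cv2.goodFeaturesToTrack(prev, maxCorners=200,
                              qualityLevel=0.01, minDistance=7)
nxt, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, pts, None)

# Flow vectors for the successfully tracked points.
flow = (nxt - pts)[status.flatten() == 1]
print("mean flow magnitude:", np.linalg.norm(flow, axis=-1).mean())
```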


Introduction

A prerequisite for autonomous behavior in mobile robots is the ability to navigate in cluttered terrain without colliding with obstacles. To assess the performance of the embedded processing platform in a real-world scenario, the system is used to control visual collision avoidance and navigation behavior on the insect-inspired hexapod walking robot HECTOR (Figure 1A; [10, 11]).
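
This excerpt does not detail the controller itself. The following is a hedged sketch of one plausible nearness-based steering rule, in which a vector average of nearness over viewing directions yields an obstacle-repulsion heading that is blended with goal attraction; all names, the blending rule, and the `weight` parameter are illustrative assumptions, not the paper's method.

```python
import numpy as np

def steering_direction(nearness, goal_azimuth, weight=1.0):
    # nearness: 1D array over viewing azimuths (higher = closer obstacle),
    # e.g. derived from EMD/optic-flow magnitude during translation.
    # goal_azimuth: robot-centric bearing of the goal in radians.
    azimuths = np.linspace(-np.pi, np.pi, len(nearness), endpoint=False)
    # Average nearness vector: points toward the densest obstacle region.
    anv = np.array([np.sum(nearness * np.cos(azimuths)),
                    np.sum(nearness * np.sin(azimuths))]) / len(nearness)
    # Repulsion: head opposite the average nearness vector.
    avoid = np.arctan2(-anv[1], -anv[0])
    # Weighted circular blend of goal attraction and obstacle repulsion.
    x = np.cos(goal_azimuth) + weight * np.linalg.norm(anv) * np.cos(avoid)
    y = np.sin(goal_azimuth) + weight * np.linalg.norm(anv) * np.sin(avoid)
    return np.arctan2(y, x)

# With uniform nearness the repulsion cancels out and the robot
# simply heads toward the goal:
print(steering_direction(np.ones(72), goal_azimuth=0.0))
```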

