Abstract

For use in autonomous micro air vehicles (MAVs), visual sensors must be not only small, lightweight and insensitive to lighting variations; on-board autopilots also require fast and accurate optical flow measurements over a wide range of speeds. In this article, we present comparative tests of two optical flow calculation algorithms running on an auto-adaptive, bio-inspired Michaelis–Menten Auto-adaptive Pixel (MAPix) analog silicon retina, operating under lighting conditions spanning seven decades of irradiance (i.e., from 0.2 to 12,000 lux in terms of human vision). The contrast “time of travel” between two adjacent light-sensitive pixels was determined both by thresholding and by cross-correlating the two pixels’ signals, with a measurement frequency of up to 5 kHz for each of the 10 local motion sensors of the MAPix sensor. While both algorithms adequately measured optical flow between 25°/s and 1000°/s, thresholding yielded lower precision, mainly because of a larger number of outliers at high speeds. Compared to thresholding, cross-correlation also provided a higher optical flow output rate (1195 Hz versus 99 Hz) but required substantially more computational resources.
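The two time-of-travel schemes compared in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the Gaussian test signals, and the inter-pixel angle used below are our own assumptions, and real pixel signals would of course be noisier.

```python
import numpy as np

def time_of_travel_threshold(sig1, sig2, fs, thresh):
    """Thresholding scheme (sketch): the time of travel is the delay
    between the two pixel signals first crossing a fixed threshold."""
    t1 = np.argmax(sig1 > thresh)  # index of first sample above threshold
    t2 = np.argmax(sig2 > thresh)
    return (t2 - t1) / fs

def time_of_travel_xcorr(sig1, sig2, fs):
    """Cross-correlation scheme (sketch): the time of travel is the lag
    that maximizes the cross-correlation of the two pixel signals."""
    sig1 = sig1 - sig1.mean()
    sig2 = sig2 - sig2.mean()
    corr = np.correlate(sig2, sig1, mode="full")
    lag = np.argmax(corr) - (len(sig1) - 1)  # zero lag sits at len-1
    return lag / fs

def optical_flow_deg_per_s(delta_t, inter_pixel_angle_deg):
    """OF magnitude = inter-pixel viewing angle / time of travel."""
    return inter_pixel_angle_deg / delta_t
```

With a 5 kHz sampling rate (the measurement frequency quoted above) and a hypothetical inter-pixel angle of a few degrees, a 10 ms time of travel maps to an optical flow of a few hundred degrees per second, consistent with the 25–1000°/s range reported.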

Highlights

  • Lighting was provided by three sources: daylight coming through the top windows, fluorescent tubes attached to the ceiling and an LED projector that could be positioned in front of the moving pattern for high luminosity conditions

  • For each light level, the optical flow (OF) measurements of the 10 local motion sensors (LMSs), obtained while the pattern moved sinusoidally, were overlaid on a single sinusoidal period so that all of the measurements could be shown in one figure

  • In the field of robotics, working over a seven-decade range of irradiance is a considerable advantage, because robots may have to operate in environments subject to strong lighting variations, such as forests, urban canyons or the inside of buildings
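The overlay described in the second highlight amounts to folding all measurements onto a single period of the pattern's sinusoidal motion. A minimal sketch, assuming the pattern period is known (the function and variable names here are ours, not the paper's):

```python
import numpy as np

def fold_onto_one_period(timestamps, values, period):
    """Fold measurements from many sinusoidal periods onto one period:
    each timestamp is reduced modulo the period, then samples are
    sorted by their position (phase) within that single period."""
    phase = np.mod(timestamps, period)
    order = np.argsort(phase)
    return phase[order], values[order]
```

Sorting by phase lets measurements taken during different periods be plotted as one dense curve against a single reference sinusoid.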

Introduction

To detect and avoid obstacles in unpredictable environments, flying insects rely heavily on optical flow (OF) [1], defined as the vector field of angular velocities of contrasted points, edges or surfaces resulting from the relative motion between the observer and the surrounding objects [2,3]. Their OF-based strategies provide inspiration for the development of smart autopilots for micro air vehicles (MAVs) [1,4,5] and smart artificial retinas [6,7,8,9,10,11,12]. Insect-sized MAVs are increasingly becoming a reality [13,14,15,16,17,18] and will have to be fitted in the future with sensors and flight control devices enabling them to perform all kinds of aerial maneuvers, including ground and obstacle avoidance, terrain following and landing. The fitted sensors should be non-emissive, to allow an MAV to save energy and remain stealthy in flight, but must guarantee a high refresh rate, because GPS (Global Positioning System) signals are limited in both spatial (∼1 m) and temporal (∼7 Hz) resolution. […] a PAL (Phase Alternating Line) camera with 720 × 576 pixels at 25 fps [16] or a CMOS (Complementary Metal-Oxide Semiconductor) camera with 752 × 480 pixels at 80 fps [19], including visual-based simultaneous localization and mapping (SLAM) algorithms.

