Abstract

Optic flow, i.e., the retinal image motion resulting from ego-motion, is a crucial source of information that flying insects use for obstacle avoidance and course control. Optic flow analysis may also prove promising for mobile robotics, although it is currently not among the standard techniques there. Insects have evolved a computationally cheap mechanism for analysing image motion, described by detailed computational models, the so-called elementary motion detectors (EMDs). However, the technical application of EMDs is complicated by the strong effect of local pattern contrast on their motion response. Here we present augmented versions of an EMD, the (s)cc-EMDs, which normalise their responses for contrast and thereby reduce the sensitivity to contrast changes. Velocity changes of moving natural images are thus reflected more reliably in the detector response. The (s)cc-EMDs can easily be implemented in hardware and software and can serve as valuable novel visual motion sensors for mobile robots.

Highlights

  • When a mobile robot or an animal moves, the images of the environment move on its cameras’ sensors or on its eyes’ retinae, respectively

  • This new model was developed predominantly with a focus on usability in robotics. It implements dynamic normalisation of the response amplitude of the elementary motion detectors (EMDs) with respect to the local contrast of the input image, based on an approximate computation of the correlation coefficient of the signals of adjacent photoreceptors (see the sketch after this list). We show that this augmentation largely suppresses modulations of the response of an EMD array that are unrelated to velocity, making the signals potentially more useful for the control of mobile robots

  • Flying insects use optic flow information for course stabilisation, obstacle avoidance and navigation. They extract and analyse this information in their tiny brains using a computationally cheap process [6,7,9]. This process is based on local motion estimates computed by elementary motion detectors (EMDs)
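
The following Python sketch illustrates the principle under stated assumptions; it is not the paper's exact (s)cc-EMD. It combines a classic Reichardt-type correlator with a division by running estimates of the input signals' standard deviations, which approximates the correlation coefficient of the two photoreceptor signals and removes the quadratic contrast dependence of the raw EMD output. The filter parameters and the exponential-moving-average statistics are illustrative choices, not values from the paper.

```python
import numpy as np

def lowpass(x, alpha):
    """First-order discrete low-pass filter; alpha in (0, 1] sets the
    time constant (smaller alpha = longer memory)."""
    y = np.empty(len(x), dtype=float)
    acc = 0.0
    for i, v in enumerate(x):
        acc += alpha * (v - acc)
        y[i] = acc
    return y

def cc_emd(a, b, alpha_delay=0.05, alpha_stat=0.01, eps=1e-6):
    """Contrast-normalised correlation-type EMD (illustrative sketch,
    not the exact (s)cc-EMD of the paper).

    a, b : luminance time series of two adjacent photoreceptors.
    Returns a motion signal whose sign encodes the direction of motion.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    # Work on brightness fluctuations rather than absolute luminance.
    a = a - lowpass(a, alpha_stat)
    b = b - lowpass(b, alpha_stat)
    # Classic Reichardt correlator: the delayed signal of one arm is
    # multiplied with the undelayed signal of the other arm, and the
    # mirror-symmetric product is subtracted.
    raw = lowpass(a, alpha_delay) * b - lowpass(b, alpha_delay) * a
    # Divide by running standard-deviation estimates of both inputs.
    # This approximates a correlation coefficient and removes the
    # quadratic dependence of the raw output on pattern contrast.
    var_a = lowpass(a * a, alpha_stat)
    var_b = lowpass(b * b, alpha_stat)
    return raw / (np.sqrt(var_a * var_b) + eps)
```

Because the raw correlator output scales with the square of pattern contrast while the product of the standard deviations does too, the normalised response is nearly contrast-independent, as a quick check shows:

```python
t = np.arange(0.0, 2.0, 0.001)
for contrast in (1.0, 0.2):
    # The same drifting sinusoidal pattern at two contrasts yields
    # nearly identical normalised responses.
    a = contrast * np.sin(2 * np.pi * 5 * t)
    b = contrast * np.sin(2 * np.pi * 5 * t - 0.5)
    print(contrast, cc_emd(a, b)[-1])
```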



Introduction

When a mobile robot or an animal moves, the images of the environment move on its cameras’ sensors or on its eyes’ retinae, respectively. These image movements, termed optic flow, can be a valuable source of information about both the ego-motion of the agent and the spatial structure of the environment [1]. The optic flow generated by translatory movements reflects the distances of objects in the environment, because the images of objects close to the moving observer move faster on the sensor than those of more distant objects. Computer vision approaches to image motion estimation, in contrast, typically involve iterative smoothing, which makes them computationally expensive [2,3].
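
The distance dependence follows from the standard geometry of translational optic flow (textbook material, not a result of this paper): for pure translation at speed v, a point at distance d and bearing theta from the direction of travel induces an apparent angular velocity of (v/d)·sin(theta), so halving the distance doubles the retinal speed. The function name below is a hypothetical illustration:

```python
import numpy as np

def translational_flow(v, d, theta):
    """Apparent angular velocity (rad/s) of a point at distance d (m)
    and bearing theta (rad) from the heading, for translation at v (m/s)."""
    return (v / d) * np.sin(theta)

# An object 1 m to the side sweeps across the retina ten times faster
# than an object 10 m to the side, at the same forward speed:
print(translational_flow(v=1.0, d=1.0, theta=np.pi / 2))   # 1.0 rad/s
print(translational_flow(v=1.0, d=10.0, theta=np.pi / 2))  # 0.1 rad/s
```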

