Abstract

This paper presents a systematic scheme for fusing a millimeter wave (MMW) radar and a monocular vision sensor for on-road obstacle detection. A three-level fusion strategy based on the visual attention mechanism and the driver's visual consciousness is proposed for MMW radar and monocular vision fusion in order to obtain better overall performance. An experimental method for radar-vision point alignment is then put forward; it is easy to carry out and requires neither radar reflection-intensity data nor special tools. Furthermore, a region searching approach for potential target detection is derived to reduce image processing time. An adaptive thresholding algorithm based on a new understanding of shadows in the image is adopted for obstacle detection, and edge detection is used to assist in determining obstacle boundaries. The proposed fusion approach is verified on real examples of on-road vehicle and pedestrian detection, and the experimental results show that the method is simple and feasible.
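
To make the pipeline concrete, the sketch below illustrates the general idea of radar-guided region searching followed by shadow-based adaptive thresholding and edge detection. It is a minimal illustration under stated assumptions, not the authors' implementation: the projection matrix, the ROI size, and the threshold parameters are hypothetical placeholders that would come from the point-alignment step in practice.

import numpy as np
import cv2

# Hypothetical 3x4 projection matrix obtained from the radar-vision
# point alignment step (camera intrinsics combined with the
# radar-to-camera extrinsics).
P = np.array([
    [800.0,   0.0, 320.0, 0.0],
    [  0.0, 800.0, 240.0, 0.0],
    [  0.0,   0.0,   1.0, 0.0],
])

def radar_to_pixel(x, y, z):
    """Project a radar detection (metres, camera-aligned frame) to pixel coordinates."""
    u, v, w = P @ np.array([x, y, z, 1.0])
    return int(u / w), int(v / w)

def roi_around(img, u, v, half_w=80, half_h=60):
    """Crop a region of interest centred on the projected radar point."""
    h, w = img.shape[:2]
    return img[max(v - half_h, 0):min(v + half_h, h),
               max(u - half_w, 0):min(u + half_w, w)]

def detect_obstacle(roi_gray):
    """Adaptive threshold to pick out the dark shadow beneath an obstacle,
    plus Canny edges to help delimit its boundary."""
    shadow = cv2.adaptiveThreshold(roi_gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, 31, 10)
    edges = cv2.Canny(roi_gray, 50, 150)
    return shadow, edges

if __name__ == "__main__":
    # Synthetic grey frame standing in for a camera image; the dark patch
    # mimics the shadow region beneath a vehicle roughly 20 m ahead.
    frame = np.full((480, 640), 180, np.uint8)
    frame[300:340, 250:390] = 40
    u, v = radar_to_pixel(x=0.5, y=1.0, z=20.0)
    shadow, edges = detect_obstacle(roi_around(frame, u, v))
    print("ROI size:", shadow.shape, "shadow pixels:", int((shadow > 0).sum()))

Searching only inside radar-generated regions of interest, rather than the whole frame, is what keeps the image processing time low in a fusion scheme of this kind.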

Highlights

  • For the engineering development of autonomous mobile robots such as autonomous ground vehicles (AGVs) and unmanned aerial vehicles (UAVs), real-time and reliable obstacle detection in the surrounding environment is a prerequisite for precise navigation and control

  • This paper aims to integrate an MMW radar and a monocular camera for on-road obstacle detection, as part of our research on AGV navigation in urban road environments

  • Because MMW radar works in all weather conditions and its performance is not degraded by dust or fog, the integration of MMW radar with vision sensors has attracted increasing attention, for AGVs as well as for planetary rovers and military applications

Introduction

For the engineering development of autonomous mobile robots such as autonomous ground vehicles (AGVs) and unmanned aerial vehicles (UAVs), real-time and reliable obstacle detection in the surrounding environment is a prerequisite for precise navigation and control. Because they operate in complex and dynamic environments, mobile robots must be equipped with different types of sensors to handle environmental perception and recognition. Considering the all-weather operation and strong detection capability of MMW radar together with the low cost of a monocular vision sensor, this paper constructs a novel radar-vision fusion architecture for mobile vehicle navigation.
