Abstract
Perceiving and avoiding obstacles at a distance is very difficult for visually impaired people (VIP). To address this problem, we propose a sensor fusion system that combines an RGB-depth (RGB-D) sensor and a millimeter-wave (MMW) radar sensor to perceive surrounding obstacles. The positions and velocities of multiple targets are detected by the MMW radar based on the frequency-modulated continuous-wave (FMCW) principle. The depth and position information of the obstacles is verified by the RGB-D sensor based on the MeanShift algorithm. Data fusion based on the joint probabilistic data association (JPDA) algorithm and the Kalman filter enables the navigation assistance system to obtain more accurate state estimates than either sensor alone. A nonsemantic stereophonic interface conveys the obstacle detection results to the VIP. Experimental results show that multiple objects at different ranges and angles are detected by the radar and the RGB-D sensor, and that the effective detection range is extended to 80 m compared with using the RGB-D sensor alone. Moreover, the measurements remain stable under diverse illumination conditions. As a wearable system, the proposed sensor fusion system is versatile, portable, and cost-effective.
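The core of the fusion step described above is a standard linear Kalman filter that predicts a target's state and then corrects it with measurements from both sensors. The JPDA association step is omitted here; the sketch below is a minimal single-target, range-and-velocity illustration in which all matrices, noise covariances, and measurement values are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def predict(x, P, F, Q):
    """Propagate state x and covariance P through the motion model F."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Correct (x, P) with measurement z, measurement model H, noise R."""
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

# Constant-velocity model for one obstacle: state = [range (m), radial velocity (m/s)]
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
Q = np.diag([0.01, 0.01])                # process noise (assumed)

H_radar = np.eye(2)                      # MMW radar observes range and radial velocity
R_radar = np.diag([0.5, 0.1])            # radar measurement noise (assumed)
H_rgbd = np.array([[1.0, 0.0]])          # RGB-D sensor observes range (depth) only
R_rgbd = np.array([[0.05]])              # depth measurement noise (assumed)

x = np.array([10.0, -1.0])               # initial guess: 10 m away, closing at 1 m/s
P = np.eye(2)

# One fusion cycle: predict, then sequentially update with each sensor
x, P = predict(x, P, F, Q)
x, P = update(x, P, np.array([9.8, -1.1]), H_radar, R_radar)   # radar reading (assumed)
x, P = update(x, P, np.array([9.7]), H_rgbd, R_rgbd)           # RGB-D reading (assumed)
```

Fusing both sensors drives the range uncertainty `P[0, 0]` well below what either measurement alone would give, which is the "more accurate state estimates" benefit the abstract claims.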