Abstract

In surveillance and monitoring systems, mobile vehicles and unmanned aerial vehicles (UAVs), such as drones, provide advantages in range, maneuverability, and safety, since their ability to move omnidirectionally allows them to explore, identify, and carry out security tasks. These activities must be performed autonomously by capturing data from the environment; such data usually contain errors and uncertainties that degrade the resolution and accuracy of object detection and identification. Data acquisition can be improved by integrating sensor fusion systems that measure the same physical phenomenon with two or more sensors retrieving information simultaneously. This paper uses the constant turn rate and velocity (CTRV) kinematic model of a drone, including the angular velocity that was not considered in previous works, as a complementary alternative for fusing LiDAR and radar data retrieved by UAVs, and applies the extended Kalman filter (EKF) to detect moving targets. The performance of the EKF is evaluated on a dataset that jointly includes position data captured by a LiDAR and a radar sensor for a moving object following a trajectory with sudden changes. Additive white Gaussian noise is then introduced to degrade the measurements, and the root mean square error (RMSE) is evaluated as the noise power increases. The results show an improvement of 0.4 for object detection over conventional kinematic models that do not account for significant trajectory changes.
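
To make the fusion pipeline concrete, the sketch below is a minimal, illustrative Python implementation of a CTRV-based EKF that fuses LiDAR position measurements with radar range, bearing, and range-rate measurements. It is not the paper's implementation: it uses finite-difference Jacobians and placeholder noise covariances for brevity, and the sample measurements, state initialization, and parameter values are assumptions chosen only to show the predict/update flow.

```python
import numpy as np

# CTRV state: [px, py, v, yaw, yaw_rate]
def ctrv_predict(x, dt):
    """Propagate the CTRV state over dt (handles the near-zero yaw-rate case)."""
    px, py, v, yaw, yd = x
    if abs(yd) > 1e-4:
        px_new = px + v / yd * (np.sin(yaw + yd * dt) - np.sin(yaw))
        py_new = py + v / yd * (-np.cos(yaw + yd * dt) + np.cos(yaw))
    else:  # straight-line motion when the yaw rate is essentially zero
        px_new = px + v * np.cos(yaw) * dt
        py_new = py + v * np.sin(yaw) * dt
    return np.array([px_new, py_new, v, yaw + yd * dt, yd])

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of f at x (used here for brevity instead of the analytic form)."""
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - fx) / eps
    return J

def ekf_predict(x, P, dt, Q):
    """EKF prediction: propagate the state and covariance through the CTRV model."""
    F = numerical_jacobian(lambda s: ctrv_predict(s, dt), x)
    return ctrv_predict(x, dt), F @ P @ F.T + Q

def ekf_update(x, P, z, h, R):
    """Generic EKF update for a (possibly nonlinear) measurement model h."""
    H = numerical_jacobian(h, x)
    y = z - h(x)
    if y.size == 3:  # radar measurement: wrap the bearing residual to [-pi, pi]
        y[1] = np.arctan2(np.sin(y[1]), np.cos(y[1]))
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(x.size) - K @ H) @ P

# Measurement models: LiDAR gives (px, py); radar gives (range, bearing, range rate).
h_lidar = lambda s: s[:2]
def h_radar(s):
    px, py, v, yaw, _ = s
    rho = np.hypot(px, py)
    phi = np.arctan2(py, px)
    rho_dot = (px * v * np.cos(yaw) + py * v * np.sin(yaw)) / max(rho, 1e-6)
    return np.array([rho, phi, rho_dot])

if __name__ == "__main__":
    # Illustrative initialization and noise covariances only (assumed values, not the paper's).
    x = np.array([1.0, 1.0, 5.0, 0.1, 0.05])
    P = np.eye(5)
    Q = np.diag([0.05, 0.05, 0.5, 0.02, 0.02])
    R_lidar = np.diag([0.0225, 0.0225])
    R_radar = np.diag([0.09, 0.0009, 0.09])

    x, P = ekf_predict(x, P, dt=0.1, Q=Q)
    x, P = ekf_update(x, P, np.array([1.55, 1.05]), h_lidar, R_lidar)      # synthetic LiDAR sample
    x, P = ekf_update(x, P, np.array([1.9, 0.6, 4.8]), h_radar, R_radar)   # synthetic radar sample
    print("fused state estimate:", x)
```

In a full evaluation, the same predict/update loop would be run over the whole LiDAR/radar dataset with increasing levels of additive white Gaussian noise, and the RMSE between the estimated and ground-truth trajectories would be accumulated at each noise power.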
