Abstract

Multi-sensor data fusion is the process of integrating data from multiple sensors to derive information that is more accurate, consistent, and useful than what any individual sensor can provide. Conventional data-fusion methods require complex models and computationally expensive algorithms. This paper proposes a deep-neural-network-based method for multi-sensor data fusion using detections from both Radar and IFF sensors. A deep-learning-based Long Short-Term Memory (LSTM) model performs the fusion; LSTM models overcome the exploding- and vanishing-gradient problems of plain Recurrent Neural Networks. A bidirectional LSTM forms tracks from sensor observations using the proposed association scheme. The proposed fusion architecture can also be extended to sensors other than Radar and IFF. Tracks generated from the detections of both sensors are closer to ground truth than tracks generated from either sensor alone. Fusing data from multiple sensors improves accuracy in a wide range of applications, such as target tracking and battlefield surveillance.
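The abstract's two key claims, that LSTM gating mitigates vanishing gradients and that a bidirectional pass is used over the observation sequence, can be illustrated with a minimal sketch. This is not the paper's implementation; the cell equations below are the standard LSTM formulation, and all dimensions, weights, and the weight-sharing between the forward and backward passes are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One standard LSTM step. The additive cell-state update
    (f * c_prev + i * g) is what mitigates the vanishing gradients
    of a plain RNN's purely multiplicative recurrence."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # four stacked gate pre-activations
    i = sigmoid(z[0:n])               # input gate
    f = sigmoid(z[n:2*n])             # forget gate
    o = sigmoid(z[2*n:3*n])           # output gate
    g = np.tanh(z[3*n:4*n])           # candidate cell state
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

# Toy sizes (assumed for illustration): 5 observations, 4 features, 3 hidden units.
rng = np.random.default_rng(0)
T, n_in, n_hid = 5, 4, 3
W = rng.standard_normal((4 * n_hid, n_in)) * 0.1
U = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)
obs = rng.standard_normal((T, n_in))  # stand-in for a sensor observation sequence

def run(seq):
    """Run the LSTM over a sequence, returning per-step hidden states."""
    h, c = np.zeros(n_hid), np.zeros(n_hid)
    hs = []
    for x in seq:
        h, c = lstm_step(x, h, c, W, U, b)
        hs.append(h)
    return np.stack(hs)

# Bidirectional pass: process the sequence forward and backward, then
# concatenate per-step features (weights shared here only for brevity).
fwd = run(obs)
bwd = run(obs[::-1])[::-1]
bi = np.concatenate([fwd, bwd], axis=1)   # shape (T, 2 * n_hid)
```

In a trained model, each per-step bidirectional feature vector would feed a downstream association or fusion head; here the weights are random, so only the shapes and gating structure are meaningful.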
