Abstract

A single sensor, such as a 3D LiDAR camera, is limited in its ability to provide comprehensive environmental information, even though its perception results are accurate. Multiple sensors are therefore preferred for surveillance tasks in both tactical and civilian scenarios. Cooperative perception is one solution: it enables sensors to share sensory information with other sensors and with infrastructure, extending coverage and improving the detection accuracy of surrounding objects for better safety and path planning. However, the large volume of sensory data exchanged among multiple sensors over the wireless network must be managed efficiently to maintain real-time sensing. In this work, we design a complete cooperative perception framework that integrates networking, image processing, and data fusion technologies to enhance situational awareness with multiple sensors. The framework uses information-centric networking and deep reinforcement learning.
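To make the data-fusion step of cooperative perception concrete, the sketch below merges object detections reported by several sensors in a shared coordinate frame. It is a generic late-fusion baseline (IoU grouping plus confidence-weighted box averaging), not the paper's actual ICN/DRL pipeline; the `Detection` class, the `fuse` function, and the IoU threshold are illustrative assumptions.

```python
"""Minimal late-fusion sketch for cooperative perception.

Hypothetical illustration: the fusion rule here is a common baseline,
not the method proposed in this work. All names and thresholds are
assumptions made for the example.
"""
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    sensor_id: str
    box: Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) in a shared world frame
    confidence: float                        # detector score in [0, 1]

def iou(a, b) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def fuse(detections: List[Detection], iou_thresh: float = 0.5) -> List[Detection]:
    """Greedy late fusion: group detections that overlap across sensors,
    then average each group's boxes weighted by detector confidence."""
    remaining = sorted(detections, key=lambda d: d.confidence, reverse=True)
    fused = []
    while remaining:
        seed = remaining.pop(0)
        group, rest = [seed], []
        for d in remaining:  # partition by overlap with the highest-confidence seed
            (group if iou(d.box, seed.box) >= iou_thresh else rest).append(d)
        remaining = rest
        total = sum(d.confidence for d in group)
        box = tuple(sum(d.box[i] * d.confidence for d in group) / total
                    for i in range(4))
        fused.append(Detection("fused", box, max(d.confidence for d in group)))
    return fused

if __name__ == "__main__":
    dets = [
        Detection("lidar_1", (10, 10, 20, 20), 0.9),
        Detection("camera_2", (11, 9, 21, 19), 0.7),   # same object seen by another sensor
        Detection("camera_2", (40, 40, 50, 52), 0.8),  # object covered by only one sensor
    ]
    for d in fuse(dets):
        print(d)
```

In this baseline, an object seen by several sensors yields a single fused box, while an object covered by only one sensor is still kept, which is the coverage-extension effect cooperative perception targets.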
