Abstract

Today, perception solutions for Automated Vehicles rely on sensors on board the vehicle, which are limited by the line of sight and by occlusions caused by other elements on the road. As an alternative, Vehicle-to-Everything (V2X) communications allow vehicles to cooperate and enhance their perception capabilities. Besides announcing a vehicle's own presence and intentions, services such as the Collective Perception Service (CPS) aim to share information about perceived objects as a high-level description. This work proposes a perception framework for fusing information from on-board sensors and data received via Collective Perception Messages (CPMs). To that end, the environment is modeled using an occupancy grid in which occupied, free, and uncertain space are considered. For each sensor, including V2X, independent grids are computed from sensor measurements and their uncertainties and then fused in terms of both occupancy and confidence. Moreover, a Particle Filter propagates cell occupancy from one step to the next, allowing for object tracking. The proposed framework was validated in a set of experiments using real vehicles and infrastructure sensors sensing static and dynamic objects. Results showed good performance even under significant uncertainties and delays, validating the viability of the proposed framework for Collective Perception.
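A minimal sketch of the per-sensor grid fusion idea described above is given below, assuming a simple confidence-weighted log-odds combination; the grids, confidence weights, and the function fuse_occupancy_grids are illustrative assumptions, not the paper's exact fusion rule.

```python
import numpy as np

def fuse_occupancy_grids(grids, confidences, eps=1e-6):
    """Fuse per-sensor occupancy grids into a single grid.

    grids       : list of 2-D arrays with P(occupied) per cell in [0, 1];
                  0.5 encodes 'uncertain' (no information).
    confidences : list of 2-D arrays (same shape) with per-cell confidence
                  weights in [0, 1], e.g. derived from sensor uncertainty.
    Returns the fused P(occupied) grid and a combined confidence grid.
    """
    fused_logodds = np.zeros_like(grids[0], dtype=float)
    fused_conf = np.zeros_like(grids[0], dtype=float)
    for p, w in zip(grids, confidences):
        p = np.clip(p, eps, 1.0 - eps)
        # Confidence-weighted log-odds: cells a sensor knows nothing about
        # (p = 0.5 or w = 0) contribute nothing to the fusion.
        fused_logodds += w * np.log(p / (1.0 - p))
        fused_conf = np.maximum(fused_conf, w)
    fused_p = 1.0 / (1.0 + np.exp(-fused_logodds))
    return fused_p, fused_conf

# Example: a lidar grid and a V2X/CPM-derived grid covering the same area.
lidar_grid = np.full((4, 4), 0.5); lidar_grid[1, 1] = 0.9   # detected object
cpm_grid   = np.full((4, 4), 0.5); cpm_grid[1, 1] = 0.8     # reported object
lidar_conf = np.full((4, 4), 0.9)                           # close-range, trusted
cpm_conf   = np.full((4, 4), 0.6)                           # delayed/remote source
fused, conf = fuse_occupancy_grids([lidar_grid, cpm_grid], [lidar_conf, cpm_conf])
print(fused[1, 1], conf[1, 1])
```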

Highlights

  • As Advanced Driver Assistance Systems (ADAS) evolve, more sensor technologies are embedded within commercial vehicles for environment perception

  • Weather conditions do not affect all sensor technologies in the same manner; e.g., radars perform better than cameras in rain or fog, while cameras are much more sensitive to lighting conditions

  • Although it contains a high-level description of the objects perceived by external sources, Collective Perception Message (CPM) data is fused into the occupancy grid rather than directly into the multi-object tracking (see the sketch after this list)
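A minimal sketch of how a CPM-reported object could be rasterized into occupancy-grid cells before any tracking is applied, assuming a hypothetical object dictionary (x, y, length, width, heading) and a conservative axis-aligned footprint; the actual CPM object description and the paper's inclusion step are more elaborate.

```python
import math
import numpy as np

def cpm_object_to_grid(grid, obj, cell_size, origin):
    """Mark the footprint of a CPM-reported object as occupied.

    grid      : 2-D array of P(occupied), modified in place.
    obj       : dict with hypothetical keys 'x', 'y' (centre, metres),
                'length', 'width' (metres), 'heading' (radians).
    cell_size : grid resolution in metres per cell.
    origin    : (x0, y0) world coordinates of grid cell (0, 0).
    """
    # Corners of the oriented bounding box in world coordinates.
    c, s = math.cos(obj['heading']), math.sin(obj['heading'])
    half_l, half_w = obj['length'] / 2.0, obj['width'] / 2.0
    corners = [(obj['x'] + c * dx - s * dy, obj['y'] + s * dx + c * dy)
               for dx in (-half_l, half_l) for dy in (-half_w, half_w)]
    xs = [p[0] for p in corners]; ys = [p[1] for p in corners]
    # Conservative axis-aligned rasterization of the box extent.
    i0 = int((min(xs) - origin[0]) / cell_size)
    i1 = int((max(xs) - origin[0]) / cell_size)
    j0 = int((min(ys) - origin[1]) / cell_size)
    j1 = int((max(ys) - origin[1]) / cell_size)
    grid[max(i0, 0):i1 + 1, max(j0, 0):j1 + 1] = 0.9  # high occupancy belief
    return grid

grid = np.full((50, 50), 0.5)                      # 0.5 = unknown space
car = {'x': 10.0, 'y': 12.0, 'length': 4.5, 'width': 1.8, 'heading': 0.3}
cpm_object_to_grid(grid, car, cell_size=0.5, origin=(0.0, 0.0))
```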


Summary

Introduction

As Advanced Driver Assistance Systems (ADAS) evolve, more sensor technologies are embedded within commercial vehicles for environment perception. In parallel, Vehicle-to-Everything (V2X) communications enable Intelligent Transportation Systems (ITS) applications, one of the most representative and successful examples being the Cooperative Awareness Message (CAM). This message standardizes the dissemination of the ego-vehicle state, including information such as position, speed, heading, or vehicle type. In [2], the Environmental Perception Message (EPM) was proposed to describe static and moving objects, in addition to information about the ego-vehicle and its on-board sensors. ETSI is working on the standardization of the Collective Perception Message (CPM) [3], which in its current version considers information about perceived objects, the reporting sensors, and the free space. In this work, the environment is modeled using a Dynamic Occupancy Grid (DOG) in which occupied, free, and uncertain space are considered, so that sensor measurements and their associated uncertainties can be taken into account.
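For illustration, a minimal CAM-like ego-state report could look as follows; the class and field names (AwarenessState, StationType, etc.) are simplified assumptions, since the actual ETSI CAM structure and station-type catalogue are far richer.

```python
from dataclasses import dataclass
from enum import Enum, auto


class StationType(Enum):
    """Illustrative station categories (the ETSI catalogue is much larger)."""
    PASSENGER_CAR = auto()
    TRUCK = auto()
    ROADSIDE_UNIT = auto()


@dataclass
class AwarenessState:
    """Minimal CAM-like ego-state report: position, speed, heading, type."""
    station_id: int
    latitude_deg: float
    longitude_deg: float
    speed_mps: float
    heading_rad: float        # assumed convention: 0 = north, clockwise positive
    station_type: StationType
    timestamp_ms: int         # generation time, used to compensate reception delays


msg = AwarenessState(station_id=42, latitude_deg=40.45, longitude_deg=-3.73,
                     speed_mps=8.3, heading_rad=1.57,
                     station_type=StationType.PASSENGER_CAR,
                     timestamp_ms=1_700_000_000_000)
```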

Related Work
Perception Framework
Pre-Processing Task
Instant Occupancy Grid
Dynamic Occupancy Grid
Object-Level Tracking
V2X Communications
Enhanced Perception
CPM’s Pre-Processing Step
CPM Inclusion into the Occupancy Grid
CPM Inclusion into the Dynamic Occupancy Grid
Experiments
Analysis of Reliability for CPM State Prediction
Real Scenario 1
Real Scenario 2
Conclusions and Future Works

