Abstract

Mobility is fundamental to the wealth and health of the world's population and has a significant influence on our daily lives. However, with the increasing complexity of traffic, the growing need to transport goods, and continuing urbanization, improving the quality of mobility in terms of travel time, use of space, and air quality is becoming more challenging. Autonomous electric vehicles offer the technology and potential for new mobility concepts in smart cities. Today, many vehicles with automated driving capabilities have been developed, but a safety driver is still required to intervene whenever the vehicle cannot handle a situation in a way that is both safe and reliable.

One important prerequisite for achieving these safety and reliability goals is a robust and efficient perception of the vehicle's environment. The tragic 2018 accident in which an Uber self-driving car killed a pedestrian highlighted the importance of perception in autonomous driving: in its investigation, the U.S. National Transportation Safety Board found that both the self-driving system and its safety driver failed to detect the pedestrian in time. Since this accident, the autonomous vehicle industry has worked to improve perception systems through advanced sensors, machine learning algorithms, and other technologies.

A variety of sensor technologies are used in vehicles to detect objects and perceive the surroundings. Cameras and radars are among the most widely used sensing technologies, as they are inexpensive and reliable, and because they operate at different wavelengths, they are not susceptible to common-mode errors. While some companies rely solely on cameras and radars, others also include lidars in their sensing systems. These sensors are widely used in the industry, and the technology itself is continuously enhanced, leading to new developments such as 4D lidar, which can measure not only the distance of an object but also its velocity by evaluating the phase shift of the returned light. Various methods for perceiving the environment exist, relying on different kinds of sensors. Machine learning-based methods have evolved rapidly in recent years and currently lead the field in perception, particularly in the tasks of object detection and classification.
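The velocity measurement mentioned above is Doppler-based: a target moving with radial velocity v shifts the frequency of the reflected light by f_d = 2v/λ, which the lidar recovers from the phase/frequency of the return. A minimal sketch of this relationship is shown below; the wavelength and shift values are illustrative assumptions, not figures from the text:

```python
# Sketch: radial velocity from the Doppler shift of a lidar return.
# The round-trip reflection doubles the shift: f_d = 2 * v / wavelength,
# so v = f_d * wavelength / 2 (positive = target approaching).
# Numbers below are illustrative, not taken from the paper.

def radial_velocity(doppler_shift_hz: float, wavelength_m: float) -> float:
    """Radial target velocity (m/s) from the measured Doppler frequency shift."""
    return doppler_shift_hz * wavelength_m / 2.0

# Example: a 1550 nm lidar observing a ~12.9 MHz Doppler shift
v = radial_velocity(12.9e6, 1550e-9)
print(f"{v:.2f} m/s")  # ≈ 10 m/s
```

This is why such sensors are called "4D": each return carries three spatial coordinates plus a per-point radial velocity, without the frame-to-frame tracking a conventional lidar would need.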
