Abstract

Autonomous Vehicles (AVs) have advanced rapidly in recent years, as they promise to be safe and to reduce the burden of the driving task. AVs share the road with various categories of vehicles, including Emergency Vehicles (EMVs) (e.g., police cars and ambulances). When approached by an active EMV, all vehicles are expected to cooperate with the EMV so that its travel time is minimized. The decision-making block of an AV is responsible for instructing the AV to change lanes, a task typically handled by the Lane Change Decision (LCD) model. A typical LCD model tends to overlook the presence of nearby EMVs, as it neglects the impact of a lane change on the EMV's utility. To address this challenge, this paper proposes an Emergency-Vehicle-Aware LCD based on Deep Reinforcement Learning (DRL). To the best of our knowledge, this is one of the pioneering works that propose a DRL solution to this problem, addressing important limitations that have been identified. The proposed solution was evaluated against a rule-based LCD known as MOBIL in terms of safety and the level of cooperativeness with the EMV. Key findings from the comparison are: (1) the two approaches achieve identical safety levels, (2) the proposed solution takes far less time to yield the lane when approached by an EMV, and (3) the proposed solution never blocks the path of the EMV, whereas MOBIL occasionally does.
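
To illustrate what an "EMV-aware" lane-change objective could look like in a DRL setting, the sketch below shows a minimal reward function that combines ego safety with cooperativeness toward an approaching EMV. The structure, names, and weights (`w_safety`, `w_block`, `w_change`, the 50 m blocking horizon) are illustrative assumptions and not the paper's actual formulation.

```python
# Minimal sketch of an EMV-aware reward for a lane-change DRL agent.
# All names, weights, and thresholds are illustrative assumptions,
# not the paper's actual reward design.
from dataclasses import dataclass


@dataclass
class Observation:
    ego_lane: int        # lane index currently occupied by the AV
    emv_lane: int        # lane index occupied by the approaching EMV
    emv_gap: float       # longitudinal gap to the EMV in metres (positive = EMV behind)
    collision: bool      # whether the last action caused a collision
    lane_change: bool    # whether the last action was a lane change


def emv_aware_reward(obs: Observation,
                     w_safety: float = 10.0,
                     w_block: float = 2.0,
                     w_change: float = 0.1) -> float:
    """Combine ego safety with cooperativeness toward the EMV.

    The agent is penalised heavily for collisions, penalised for staying
    in the EMV's lane while the EMV closes in from behind, and lightly
    penalised for each lane change to discourage erratic behaviour.
    """
    reward = 0.0
    if obs.collision:
        reward -= w_safety
    # Blocking penalty grows as the EMV gets closer behind the AV.
    if obs.ego_lane == obs.emv_lane and 0.0 < obs.emv_gap < 50.0:
        reward -= w_block * (1.0 - obs.emv_gap / 50.0)
    if obs.lane_change:
        reward -= w_change
    return reward


if __name__ == "__main__":
    # Example: AV still in the EMV's lane with the EMV 20 m behind.
    obs = Observation(ego_lane=1, emv_lane=1, emv_gap=20.0,
                      collision=False, lane_change=False)
    print(emv_aware_reward(obs))  # negative: the AV is blocking the EMV
```

Under such a shaping, yielding the lane early removes the blocking penalty, which is consistent with the reported behaviour of giving up the lane sooner than MOBIL while keeping collisions penalised.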
