Abstract

Explicit signal coordination carries prior knowledge of traffic engineering and is widely accepted for global implementation. With the recent popularity of reinforcement learning, numerous researchers have turned to implicit signal coordination. However, these methods inevitably require learning coordination from scratch. To maximize the use of prior knowledge, this study proposes an explicit coordinated signal control (ECSC) method that uses a soft actor–critic for cycle length determination. This method can fundamentally resolve the challenges traditional methods face in determining the cycle length. The soft actor–critic was selected from among various reinforcement learning methods. A single agent is assigned to the arterial. An action is defined as the selection of a cycle length from among the candidates. The state is represented as a feature vector that includes the cycle length and the features of each leg at every intersection. The reward is defined as departures, which indirectly minimizes system vehicle delay. Simulation results indicate that ECSC significantly outperforms the baseline methods in terms of system vehicle delay across nearly all demand scenarios and throughput in high-demand scenarios. ECSC revitalizes explicit signal coordination and introduces new perspectives on the application of reinforcement learning methods to signal coordination.
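As a rough illustration of the formulation summarized above, the sketch below encodes the state, action, and reward in Python. The candidate cycle lengths, the per-leg feature names, and the helper functions are assumptions made for illustration only; they are not the authors' implementation or simulator interface.

```python
# Minimal sketch of the MDP formulation described in the abstract.
# Candidate cycle lengths and per-leg features are illustrative assumptions.
from dataclasses import dataclass
from typing import List
import numpy as np

CANDIDATE_CYCLES = [60, 70, 80, 90, 100, 110, 120]  # seconds (assumed values)


@dataclass
class LegFeatures:
    queue_length: float   # vehicles queued on the leg (assumed feature)
    arrival_rate: float   # vehicles/s approaching the leg (assumed feature)


def build_state(current_cycle: float,
                legs_per_intersection: List[List[LegFeatures]]) -> np.ndarray:
    """Concatenate the current cycle length with the per-leg features of
    every intersection on the arterial into one feature vector."""
    feats = [current_cycle]
    for legs in legs_per_intersection:
        for leg in legs:
            feats.extend([leg.queue_length, leg.arrival_rate])
    return np.asarray(feats, dtype=np.float32)


def select_cycle(action_index: int) -> float:
    """The discrete action is an index into the candidate cycle lengths;
    the chosen cycle is applied arterial-wide for the next cycle."""
    return CANDIDATE_CYCLES[action_index]


def reward_from_departures(departures_last_cycle: int) -> float:
    """Reward = number of departures in the last cycle, which indirectly
    minimizes system vehicle delay (more vehicles served, less delay)."""
    return float(departures_last_cycle)
```

In this reading, a single arterial-level soft actor–critic agent would observe the feature vector once per cycle, pick one of the candidate cycle lengths, and receive the departure count as its reward.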
