Abstract

The energy efficiency of port container terminal equipment and the reduction of CO2 emissions are among the biggest challenges facing every seaport in the world. The article models the container transportation process in a terminal, from the quay crane to the stack, using a battery-powered Automated Guided Vehicle (AGV) in order to estimate its energy consumption parameters. An AGV speed control algorithm based on Deep Reinforcement Learning (DRL) is proposed to optimize the energy consumption of container transportation. Compared with measurements of real transportation, the results show that the proposed DRL-based approach, which dynamically changes the driving speed of the AGV, reduces energy consumption by 4.6%. These results provide the prerequisites for further research aimed at finding optimal strategies for autonomous vehicle movement, including context awareness and information sharing with other vehicles in the terminal.
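
The abstract names a DRL-based speed controller but gives no implementation details. The sketch below is only an illustration of the underlying idea, selecting an AGV speed level to trade energy use against delivery time, and uses tabular Q-learning as a simplified stand-in for the paper's DRL agent. All quantities (distance, speed levels, energy model, penalties) are hypothetical and are not taken from the article.

```python
import numpy as np

# Illustrative sketch (not the paper's model): an AGV must cover a fixed
# distance from quay crane to stack; at each step the agent picks a
# discrete speed level. Energy cost grows with speed; arriving late is
# penalised. Tabular Q-learning stands in for the DRL agent.

DISTANCE = 100.0            # metres to travel (hypothetical)
SPEEDS = [1.0, 2.0, 3.0]    # selectable speeds, m/s (hypothetical)
DT = 5.0                    # seconds per decision step
DEADLINE = 12               # steps allowed before a lateness penalty
ENERGY_COEF = 0.05          # energy ~ coef * v^2 per step (toy model)

N_BINS = 20                 # discretise remaining distance into bins
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1
Q = np.zeros((N_BINS + 1, len(SPEEDS)))

def state(remaining):
    """Map remaining distance to a discrete state index."""
    return min(N_BINS, int(remaining / DISTANCE * N_BINS))

def step(remaining, t, action):
    """Advance the AGV one decision step and return (remaining, reward, done)."""
    v = SPEEDS[action]
    remaining = max(0.0, remaining - v * DT)
    reward = -ENERGY_COEF * v ** 2          # penalise energy spent this step
    done = remaining == 0.0
    if done and t + 1 > DEADLINE:
        reward -= 5.0                       # lateness penalty at arrival
    if not done and t + 1 >= 2 * DEADLINE:
        reward -= 10.0                      # hard timeout ends the episode
        done = True
    return remaining, reward, done

rng = np.random.default_rng(0)
for episode in range(5000):
    remaining, t, done = DISTANCE, 0, False
    while not done:
        s = state(remaining)
        # epsilon-greedy choice of speed level
        a = rng.integers(len(SPEEDS)) if rng.random() < EPS else int(Q[s].argmax())
        remaining, r, done = step(remaining, t, a)
        s2 = state(remaining)
        Q[s, a] += ALPHA * (r + GAMMA * Q[s2].max() * (not done) - Q[s, a])
        t += 1

print("Learned speed per distance bin:", [SPEEDS[int(a)] for a in Q.argmax(axis=1)])
```

With these toy parameters the agent learns to avoid both the slowest speed (which misses the deadline) and the fastest one (which wastes energy), illustrating the trade-off that the paper's DRL controller is reported to optimise.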
