Congestion pricing has become an effective instrument for traffic demand management on road networks. This paper proposes an optimal control approach to congestion pricing on a day-to-day timescale that incorporates demand uncertainty and elasticity. Travelers decide whether to travel based on the system travel time experienced the previous day, while traffic managers set tolls to minimize the average system travel time over a long horizon. We formulate the problem as a Markov decision process (MDP) and examine whether it satisfies the conditions required for a satisfactory solution analysis. Such an analysis of MDPs often depends on the nature of the state space and on the boundedness of the travel time functions. We do not require the travel time functions to be bounded; instead, we present an analysis based on weighted sup-norm contractions that also holds for unbounded travel time functions. We show that the formulated MDP satisfies a set of assumptions that ensure Bellman's optimality condition, from which the existence of the optimal average cost of the MDP follows. To address the implementation and computational challenges of solving the control problem, we propose a method based on approximate dynamic programming. Numerical results suggest that the proposed method solves the problem efficiently and produces accurate solutions.
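The day-to-day tolling MDP described above can be illustrated with a minimal sketch. This is not the paper's model: the BPR-style travel time function, the exponential demand curve, the discretized state space, and all numerical parameters below are hypothetical stand-ins, and discounted value iteration is used as a simple proxy for the average-cost criterion analyzed in the paper.

```python
import numpy as np

# Toy day-to-day congestion pricing MDP (illustrative; all parameters hypothetical).
t0, cap = 10.0, 50.0                     # free-flow time and capacity (BPR-style)
states = np.arange(0, 101)               # discretized daily demand levels
tolls = np.array([0.0, 2.0, 4.0, 6.0])   # candidate tolls (actions)
vot = 1.0                                # value of time: converts toll to time units
gamma = 0.95                             # discount factor (proxy for average cost)

def travel_time(d):
    # BPR-style volume-delay function: congestion grows with demand.
    return t0 * (1.0 + 0.15 * (d / cap) ** 4)

def next_demand(d, toll):
    # Elastic response: tomorrow's demand shrinks with yesterday's
    # generalized cost (travel time plus toll), with partial adjustment.
    cost = travel_time(d) + toll / vot
    target = 100.0 * np.exp(-cost / 40.0)    # hypothetical demand curve
    nd = int(round(0.7 * d + 0.3 * target))  # day-to-day adjustment
    return min(max(nd, 0), 100)

# Value iteration over the discretized state space: the manager picks the
# toll that minimizes today's system travel time plus the discounted value
# of tomorrow's demand state.
V = np.zeros(len(states))
for _ in range(500):
    Q = np.empty((len(states), len(tolls)))
    for i, d in enumerate(states):
        for a, p in enumerate(tolls):
            Q[i, a] = travel_time(d) + gamma * V[next_demand(d, p)]
    V = Q.min(axis=1)

policy = tolls[np.argmin(Q, axis=1)]  # optimal toll for each demand state
```

In the full problem the state space can be continuous and the travel time functions unbounded, which is where the weighted sup-norm contraction analysis and the approximate dynamic programming method come in; this tabular sketch only conveys the structure of the control loop.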