Life cycle assessment (LCA) studies are frequently used to evaluate the environmental burdens of pavement facilities. Decision-makers can use this information to inform their construction and maintenance policies. Within the pavement life cycle, there are a variety of uncertainties, such as future traffic growth and pavement deterioration. Currently, there is a lack of research examining LCA models that can simultaneously optimize construction and maintenance plans while accounting for several sources of uncertainty. This study presents an approach to LCA modeling that implements a sub-type of reinforcement learning (RL) algorithms called Q-learning. Q-learning offers a model-free approach that can efficiently manage stochastic problems of parametric and non-parametric form. The algorithm iteratively learns a set of near-optimal decision rules to proactively manage pavement assets across a diverse range of possible future scenarios. These decision rules are stored in a convenient look-up table, which will appeal to practitioners for its ease of use in probabilistic LCA studies. This paper subsequently tests the performance of the Q-learning approach across three representative case studies with varying traffic volumes: a local street, a state highway, and an interstate highway. The case study results show that, on average, the proposed algorithm reduces the expected global warming impact of pavement infrastructure by 13% to 18% over a 50-year analysis period. Based on these results, Q-learning is a promising approach that can help decision-makers account for several sources of uncertainty and implement improved management strategies to mitigate the environmental impacts of their products and systems.
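To illustrate the idea of tabular Q-learning producing a look-up table of maintenance decision rules, the following is a minimal, self-contained sketch. All states, actions, transition probabilities, and emissions values here are hypothetical placeholders for illustration only; they are not the paper's model, deterioration process, or impact data.

```python
import random

# Toy sketch: tabular Q-learning for pavement maintenance under uncertainty.
# States: discrete pavement condition levels (0 = failed ... 4 = excellent).
# Actions: 0 = do nothing, 1 = minor maintenance, 2 = reconstruct.
# Rewards: negative global-warming impact (illustrative numbers only).
N_STATES, N_ACTIONS = 5, 3
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1  # learning rate, discount, exploration

def step(state, action, rng):
    """Hypothetical stochastic transition and emissions cost."""
    if action == 2:
        # Reconstruct: large construction emissions, pavement becomes like-new.
        return N_STATES - 1, -10.0
    if action == 1:
        # Maintain: moderate emissions, condition improves one level.
        return min(state + 1, N_STATES - 1), -3.0
    # Do nothing: pavement may deteriorate; worse condition raises
    # use-phase emissions (e.g., from increased vehicle fuel consumption).
    nxt = max(state - 1, 0) if rng.random() < 0.5 else state
    return nxt, -1.0 * (N_STATES - 1 - nxt)

def train(episodes=5000, horizon=50, seed=0):
    rng = random.Random(seed)
    Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
    for _ in range(episodes):
        s = N_STATES - 1                      # start from newly built pavement
        for _ in range(horizon):              # e.g., a 50-year analysis period
            # Epsilon-greedy action selection.
            if rng.random() < EPSILON:
                a = rng.randrange(N_ACTIONS)
            else:
                a = max(range(N_ACTIONS), key=lambda x: Q[s][x])
            s2, r = step(s, a, rng)
            # Standard Q-learning update rule.
            Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
            s = s2
    return Q  # the look-up table of learned decision rules

Q = train()
# Extract the greedy policy: best action for each condition state.
policy = [max(range(N_ACTIONS), key=lambda a: Q[s][a]) for s in range(N_STATES)]
```

Because the learned policy is just a table mapping condition states to actions, a practitioner can apply it directly without re-running the learning algorithm, which is the ease-of-use property the abstract highlights.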