Abstract
This paper contributes to modelling maintenance decision support for rail components, namely grinding and renewal decisions, by developing a framework that provides an optimal decision map. A Markov decision process (MDP) approach is followed to derive an optimal policy that minimizes the total cost over an infinite horizon, depending on the condition state of the rail. A practical example is explored with the estimation of the Markov transition matrices (MTMs) and the corresponding cost/reward vectors. The MDP states are defined in terms of rail width, height, accumulated million gross tons (MGT) and damage occurrence. The optimal policy represents a condition-based maintenance plan that supports railway infrastructure managers in choosing the best maintenance decision among a set of three possible actions, depending on the state of the rail. Overall, the optimal policy requires railway infrastructure companies to keep tight control over their assets, in particular railway lines, in order to constantly monitor the actual condition of the rails.
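The decision framework described above can be sketched with value iteration on a small, entirely hypothetical rail-condition MDP. The state space, transition matrices, costs and discount factor below are illustrative placeholders only (the paper's actual states combine rail width, height, accumulated MGT and damage, and its MTMs are estimated from data); the three actions mirror the set mentioned in the abstract.

```python
import numpy as np

# Hypothetical 4-state degradation chain (good -> worn -> degraded -> failed).
# All numbers are invented for illustration, not the paper's estimates.
n_states = 4
actions = ["do nothing", "grind", "renew"]

# Transition matrices P[a][s, s']: one row-stochastic matrix per action.
P = {
    "do nothing": np.array([[0.70, 0.20, 0.08, 0.02],
                            [0.00, 0.60, 0.30, 0.10],
                            [0.00, 0.00, 0.70, 0.30],
                            [0.00, 0.00, 0.00, 1.00]]),
    "grind":      np.array([[0.90, 0.08, 0.02, 0.00],
                            [0.50, 0.40, 0.08, 0.02],
                            [0.00, 0.30, 0.60, 0.10],
                            [0.00, 0.00, 0.00, 1.00]]),
    # Renewal resets the rail to the best condition state.
    "renew":      np.array([[1.0, 0.0, 0.0, 0.0]] * 4),
}
# Immediate cost per (action, state): maintenance cost plus a penalty
# that grows with degradation (largest for a failed rail).
C = {
    "do nothing": np.array([0.0, 1.0, 5.0, 50.0]),
    "grind":      np.array([2.0, 2.5, 6.0, 52.0]),
    "renew":      np.array([20.0, 20.0, 20.0, 20.0]),
}

gamma = 0.95  # discount factor for the infinite horizon

# Value iteration: minimize expected discounted total cost.
V = np.zeros(n_states)
for _ in range(1000):
    Q = np.array([C[a] + gamma * P[a] @ V for a in actions])  # (3, 4)
    V_new = Q.min(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

# The optimal policy maps each condition state to one of the three actions.
policy = [actions[i] for i in Q.argmin(axis=0)]
print(policy)
```

With these placeholder numbers the resulting policy is condition-based in the sense the abstract describes: cheap or no intervention while the rail is in good condition, and renewal once it has failed.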