Abstract
This paper develops a maintenance decision-support framework for rail components, focusing on grinding and renewal decisions, that yields an optimal decision map. A Markov decision process (MDP) approach is used to derive an optimal policy that minimizes total cost over an infinite horizon as a function of the rail's condition state. A practical example is explored, including the estimation of the Markov transition matrices (MTMs) and the corresponding cost/reward vectors. The MDP states are defined in terms of rail width, rail height, accumulated million gross tons (MGT), and damage occurrence. The optimal policy constitutes a condition-based maintenance plan intended to help railway infrastructure managers choose the best of three possible maintenance actions given the state of the rail. Overall, following the optimal policy requires railway infrastructure companies to keep tight control over their assets, in particular railway lines, so that the actual condition of the rails is monitored continuously.
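The abstract describes an infinite-horizon MDP with per-action transition matrices, a cost vector, and a three-action choice (e.g. do nothing, grind, renew) in each condition state. A minimal sketch of how such an optimal policy can be computed by value iteration is given below; the state space, transition probabilities, and costs here are invented placeholders, not the paper's estimated values:

```python
import numpy as np

# Hypothetical illustration only: four discretised rail-condition states
# (good -> worn -> damaged -> failed) and three actions. All numbers
# below are made up; the paper estimates its MTMs and costs from data.
ACTIONS = ["do nothing", "grind", "renew"]

# One Markov transition matrix per action (each row sums to 1).
P = np.array([
    # do nothing: the rail degrades over time
    [[0.7, 0.2, 0.1, 0.0],
     [0.0, 0.6, 0.3, 0.1],
     [0.0, 0.0, 0.5, 0.5],
     [0.0, 0.0, 0.0, 1.0]],
    # grind: partial restoration of the profile
    [[0.9, 0.1, 0.0, 0.0],
     [0.5, 0.4, 0.1, 0.0],
     [0.1, 0.5, 0.3, 0.1],
     [0.0, 0.0, 0.0, 1.0]],
    # renew: rail replaced, back to as-new condition
    [[1.0, 0.0, 0.0, 0.0],
     [1.0, 0.0, 0.0, 0.0],
     [1.0, 0.0, 0.0, 0.0],
     [1.0, 0.0, 0.0, 0.0]],
])

# Immediate cost of each (action, state) pair: failure is expensive,
# grinding is cheap, renewal costs a fixed large amount.
C = np.array([
    [0.0,  1.0,  5.0, 50.0],   # do nothing
    [2.0,  2.0,  4.0, 50.0],   # grind
    [20.0, 20.0, 20.0, 20.0],  # renew
])

def optimal_policy(P, C, gamma=0.95, tol=1e-8):
    """Value iteration minimising expected discounted cost."""
    V = np.zeros(P.shape[1])
    while True:
        Q = C + gamma * P @ V          # Q[a, s]: cost of action a in state s
        V_new = Q.min(axis=0)          # best achievable cost per state
        if np.max(np.abs(V_new - V)) < tol:
            return Q.argmin(axis=0), V_new
        V = V_new

policy, V = optimal_policy(P, C)
```

The resulting `policy` array is the "optimal decision map" of the abstract: for each condition state it names the action (do nothing, grind, or renew) that minimises expected discounted cost.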
Published in: ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering