Abstract

Real-time maintenance decision making in large manufacturing systems is complex because it requires integrating diverse information, including the degradation states of machines and the inventory levels in intermediate buffers. In this paper, using a discrete-time Markov chain (DTMC) model, we study real-time maintenance policies for manufacturing systems consisting of multiple machines and intermediate buffers. The optimal policy is derived via a Markov decision process (MDP) approach and compared with a baseline policy in which the maintenance decision for a machine depends only on its own degradation state. The results show how the structure of the policies is affected by buffer capacities and real-time buffer levels.
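To illustrate the kind of model the abstract describes, the following is a minimal value-iteration sketch for a single-machine, single-buffer maintenance MDP. All names and parameters here (degradation levels `D`, buffer capacity `B`, the degradation/demand probabilities, and the cost and reward values) are hypothetical assumptions chosen only to show the structure of a DTMC-based maintenance MDP; they are not taken from the paper, whose model covers multiple machines and buffers.

```python
# Illustrative sketch: maintenance MDP for one machine feeding one buffer.
# State = (degradation level d, buffer level b); actions = produce / maintain.
# Parameters below are hypothetical, not from the paper.
import itertools

D, B = 3, 4          # max degradation level (D = failed), buffer capacity
P_DEG = 0.3          # prob. of degrading one level while producing
P_DEMAND = 0.5       # prob. downstream consumes one unit per period
C_MAINT = 2.0        # one-period maintenance cost
R_UNIT = 1.0         # reward per unit delivered downstream
GAMMA = 0.95         # discount factor

states = list(itertools.product(range(D + 1), range(B + 1)))
actions = ("produce", "maintain")

def transitions(state, action):
    """Return [(prob, next_state, reward), ...] for a state-action pair."""
    d, b = state
    if action == "maintain" or d == D:   # a failed machine must be repaired
        # Maintenance restores the machine; the buffer may still be drained.
        return [(p, (0, b - min(dem, b)), R_UNIT * min(dem, b) - C_MAINT)
                for dem, p in ((1, P_DEMAND), (0, 1 - P_DEMAND))]
    # Produce: one unit enters the buffer (if space); machine may degrade.
    out = []
    for deg, pd in ((1, P_DEG), (0, 1 - P_DEG)):
        for dem, pm in ((1, P_DEMAND), (0, 1 - P_DEMAND)):
            nb = min(b + 1, B)
            served = min(dem, nb)
            out.append((pd * pm, (min(d + deg, D), nb - served),
                        R_UNIT * served))
    return out

def value_iteration(tol=1e-8):
    """Gauss-Seidel value iteration; returns the value function and policy."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(sum(p * (r + GAMMA * V[ns])
                           for p, ns, r in transitions(s, a))
                       for a in actions)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            break
    policy = {s: max(actions, key=lambda a: sum(
        p * (r + GAMMA * V[ns]) for p, ns, r in transitions(s, a)))
        for s in states}
    return V, policy
```

Because the policy is computed over the joint (degradation, buffer) state, it can prescribe different actions at the same degradation level depending on the real-time buffer content, which is exactly the structural effect the paper contrasts against a degradation-only baseline.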
