Abstract

Markov chains with small transition probabilities arise when modeling the reliability of systems whose individual components are highly reliable and quickly repairable. Complex inter-component dependencies can exist and the state space can be huge, making these models analytically and numerically intractable. Naive simulation is also difficult because the event of interest (system failure) is rare, so a prohibitively large amount of computation is needed to obtain samples of these events. An earlier paper (Juneja et al., 2001) proposed an importance sampling scheme that provides large efficiency increases over naive simulation for a very general class of models, including reliability models with general repair policies such as deferred and group repairs. However, that scheme incurs a statistical penalty when the corresponding Markov chain has high-probability cycles, as may be the case in reliability models with general repair policies. This paper develops a splitting-based importance-sampling technique that avoids this penalty by splitting paths at high-probability cycles and thus achieves bounded relative error in a stronger sense than previous attempts.
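For readers unfamiliar with the underlying idea, the following minimal sketch illustrates likelihood-ratio importance sampling for a rare failure event in a toy highly reliable Markovian system. It is only a generic illustration under assumed parameters (EPS, BIAS, N_COMPONENTS and the functions one_cycle_estimate and estimate are all hypothetical), not the splitting scheme developed in the paper.

# Minimal sketch (not the paper's scheme): likelihood-ratio importance
# sampling with "failure biasing" for a toy highly reliable system.
# All component counts, probabilities, and names are illustrative assumptions.
import random

EPS = 1e-3        # per-step component failure probability (rare)
N_COMPONENTS = 3  # system fails when all 3 components are down
BIAS = 0.5        # inflated failure probability under the sampling measure

def failure_prob(n_down, p):
    # From the all-up state the next event is necessarily a failure;
    # otherwise a failure occurs with probability p, a repair otherwise.
    return 1.0 if n_down == 0 else p

def one_cycle_estimate(biased):
    # Simulate one regenerative cycle (leave the all-up state, then either
    # return to it or hit full failure) and return the likelihood-ratio-
    # weighted indicator of system failure.
    n_down, weight = 0, 1.0
    while True:
        p_fail = failure_prob(n_down, EPS)                  # original measure
        q_fail = failure_prob(n_down, BIAS if biased else EPS)  # sampling measure
        if random.random() < q_fail:
            weight *= p_fail / q_fail                       # ratio for a failure step
            n_down += 1
        else:
            weight *= (1.0 - p_fail) / (1.0 - q_fail)       # ratio for a repair step
            n_down -= 1
        if n_down == N_COMPONENTS:
            return weight        # system failure reached in this cycle
        if n_down == 0:
            return 0.0           # returned to all-up state: no failure this cycle

def estimate(n_paths, biased):
    return sum(one_cycle_estimate(biased) for _ in range(n_paths)) / n_paths

if __name__ == "__main__":
    random.seed(0)
    print("naive     :", estimate(100_000, biased=False))
    print("importance:", estimate(100_000, biased=True))

Under the biased measure failures are made far more likely, and each sample path is reweighted by the ratio of its original to biased probability, so the estimator stays unbiased while failure paths are observed much more often. The paper's contribution, by contrast, is to combine such reweighting with splitting of paths at high-probability cycles so that the relative error remains bounded.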
