Abstract

Statistical model checking avoids the exponential growth of states that afflicts numerical model checking, but rare properties are costly to verify. Importance sampling can reduce the cost if good importance sampling distributions can be found efficiently. Our approach uses a tractable cross-entropy minimisation algorithm to find an optimal parametrised importance sampling distribution. In contrast to previous work, our algorithm uses a naturally defined low dimensional vector to specify the distribution, thus avoiding an explicit representation of a transition matrix. Our parametrisation leads to a unique optimum and is shown to produce improvements in efficiency of many orders of magnitude on various models. In this work we link the existence of optimal importance sampling distributions to logical properties and show how our parametrisation affects this link. We also motivate and present simple algorithms to create the initial distribution necessary for cross-entropy minimisation. Finally, we discuss the open challenge of defining error bounds with importance sampling and describe how our optimal parametrised distributions may be used to infer qualitative confidence.
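To illustrate the general idea of cross-entropy minimisation for rare-event importance sampling referred to in the abstract, the following is a minimal, hypothetical sketch and not the authors' algorithm or parametrisation. It assumes a toy model (the probability that a sum of n Bernoulli(p) trials reaches a threshold gamma) and a one-dimensional importance sampling family given by a single tilted success probability q, standing in for the paper's low dimensional parameter vector; all names and values are illustrative.

```python
# Hypothetical sketch: multi-level cross-entropy (CE) adaptation of a single
# importance sampling parameter q, followed by an importance sampling estimate
# of a rare-event probability. Not the parametrisation described in the paper.
import numpy as np

rng = np.random.default_rng(0)

n, p, gamma = 100, 0.1, 40      # model: 100 Bernoulli(0.1) trials; rare event: >= 40 successes
rho, batch = 0.1, 10_000        # CE quantile level and samples per iteration

def likelihood_ratio(x, q):
    """dP/dQ for Bernoulli outcome vectors x sampled under tilted parameter q."""
    s = x.sum(axis=1)
    return (p / q) ** s * ((1 - p) / (1 - q)) ** (n - s)

# --- CE iterations: raise the intermediate level until gamma is reached ---
q = p
while True:
    x = rng.random((batch, n)) < q                  # sample under current IS parameter q
    s = x.sum(axis=1)
    level = min(gamma, np.quantile(s, 1 - rho))     # intermediate rare level
    hit = s >= level
    w = likelihood_ratio(x, q)[hit]
    # CE update = weighted maximum likelihood for the tilted Bernoulli family
    q = float((w * s[hit]).sum() / (w.sum() * n))
    if level >= gamma:
        break

# --- Final importance sampling estimate of the rare-event probability ---
x = rng.random((batch, n)) < q
s = x.sum(axis=1)
estimate = (likelihood_ratio(x, q) * (s >= gamma)).mean()
print(f"tilted q = {q:.3f}, IS estimate = {estimate:.3e}")
```

Because the family has a single parameter, each CE update has a closed form and a unique optimum; the paper's contribution concerns an analogous low dimensional parametrisation for the transition structure of the models it checks.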
