Abstract

We formulate and evaluate the Automated Simulation Analysis Procedure (ASAP), an algorithm for steady-state simulation output analysis based on the method of nonoverlapping batch means (NOBM). ASAP delivers a confidence interval for an expected response that is centered on the sample mean of a portion of a simulation-generated time series and satisfies a user-specified absolute or relative precision requirement. ASAP operates as follows: The batch size is progressively increased until either (a) the batch means pass the von Neumann test for independence, and then ASAP delivers a classical NOBM confidence interval; or (b) the batch means pass the Shapiro-Wilk test for multivariate normality, and then ASAP delivers a correlation-adjusted confidence interval. The latter adjustment is based on an inverted Cornish-Fisher expansion for the classical NOBM t-ratio, where the terms of the expansion are estimated via an autoregressive-moving average time series model of the batch means. After determining the batch size and confidence-interval type, ASAP sequentially increases the number of batches until the precision requirement is satisfied. An extensive experimental study demonstrates the performance improvements achieved by ASAP versus well-known batch means procedures, especially in confidence-interval coverage probability.
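To make the NOBM machinery underlying ASAP concrete, the following is a minimal Python sketch of three of its ingredients: computing nonoverlapping batch means, a von Neumann-style test for independence of those batch means, and a classical NOBM confidence interval. This is an illustration, not the published procedure: ASAP's Shapiro-Wilk normality test, ARMA-based Cornish-Fisher adjustment, and sequential batch-count rules are omitted, and a standard-normal quantile stands in for the Student's t quantile for brevity.

```python
import math
from statistics import NormalDist, mean

def batch_means(series, batch_size):
    """Nonoverlapping batch means; any partial final batch is discarded."""
    k = len(series) // batch_size
    return [mean(series[j * batch_size:(j + 1) * batch_size]) for j in range(k)]

def passes_von_neumann(y, alpha=0.05):
    """One-sided von Neumann test for independence of the batch means y.

    The von Neumann ratio (mean squared successive difference over the
    sample variance) is near 2 for independent data; smaller values
    signal the positive lag-1 correlation typical of batches that are
    too short.  The standardization below is a common approximation.
    """
    k = len(y)
    ybar = mean(y)
    msd = sum((y[j + 1] - y[j]) ** 2 for j in range(k - 1))
    ssq = sum((v - ybar) ** 2 for v in y)
    corr = 1.0 - (msd / ssq) / 2.0          # estimated lag-1 correlation
    z = corr / math.sqrt((k - 2) / (k ** 2 - 1))
    return z <= NormalDist().inv_cdf(1.0 - alpha)

def nobm_interval(series, batch_size, conf=0.90):
    """Classical NOBM confidence interval centered on the grand batch mean.

    A standard-normal quantile stands in for Student's t here; a real
    implementation would use the t quantile with k - 1 degrees of freedom.
    """
    y = batch_means(series, batch_size)
    k = len(y)
    ybar = mean(y)
    s = math.sqrt(sum((v - ybar) ** 2 for v in y) / (k - 1))
    h = NormalDist().inv_cdf((1.0 + conf) / 2.0) * s / math.sqrt(k)
    return ybar - h, ybar + h
```

In ASAP's terms, a series whose batch means fail `passes_von_neumann` would trigger a larger batch size (or the normality-based branch), while a series that passes gets the classical interval from `nobm_interval`.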
