Abstract
The point of change in mean in a sequence of normal random variables can be estimated from a cumulative sum test scheme. The asymptotic distributions of this estimate and of associated test statistics are derived, and numerical results are given. The relation to likelihood inference is emphasized. Asymptotic results are compared with empirical sequential results, and some practical implications are discussed.

The cumulative sum scheme for detecting distributional change in a sequence of random variables is a well-known technique in quality control, dating from the paper of Page (1954) to the recent expository account by van Dobben de Bruyn (1968). Throughout the literature on cumulative sum schemes the emphasis is placed on tests of departure from initial conditions. The purpose of this paper is to examine a secondary aspect: estimation of the index τ in a sequence {x_t} at which the departure from initial conditions has taken place. The work is closely related to an earlier paper by Hinkley (1970), in which maximum likelihood estimation and inference were discussed.

We consider specifically sequences of normal random variables x_1, ..., x_T, say, where initially the mean θ₀ and the variance σ² are known. A cumulative sum (cusum) scheme is used to detect possible change in mean from θ₀; for simplicity, suppose that it is a one-sided scheme for detecting a decrease in mean. Then the procedure is to compute the cumulative sums

S_t = \sum_{i=1}^{t} (x_i − θ₀),  t = 1, 2, ....
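As a rough illustration of this procedure, the following Python sketch computes the cusums S_t for known initial mean θ₀ and takes the index at which S_t attains its maximum as the change-point estimate for a downward shift, the usual cusum-based estimate for this one-sided case. The function name and the simulated data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def cusum_change_point(x, theta0):
    """Cusum-based change-point estimate for a decrease in mean.

    Computes S_t = sum_{i<=t} (x_i - theta0); after a downward shift the
    cusum drifts downward, so the change point is estimated by the index
    at which S_t is largest.
    """
    s = np.cumsum(np.asarray(x, dtype=float) - theta0)  # S_1, ..., S_T
    tau_hat = int(np.argmax(s)) + 1                     # 1-based index of max S_t
    return s, tau_hat

# Hypothetical example: mean drops from theta0 = 0 to -1 after t = 50.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(-1.0, 1.0, 50)])
s, tau_hat = cusum_change_point(x, theta0=0.0)
print(tau_hat)  # estimate of the change-point index tau
```

Repeated simulations of this kind give the empirical sequential results against which the paper's asymptotic distributions can be compared.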