Abstract

Information criteria for model choice are extended to the detection of outliers in regression models. For deletion of observations (hard trimming), the family of models is generated by monitoring properties of the fitted models as the trimming level is varied. For soft trimming (downweighting of observations), properties are monitored as the efficiency or breakdown point of the robust regression is varied. Least Trimmed Squares and the Forward Search are used to monitor hard trimming, with MM- and S-estimation the methods for soft trimming. Bayesian Information Criteria (BIC) are developed for both scenarios and results on their asymptotic properties are provided. In agreement with the theory, simulations and data analyses show good performance of the hard trimming methods for outlier detection. Importantly, this is achieved very simply, without the need to specify either significance levels or decision rules for multiple outliers.
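To make the hard-trimming idea concrete, the following is a minimal sketch of monitoring a BIC-type criterion as the trimming level varies. It uses a crude concentration-step approximation to Least Trimmed Squares and a naive penalty that treats each trimmed observation as an extra parameter; the data, the penalty form, and the grid of trimming levels are illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: simple linear model with 10% of responses shifted upward.
n, p = 100, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)
y[:10] += 8.0  # contaminate the first 10 observations

def trimmed_fit(X, y, h, n_iter=20):
    """Crude concentration-step approximation to Least Trimmed Squares:
    repeatedly fit OLS on the h observations with smallest squared residuals."""
    idx = np.arange(len(y))[:h]  # arbitrary starting subset
    for _ in range(n_iter):
        b, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        r2 = (y - X @ b) ** 2
        idx = np.argsort(r2)[:h]
    return b, r2[idx].sum()

# Monitor a naive BIC-like criterion over trimming levels: each deleted
# observation is charged as one extra parameter (an illustrative penalty only).
results = {}
for h in range(n // 2, n + 1, 5):
    _, rss = trimmed_fit(X, y, h)
    results[h] = h * np.log(rss / h) + (p + (n - h)) * np.log(n)

best_h = min(results, key=results.get)
print("trimming level with best BIC:", best_h)
```

With this contamination pattern, the criterion at a trimming level that excludes the shifted observations should beat the untrimmed fit, illustrating how model choice by monitoring can replace explicit significance levels for outlier tests.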
