Abstract

This paper examines the lag selection problem in unit root tests, which has become a major specification issue in the empirical analysis of non-stationary time series data. Implementing a unit root test requires choosing an optimal truncation lag to obtain good power properties, yet it is unrealistic to assume that the true optimal truncation lag is known a priori to practitioners and other applied researchers. Consequently, these users rely largely on standard information criteria to select the truncation lag in unit root tests. A number of previous studies have shown that these criteria tend to over-specify the truncation lag length, leading to the well-known low-power problem commonly associated with most unit root tests in the literature. This paper focuses on the problem of over-specification of the truncation lag length within the context of the augmented Dickey-Fuller (ADF) and generalized least squares Dickey-Fuller (DF-GLS) unit root tests. To address this lag selection problem, we propose a new criterion for selecting the truncation lag in unit root tests based on the Koyck distributed lag model, and we show that this new criterion avoids the over-specification of the truncation lag length commonly associated with standard information criteria.
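To make the lag selection problem concrete, the sketch below (a minimal illustration, not the paper's proposed Koyck-based criterion) runs an ADF test on a simulated near-unit-root series and lets a standard information criterion, AIC, choose the truncation lag via statsmodels' `adfuller`. The simulated AR(1) coefficient, sample size, and `maxlag` bound are illustrative assumptions.

```python
# Minimal sketch: lag selection in an ADF test via a standard information
# criterion (AIC), using statsmodels. The data-generating process here is
# an assumed near-unit-root AR(1); it is not taken from the paper.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)

# Simulate y_t = 0.98 * y_{t-1} + e_t (close to, but below, a unit root).
n = 250
e = rng.standard_normal(n)
y = np.empty(n)
y[0] = e[0]
for t in range(1, n):
    y[t] = 0.98 * y[t - 1] + e[t]

# Let AIC pick the truncation lag from 0..12. AIC tends to select longer
# lags than needed, which the paper links to the low-power problem.
stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, maxlag=12, autolag="AIC")
print(f"ADF stat = {stat:.3f}, p-value = {pvalue:.3f}, lag chosen by AIC = {usedlag}")
```

Re-running with `autolag="BIC"` typically selects a shorter lag, illustrating how the choice of criterion directly affects the truncation lag and hence the power of the test.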
