Abstract

The generalized method of moments (GMM) estimator of the reduced-rank regression model is derived under the assumption of conditional homoscedasticity. It is shown that this GMM estimator is algebraically identical to the maximum likelihood estimator under normality developed by Johansen (1988). This includes the vector error correction model (VECM) of Engle and Granger (1987). It is also shown that GMM tests for reduced rank (cointegration) are algebraically similar to the Gaussian likelihood ratio tests. This shows that normality is not necessary to motivate these estimators and tests.

Highlights

  • The vector error correction model (VECM) of Engle and Granger (1987) is one of the most widely used time-series models in empirical practice

  • The predominant estimation method for the VECM is the reduced-rank regression method introduced by Johansen (1988, 1991, 1995)

  • Johansen motivated his estimator as the maximum likelihood estimator (MLE) of the VECM

Introduction

The vector error correction model (VECM) of Engle and Granger (1987) is one of the most widely used time-series models in empirical practice. Johansen's estimation method is widely used because it is straightforward, it is a natural extension of the VAR model of Sims (1980), and it is computationally tractable. Johansen motivated his estimator as the maximum likelihood estimator (MLE) of the VECM under the assumption that the errors are i.i.d. normal. It is shown that Johansen's reduced-rank estimator is algebraically identical to the generalized method of moments (GMM) estimator of the VECM when conditional homoscedasticity is imposed. This GMM estimator relies only on uncorrelatedness and homoscedasticity of the errors, not on normality. A sketch of the reduced-rank regression computation that both estimators share is given below.
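The following is a minimal sketch, not the paper's code, of the standard Johansen reduced-rank regression step that the paper shows also arises from the GMM problem. It assumes a VECM with a single lagged difference and no deterministic terms; the function name johansen_rrr and its interface are hypothetical and chosen only for illustration.

```python
# Minimal sketch of Johansen-style reduced-rank regression (illustrative only).
# Assumes: Delta Y_t = alpha beta' Y_{t-1} + Gamma_1 Delta Y_{t-1} + e_t,
# with no deterministic terms.
import numpy as np
from scipy.linalg import eigh

def johansen_rrr(Y, rank):
    """Y: (T, p) array of levels; rank: assumed cointegration rank r."""
    dY = np.diff(Y, axis=0)            # first differences
    Z0 = dY[1:]                        # dependent variable  Delta Y_t
    Z1 = Y[1:-1]                       # lagged level        Y_{t-1}
    Z2 = dY[:-1]                       # lagged difference   Delta Y_{t-1}

    # Partial out the short-run dynamics Z2 from Z0 and Z1 by least squares.
    def residuals(A, B):
        coef, *_ = np.linalg.lstsq(B, A, rcond=None)
        return A - B @ coef
    R0, R1 = residuals(Z0, Z2), residuals(Z1, Z2)

    # Product-moment matrices S_ij = (1/T) sum R_it R_jt'.
    T = R0.shape[0]
    S00 = R0.T @ R0 / T
    S01 = R0.T @ R1 / T
    S11 = R1.T @ R1 / T

    # Generalized eigenvalue problem  S10 S00^{-1} S01 v = lambda S11 v.
    A = S01.T @ np.linalg.solve(S00, S01)
    eigvals, eigvecs = eigh(A, S11)
    order = np.argsort(eigvals)[::-1]
    lam, V = eigvals[order], eigvecs[:, order]

    beta = V[:, :rank]                 # cointegrating vectors (beta' S11 beta = I)
    alpha = S01 @ beta                 # adjustment coefficients
    trace_stat = -T * np.sum(np.log(1 - lam[rank:]))  # trace test for rank <= r
    return alpha, beta, lam, trace_stat
```

The eigenvalues lam also drive the rank (cointegration) tests discussed in the abstract; under the paper's argument the same quantities emerge from the GMM criterion under conditional homoscedasticity, without invoking normality.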

Reduced-Rank Regression Models
Generalized Method of Moments
Derivation of the GMM Estimator
Extrema of Quadratic Forms