Abstract

In this paper, we make five contributions to the literature on information and entropy in generalized method of moments (GMM) estimation. First, we introduce the concept of the long run canonical correlations (LRCCs) between the true score vector and the moment function f(v_t, θ_0) and show that they provide a metric for the information contained in the population moment condition E[f(v_t, θ_0)] = 0. Second, we show that the entropy of the limiting distribution of the GMM estimator can be written in terms of these LRCCs. Third, motivated by the above results, we introduce an information criterion based on this entropy that can be used as a basis for moment selection. Fourth, we introduce the concept of nearly redundant moment conditions and use it to explore the connection between redundancy and weak identification. Fifth, we analyse the behaviour of the aforementioned entropy-based moment selection method in two scenarios of interest: (i) nonlinear dynamic models where the parameter vector is identified by all the combinations of moment conditions considered; and (ii) linear static models where the parameter vector may be weakly identified for some of the combinations considered. The first of these contributions rests on a generalized information equality that is proved in the paper and may be of interest in its own right.
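To make the population moment condition E[f(v_t, θ_0)] = 0 and the LRCC idea concrete, the following is a minimal sketch, not the paper's own construction, of how canonical correlations between an estimated score vector and a set of moment function values could be computed from long-run (HAC-type) covariance estimates. The function names long_run_cov and long_run_canonical_correlations, the Bartlett-kernel bandwidth, and the assumption that the user supplies T x p and T x q arrays of score and moment evaluations are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.linalg import sqrtm

def long_run_cov(X, Y, lags=4):
    # Bartlett-kernel (Newey-West style) long-run covariance estimate
    # between the columns of X (T x p) and Y (T x q); an illustrative
    # assumption, not the estimator prescribed by the paper.
    T = X.shape[0]
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    S = Xc.T @ Yc / T  # lag-0 cross covariance
    for j in range(1, lags + 1):
        w = 1.0 - j / (lags + 1.0)  # Bartlett weight
        S += w * (Xc[j:].T @ Yc[:-j] + Xc[:-j].T @ Yc[j:]) / T
    return S

def long_run_canonical_correlations(score, moments, lags=4):
    # Canonical correlations computed from long-run covariance blocks of
    # the estimated score contributions (T x p) and the moment function
    # values f(v_t, theta_hat) (T x q).
    S_ss = long_run_cov(score, score, lags)
    S_ff = long_run_cov(moments, moments, lags)
    S_sf = long_run_cov(score, moments, lags)
    W = np.linalg.inv(sqrtm(S_ss)) @ S_sf @ np.linalg.inv(sqrtm(S_ff))
    # Singular values of the "whitened" cross-covariance are the
    # canonical correlations (clipped to [0, 1] for numerical safety).
    return np.clip(np.linalg.svd(np.real(W), compute_uv=False), 0.0, 1.0)
```

Intuitively, in this sketch larger canonical correlations indicate moment conditions whose long-run variation is more closely aligned with the score, and hence more informative about θ_0; the paper's formal definitions and its entropy-based criterion should be consulted for the exact construction.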
