Abstract
Many algorithms that provide approximate solutions for dynamic stochastic general equilibrium (DSGE) models employ the QZ factorization because it allows a flexible formulation of the model and exempts the researcher from identifying the equations that give rise to infinite eigenvalues. We show, by means of an example, that the policy functions obtained by this approach may differ from both the solution of a properly reduced system and the solution obtained by solving the system of nonlinear equations that arises from applying the implicit function theorem to the model's equilibrium conditions. As a consequence, simulation results may depend on the specific algorithm used and on the numerical values of parameters that are theoretically irrelevant. The source of this inaccuracy is ill-conditioned matrices, such as those that emerge in models with strong habits. Researchers should be aware of these effects, and we propose several ways to handle them.
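For concreteness, the sketch below illustrates the QZ-based solution approach in the spirit of Klein (2000) for a linearized model A E_t[x_{t+1}] = B x_t whose first n_k variables are predetermined. The SciPy-based function, its name, and the conditioning check are illustrative assumptions made here, not the paper's implementation.

```python
import warnings
import numpy as np
from scipy.linalg import ordqz

def solve_qz(A, B, n_k):
    """First-order policy functions via the QZ factorization (Klein 2000 style).

    Model: A E_t[x_{t+1}] = B x_t, with the first n_k entries of x_t
    predetermined (k_t) and the remaining entries non-predetermined (u_t).
    Returns F, P such that u_t = F k_t and k_{t+1} = P k_t.
    """
    # Generalized Schur form of the pencil (A, B).  SciPy's 'ouc' option puts
    # eigenvalues alpha/beta = S_ii/T_ii outside the unit circle first, i.e. the
    # roots T_ii/S_ii of the dynamic system that are stable (modulus below one);
    # infinite roots (S_ii = 0) are thereby sorted last.
    S, T, alpha, beta, Q, Z = ordqz(A, B, sort='ouc', output='complex')

    # Blanchard-Kahn-type check: stable roots must match predetermined variables.
    n_stable = int(np.sum(np.abs(beta) < np.abs(alpha)))
    if n_stable != n_k:
        raise ValueError(f"{n_stable} stable roots, but {n_k} predetermined variables")

    Z11, Z21 = Z[:n_k, :n_k], Z[n_k:, :n_k]
    S11, T11 = S[:n_k, :n_k], T[:n_k, :n_k]

    # An ill-conditioned Z11 is precisely where the QZ-based policy functions
    # can lose accuracy, e.g. in models with strong habits.
    if np.linalg.cond(Z11) > 1e12:
        warnings.warn("Z11 is ill-conditioned; F and P may be inaccurate")

    Z11_inv = np.linalg.inv(Z11)
    F = np.real(Z21 @ Z11_inv)                                # u_t = F k_t
    P = np.real(Z11 @ np.linalg.solve(S11, T11) @ Z11_inv)    # k_{t+1} = P k_t
    return F, P
```

The condition-number check mirrors the point of the abstract: when Z11 (or the underlying pencil) is nearly singular, the computed F and P can differ from the solution of a properly reduced system, and the discrepancy can depend on theoretically irrelevant parameter values.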