Abstract

In this paper, I show how gradient-based optimization methods can be used to estimate stochastic dynamic models in economics. By extending the state space to include all model parameters, the model needs to be solved only once for structural estimation. Parameters are then estimated by minimizing the distance between key empirical moments and their model-implied counterparts. Unlike the Simulated Method of Moments, the model-implied moments are obtained without explicitly computing any moments from simulated data: a neural network learns the mapping from parameters to moments directly from raw simulated observations. Once the network has learned this (differentiable) mapping, a Newton-Raphson routine coupled with simulated annealing finds the set of parameters that globally minimizes the objective function. I illustrate the algorithm by solving and estimating a benchmark macroeconomic model with stochastic volatility, endogenous labor supply, and irreversible investment.
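
The sketch below is not the paper's implementation; it is a minimal illustration of the estimation idea under strong simplifying assumptions: a toy simulator (`simulate_obs`), a small network regressed on raw simulated observations so that its fitted values approximate the model-implied moments, and a Newton-Raphson step nested inside a crude annealing loop over random starts. All names, dimensions, and the "empirical" moments are hypothetical placeholders.

```python
# Hypothetical sketch of the surrogate-based moment-matching idea (not the paper's code).
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
theta_dim, obs_dim, hidden = 2, 3, 32

def simulate_obs(theta, key):
    # Toy simulator: raw observations whose conditional mean is the moment vector.
    noise = jax.random.normal(key, (obs_dim,))
    return jnp.array([theta[0] + theta[1], theta[0] * theta[1], theta[1] ** 2]) + 0.1 * noise

def init_params(key):
    k1, k2 = jax.random.split(key)
    return {"W1": 0.1 * jax.random.normal(k1, (theta_dim, hidden)),
            "b1": jnp.zeros(hidden),
            "W2": 0.1 * jax.random.normal(k2, (hidden, obs_dim)),
            "b2": jnp.zeros(obs_dim)}

def net(params, theta):
    h = jnp.tanh(theta @ params["W1"] + params["b1"])
    return h @ params["W2"] + params["b2"]        # approximates E[obs | theta]

def train_loss(params, thetas, obs):
    preds = jax.vmap(net, in_axes=(None, 0))(params, thetas)
    return jnp.mean((preds - obs) ** 2)           # regression on raw observations

# Train the surrogate on (parameter, raw observation) pairs.
key, k_theta, k_obs, k_init = jax.random.split(key, 4)
thetas = jax.random.uniform(k_theta, (2000, theta_dim), minval=0.5, maxval=1.5)
obs = jax.vmap(simulate_obs)(thetas, jax.random.split(k_obs, 2000))
params = init_params(k_init)
grad_fn = jax.jit(jax.grad(train_loss))
for _ in range(2000):
    params = jax.tree_util.tree_map(lambda p, g: p - 0.05 * g,
                                    params, grad_fn(params, thetas, obs))

# Estimation: Newton-Raphson steps inside a simulated-annealing-style restart loop.
emp_moments = jnp.array([2.0, 1.0, 1.0])          # placeholder "empirical" moments

def objective(theta):
    diff = net(params, theta) - emp_moments
    return diff @ diff

grad_obj, hess_obj = jax.grad(objective), jax.hessian(objective)

def newton(theta, steps=20):
    for _ in range(steps):
        step = jnp.linalg.solve(hess_obj(theta) + 1e-6 * jnp.eye(theta_dim), grad_obj(theta))
        theta = theta - step
    return theta

best_theta, best_val, temp = jnp.ones(theta_dim), jnp.inf, 1.0
for _ in range(20):                               # cooled random restarts around the incumbent
    key, k = jax.random.split(key)
    cand = newton(best_theta + temp * jax.random.normal(k, (theta_dim,)))
    val = objective(cand)
    if val < best_val:
        best_theta, best_val = cand, val
    temp *= 0.8

print("estimated parameters:", best_theta, "objective:", best_val)
```

Because the surrogate network is differentiable in the parameters, both the gradient and the Hessian of the moment-distance objective come from automatic differentiation, which is what makes the Newton step cheap once the network is trained.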
