Abstract

Estimation by minimizing the sum of squared residuals is a common method for estimating the parameters of regression functions; however, regression functions are not always known or of interest. Maximizing the likelihood function is an alternative if a distribution can be properly specified. However, cases can arise in which a regression function is not known, no additional moment conditions are indicated, and we have a distribution for the random quantities, but maximum likelihood estimation is difficult to implement. In this article, we present the least squared simulated errors (LSSE) estimator for such cases. The conditions for consistency and asymptotic normality are given. Finite sample properties are investigated via Monte Carlo experiments on two examples. Results suggest LSSE can perform well in finite samples. We discuss the estimator's limitations and conclude that the estimator is a viable option. We recommend Monte Carlo investigation of any given model to judge bias for a particular finite sample size of interest and to discern whether asymptotic approximations or resampling techniques are preferable for the construction of tests or confidence intervals.

Highlights

  • Minimizing the sum of squared residuals is a common estimation method for regression functions, but regression functions are not always known or of interest when the model is motivated by an underlying structural equation

  • The purpose of this article is to show the functioning of the least squared simulated errors (LSSE) estimator, not to address specific scientific problems: we cannot foresee the structural models that scientists will create in the future, so we focus on mathematical forms that challenge estimation rather than on their current scientific motivation

  • The LSSE estimator is consistent as the sample size and the number of simulation draws grow, and if the number of simulation draws rises faster than the square root of the sample size, it is asymptotically normal


Summary

Introduction

Minimizing the sum of squared residuals is a common estimation method for regression functions, but regression functions are not always known or of interest when the model is motivated by an underlying structural equation. If a distribution for random terms is specified, maximizing the likelihood function is an alternative estimation method that does not require an explicit regression function. Cases can arise in which the regression function is not known, no additional moment conditions are indicated, and we have a distribution for the random quantities, but maximum likelihood estimation is difficult to implement. Consider a simple example in which a response y to an exogenous variable x for each person w in a population of interest is modeled as y_w = α + (β · x_w)^{δ_w}, with x_w ≥ 0, α > 0, β > 0, and δ_w ∈ (0, 1). The least squared simulated errors (LSSE) estimator presented below provides a simple means of estimating such models.
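The core idea can be sketched in code. The following is a minimal illustration, not the article's implementation: it assumes δ_w is drawn uniformly on (0, 1), defines the simulated error for each observation as the observed response minus the average of R responses simulated under candidate parameters, and minimizes the sum of squared simulated errors by a crude grid search. The distributional choice, sample sizes, and grid are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data from the example model y_w = alpha + (beta * x_w)^{delta_w}.
# The uniform distributions for x and delta are illustrative assumptions.
n = 1000
alpha_true, beta_true = 1.0, 2.0
x = rng.uniform(0.1, 5.0, size=n)
delta = rng.uniform(0.0, 1.0, size=n)        # random exponent, one per person
y = alpha_true + (beta_true * x) ** delta

# LSSE-style objective: for candidate (alpha, beta), average R simulated
# responses per observation (common draws reused across candidates) and
# sum the squared gaps to the observed responses.
R = 200
delta_draws = rng.uniform(0.0, 1.0, size=R)

def lsse(alpha, beta):
    sim_mean = alpha + ((beta * x[:, None]) ** delta_draws[None, :]).mean(axis=1)
    return np.sum((y - sim_mean) ** 2)

# Crude grid search for illustration; a real application would use a
# numerical optimizer.
alphas = np.arange(0.5, 1.55, 0.1)
betas = np.arange(1.0, 3.05, 0.1)
obj = np.array([[lsse(a, b) for b in betas] for a in alphas])
ia, ib = np.unravel_index(obj.argmin(), obj.shape)
alpha_hat, beta_hat = alphas[ia], betas[ib]
print(alpha_hat, beta_hat)
```

Because the same R draws are reused for every candidate parameter vector, the objective surface is smooth in the parameters for a fixed seed, which matters when a derivative-based optimizer replaces the grid search.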

