Publisher Summary This chapter discusses parameter estimation in nonlinear regression models. As in the linear case, least squares is the most important estimation method, because many favorable properties known from linear theory can be assured at least asymptotically. Solving the corresponding nonlinear optimization problem generally requires iterative methods, and in concrete case studies the selection of an appropriate procedure from the many proposed in the literature, together with the choice of starting values, incremental changes, and step sizes, requires considerable experience. Examples of nonlinear regression models include empirical growth curves, exponential models, and Cobb–Douglas models. Least squares estimation is the main estimation method in nonlinear regression. Formulating the problem as an inadequate least squares problem may be worthwhile because the numerical treatment can be much simpler or because, with regard to statistical efficiency, it may be reasonable to reduce the dimension of the parameter vector. The investigation of weighted inadequate least squares approximation (WILSA) is therefore of basic importance, given that the model choice problem is difficult in the nonlinear case. The chapter discusses the approximation of a nonlinear response function within a given linear set of functions, polynomial approximation, the consistency of least squares estimators, the asymptotic distribution of least squares estimators, the asymptotic optimality of the GLSE without normality, maximum likelihood estimation, robust nonlinear regression, and confidence regions for the model.
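As a minimal illustration of the iterative least squares computation described above, the following sketch fits an exponential model y = a·exp(b·x) by a plain Gauss–Newton iteration (one of the many iterative schemes the chapter alludes to; the data, starting guess, and stopping rule are illustrative assumptions, not taken from the chapter):

```python
import numpy as np

# Synthetic data from y = a*exp(b*x) with small noise; true a = 2.0, b = 0.5.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(0.5 * x) + 0.01 * rng.standard_normal(x.size)

def residuals(theta):
    a, b = theta
    return y - a * np.exp(b * x)

def jacobian(theta):
    a, b = theta
    e = np.exp(b * x)
    # Partial derivatives of the residuals with respect to (a, b).
    return np.column_stack([-e, -a * x * e])

theta = np.array([1.0, 1.0])  # starting guess; its choice matters in practice
for _ in range(50):
    J = jacobian(theta)
    r = residuals(theta)
    # Gauss-Newton step: solve the linearized least squares problem J*step = -r.
    step = np.linalg.lstsq(J, -r, rcond=None)[0]
    theta = theta + step
    if np.linalg.norm(step) < 1e-10:
        break

print(theta)  # estimates close to the true values (2.0, 0.5)
```

A poor starting guess can make such an iteration diverge, which is why the chapter stresses that the choice of starting values and step sizes requires experience; damped or trust-region variants are commonly used to improve robustness.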