Abstract

System models are in fact nonlinear regression models, in that they relate response variables to a set of explanatory variables, so it is important to understand how such regression models are treated in statistics. Here we explain the two main approaches to parameter estimation in regression: least squares, where parameters are chosen to minimize the sum of squared errors, and maximum likelihood, where parameters are chosen to maximize the likelihood of obtaining the observed data. The three fundamental assumptions of ordinary least squares are that the model is correct, that the model error variance is constant, and that the model errors are independent. These assumptions, however, are often not satisfied for system models. For each assumption we consider when it is likely to be violated, how to identify such cases, and how to treat them. We also show in detail how to use the two R functions nls and optim, which can be used to estimate parameters in nonlinear regression.
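As a minimal sketch of the two approaches (not taken from the paper itself), the R code below fits a hypothetical exponential model y = a * exp(b * x) + error to synthetic data, first by least squares with nls and then by maximum likelihood with optim; the model, data, and starting values are assumptions chosen purely for illustration:

# Synthetic data for the hypothetical model y = a * exp(b * x) + error
set.seed(1)
x <- seq(0, 10, length.out = 50)
a_true <- 2; b_true <- 0.3
y <- a_true * exp(b_true * x) + rnorm(length(x), sd = 2)

# Least squares: nls chooses parameters to minimize the sum of squared errors
fit_ls <- nls(y ~ a * exp(b * x), start = list(a = 1, b = 0.1))
coef(fit_ls)

# Maximum likelihood: optim minimizes the negative log-likelihood, here
# assuming independent normal errors with constant variance (the OLS setting)
negloglik <- function(par) {
  a <- par[1]; b <- par[2]; sigma <- par[3]
  if (sigma <= 0) return(Inf)  # keep the error standard deviation valid
  -sum(dnorm(y, mean = a * exp(b * x), sd = sigma, log = TRUE))
}
fit_ml <- optim(c(a = 1, b = 0.1, sigma = 1), negloglik)
fit_ml$par  # under these assumptions, a and b closely match the nls estimates

Under the constant-variance normal-error assumptions stated in the abstract, the two criteria give essentially the same parameter estimates, which is why the comparison above is instructive.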
