Abstract

This chapter presents an overview of least-squares methods for estimating parameters by fitting experimental data. Least-squares methods yield the parameter estimates with the highest probability (maximum likelihood) of being correct, provided several critical assumptions are warranted; the chapter discusses these inherent assumptions. It describes several least-squares parameter estimation procedures, methods for evaluating confidence intervals for the determined parameters, and practical aspects of applying least-squares techniques to experimental data. Nonlinear least-squares analysis comprises a group of numerical procedures for evaluating the optimal values of the parameters in the vector a, given a set of experimental data. The chapter reviews several of the more common algorithms: the Gauss–Newton method and its derivatives, and the Nelder–Mead simplex method. The Gauss–Newton least-squares method is formulated as a system of Taylor series expansions of the fitting function. The Marquardt method is the most commonly used procedure for improving the convergence properties of the Gauss–Newton method.
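To make the Gauss–Newton formulation concrete, a standard statement of the idea (not quoted from the chapter) is the first-order Taylor expansion of the fitting function $f(x_i;\mathbf{a})$ about the current parameter estimate $\mathbf{a}^{(k)}$:

$$ f(x_i;\mathbf{a}) \;\approx\; f(x_i;\mathbf{a}^{(k)}) + \sum_{j} \frac{\partial f(x_i;\mathbf{a}^{(k)})}{\partial a_j}\,\Delta a_j, $$

which turns each iteration into a linear least-squares problem with normal equations $(J^\top J)\,\Delta\mathbf{a} = J^\top\mathbf{r}$, where $J$ is the Jacobian of $f$ with respect to the parameters and $\mathbf{r}$ is the residual vector. The Marquardt modification damps the step, $(J^\top J + \lambda\,\operatorname{diag}(J^\top J))\,\Delta\mathbf{a} = J^\top\mathbf{r}$, to improve convergence far from the minimum.

As a minimal illustration of these ideas in practice (the model, data, and parameter names below are assumptions chosen for the example, not taken from the chapter), the following sketch fits a single-exponential decay with SciPy's Levenberg–Marquardt implementation and reads approximate confidence intervals off the parameter covariance matrix:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(t, A, k, c):
    """Hypothetical fitting function f(t; a) with parameter vector a = (A, k, c)."""
    return A * np.exp(-k * t) + c

# Synthetic "experimental" data with independent Gaussian noise,
# matching the key statistical assumption behind least squares.
rng = np.random.default_rng(seed=0)
t = np.linspace(0.0, 5.0, 50)
y = model(t, 2.0, 1.3, 0.5) + rng.normal(scale=0.05, size=t.size)

# With no parameter bounds, curve_fit defaults to the Levenberg-Marquardt
# algorithm (Gauss-Newton with Marquardt damping).
popt, pcov = curve_fit(model, t, y, p0=(1.0, 1.0, 0.0))

# Approximate 1-sigma confidence intervals from the covariance matrix.
perr = np.sqrt(np.diag(pcov))
for name, value, err in zip(("A", "k", "c"), popt, perr):
    print(f"{name} = {value:.3f} +/- {err:.3f}")
```

The printed uncertainties are meaningful in the maximum-likelihood sense only under the assumptions the chapter emphasizes, chiefly independent, normally distributed noise on the dependent variable.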
