Abstract
The least-squares principle was invented by Carl Friedrich Gauss at the end of the 18th century for determining the orbits of planets. Since then, the method has become a major tool for parameter estimation from experimental data, and most existing parametric-identification methods can be related to it. The method is easy to understand and easy to implement because a closed-form solution exists. In the statistical literature it is also called linear regression; in the identification literature it is known as the equation-error method. This chapter introduces the principle of least squares: a mathematical procedure by which the unknown parameters of a mathematical model are chosen (estimated) so that the sum of the squares of a chosen error is minimized. The method is applied to the estimation of finite impulse response (FIR) models and of parametric models. The chapter then describes tests of the method on two industrial processes: a stand of a rolling mill and a glass-tube production process. The method was successful for the first process but failed for the second; the reasons for this failure are analyzed theoretically.
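To illustrate the closed-form solution referred to above, the following is a minimal sketch (not taken from the chapter) of least-squares estimation of a single-input FIR model y(t) = b_1 u(t-1) + ... + b_n u(t-n) + e(t), where the parameters are chosen to minimize the sum of squared equation errors. The function name fir_least_squares, the model order, and the use of NumPy are assumptions made for illustration only.

```python
import numpy as np

def fir_least_squares(u, y, n):
    """Estimate n FIR coefficients b_1..b_n from input u and output y
    by minimizing the sum of squared equation errors (illustrative sketch)."""
    N = len(y)
    # Regression matrix: the row for time t contains [u(t-1), ..., u(t-n)]
    Phi = np.column_stack([u[n - k : N - k] for k in range(1, n + 1)])
    Y = y[n:N]
    # Closed-form least-squares solution of the normal equations
    b_hat, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return b_hat

# Usage example with simulated data (illustrative values):
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
b_true = np.array([0.5, -0.3, 0.1])
y = np.convolve(u, np.concatenate(([0.0], b_true)))[: len(u)] + 0.01 * rng.standard_normal(500)
print(fir_least_squares(u, y, 3))  # estimates close to b_true
```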