Abstract

The control problem is now common in various technical, economic, and social applications. Obtaining good control requires a model that satisfies the conditions of adequacy and accuracy, which in turn requires solving the identification problem for the constructed model. When modeling objects in such applications, multidimensional autoregressive equations with regressors located at adjacent nodes of the spatial coordinates are typically used. In this case there is usually a significant correlation between the regressors, producing an effect of quasi-multicollinearity that inflates the standard errors of the autoregressive parameter estimates and biases the estimates themselves. The presence of these two error components forces a trade-off between bias and variance, a problem well known in machine learning.

We focus on multiple autoregressive equations based on the approximation of homogeneous partial differential equations with constant parameters by difference equations possessing the conservativity property. A difference scheme is called conservative if it preserves on the grid the same conservation laws as the original differential problem. Within this paper, a comparative analysis is carried out of the solution of the parametric identification problem using the ordinary least squares method (OLS), ridge regression, and two dimensionality-reduction methods proposed by the authors. Both positive and negative parameters are considered.

The comparison shows that the quality of estimation depends strongly on the intensity of the observation noise. At low noise, all methods cope successfully with the identification problem. The authors' dimensionality-reduction method, which exploits the conservativity property of the difference scheme, remains satisfactorily effective as the noise intensity increases when the coefficients are positive; for negative coefficients, the method shows no positive effect.

Keywords: Autoregressive model; Finite difference equations; Identification; OLS estimates; Biased parameter estimates; Dimensionality reduction
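To illustrate the quasi-multicollinearity effect mentioned in the abstract, the following minimal Python sketch compares OLS and ridge estimates on a synthetic regression with nearly collinear regressors and observation noise. It is not the authors' method: the data-generating model, the coefficient values, and the regularisation strength are hypothetical choices made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two regressors mimic values at adjacent spatial nodes,
# so they are almost collinear (quasi-multicollinearity); a third is independent.
n = 200
base = rng.normal(size=n)
X = np.column_stack([
    base + 0.05 * rng.normal(size=n),   # nearly identical to the next column
    base + 0.05 * rng.normal(size=n),
    rng.normal(size=n),
])
beta_true = np.array([0.6, 0.3, -0.4])  # assumed "true" coefficients for this sketch

noise_sigma = 0.5                        # observation noise intensity
y = X @ beta_true + noise_sigma * rng.normal(size=n)

# Ordinary least squares: solve (X^T X) beta = X^T y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge regression: solve (X^T X + lambda * I) beta = X^T y
lam = 1.0                                # illustrative regularisation strength
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

print("true coefficients:", beta_true)
print("OLS estimate:     ", beta_ols)
print("Ridge estimate:   ", beta_ridge)
```

With strongly correlated columns the OLS estimates of the first two coefficients vary wildly between runs, while the ridge estimates are stable but shrunk toward zero, which is the bias-variance trade-off the abstract refers to.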
