Abstract
This study considers the statistical estimation of relations presented by implicit functions. Such structures define mutual interconnections among variables rather than the dependence of an outcome variable on predictor variables, as in regular regression analysis. In the simple case of two variables, pairwise regression modeling produces two different lines, one for each variable's dependence on the other, whereas building an implicit relation yields a single invertible model composed of the two simple regressions. Modeling an implicit linear relation for multiple variables can be expressed as a generalized eigenproblem of the covariance matrix of the variables in the metric of the covariance matrix of their errors. For unknown errors, this work describes their estimation by the residual errors of each variable in its regression on the other predictors. The generalized eigenproblem can then be reduced to the diagonalization of a special matrix built from the variables' covariance matrix and its inverse. Numerical examples demonstrate the good properties of the eigenvector solution for building a unique equation of the relation among all the variables. The proposed approach can be useful in practical regression modeling when all variables contain unobserved errors, which is a common situation in applied problems.
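The following is a minimal sketch (not the authors' code) of the procedure summarized above, assuming a diagonal error-covariance matrix estimated from each variable's residual variance when it is regressed on all the other variables; the generalized eigenproblem is whitened into an ordinary symmetric eigenproblem rather than using the paper's special-matrix construction.

```python
import numpy as np

def implicit_linear_relation(X):
    """Return coefficients a of an implicit linear relation a^T (x - x_bar) ~ 0.

    X : (n_observations, n_variables) data matrix with errors in all variables.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)                   # center all variables
    C = (Xc.T @ Xc) / (n - 1)                 # covariance matrix of the variables

    # Estimate each variable's error variance by its residual variance
    # in the regression of that variable on all the other variables.
    err_var = np.empty(p)
    for j in range(p):
        others = np.delete(Xc, j, axis=1)
        beta, *_ = np.linalg.lstsq(others, Xc[:, j], rcond=None)
        resid = Xc[:, j] - others @ beta
        err_var[j] = resid @ resid / (n - p)
    D = np.diag(1.0 / np.sqrt(err_var))       # E^{-1/2} for the diagonal error covariance E

    # Generalized eigenproblem C a = lambda E a, rewritten as the ordinary
    # symmetric eigenproblem (D C D) b = lambda b with a = D b.
    evals, evecs = np.linalg.eigh(D @ C @ D)
    a = D @ evecs[:, 0]                       # smallest eigenvalue (a common choice
                                              # in errors-in-variables fitting)
    return a / np.linalg.norm(a)
```

Taking the eigenvector of the smallest generalized eigenvalue is a standard choice in errors-in-variables settings and is used here only as an illustration; the paper's own selection of the eigenvector and its reduction via the inverse covariance matrix may differ in detail.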