Abstract
We consider linear regression models where both input data (the observations of independent variables) and output data (the observations of the dependent variable) are affected by loss of information caused by uncertainty, indeterminacy, rounding or censoring. Instead of real-valued (crisp) data, only intervals are available. We study a possibilistic generalization of the least squares estimator, the so-called OLS-set for the interval model. Investigation of the OLS-set allows us to quantify whether the replacement of real-valued (crisp) data by interval values can have a significant impact on our knowledge of the value of the OLS estimator. We show that in the general case, very elementary questions about properties of the OLS-set are computationally intractable (assuming P≠NP). We also focus on restricted versions of the general interval linear regression model with crisp inputs. Taking advantage of the fact that in the crisp input – interval output model the OLS-set is a zonotope, we design both exact and approximate methods for its description. We also discuss special cases of the regression model, e.g. a model with repeated observations.
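The zonotope structure mentioned above follows from the linearity of the OLS map in the outputs: with a crisp design matrix X and interval outputs, the estimator b(y) = (X'X)⁻¹X'y maps the box of admissible output vectors to a zonotope, whose vertices lie among the images of the box's vertices. The following minimal sketch illustrates this by brute-force vertex enumeration; the data and function names are hypothetical, and this is not the exact/approximate description method developed in the paper.

```python
import numpy as np

def ols(X, y):
    # ordinary least squares estimate b = (X'X)^{-1} X'y
    return np.linalg.solve(X.T @ X, X.T @ y)

def ols_set_vertex_images(X, y_lo, y_hi):
    # Image of every vertex of the output box [y_lo, y_hi] under the
    # (linear) OLS map; the OLS-set is the convex hull of these points.
    # Exponential in the number of observations -- illustration only.
    n = len(y_lo)
    pts = []
    for mask in range(2 ** n):
        choose_hi = [(mask >> i) & 1 for i in range(n)]
        y = np.where(choose_hi, y_hi, y_lo)
        pts.append(ols(X, y))
    return np.array(pts)

# toy data (hypothetical): intercept + one regressor, 3 observations
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y_lo = np.array([0.9, 1.8, 3.1])
y_hi = np.array([1.1, 2.2, 3.3])

pts = ols_set_vertex_images(X, y_lo, y_hi)
# By linearity, the OLS estimate at the interval midpoints is the
# centre of the zonotope (the mean of the vertex images).
center = ols(X, (y_lo + y_hi) / 2)
```

Averaging over all box vertices gives the box centre, and a linear map commutes with averaging, so the mean of the vertex images coincides with the midpoint estimate.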