Abstract

Orthogonal regression extends the classical regression framework by assuming that the data may contain errors in both the dependent and independent variables. This approach often outperforms classical regression in real-world scenarios. However, standard algorithms for solving the orthogonal regression problem require the computation of singular value decompositions (SVDs), which can be computationally expensive and impractical for real-world problems. In this work, we propose a new approach to the orthogonal regression problem based on a regularized squared loss. The method follows an online learning strategy, which makes it more flexible for different types of applications. The algorithm is derived in primal and dual variables, and the latter formulation allows the introduction of kernels for nonlinear modeling. We compare our proposed orthogonal regression algorithm to a corresponding classical regression algorithm using both synthetic and real-world datasets from different applications. Our algorithm achieved better results on most of the datasets.
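
For context, the classical SVD-based solution that the abstract contrasts against can be sketched in a few lines: total least squares (the standard formulation of linear orthogonal regression) takes its coefficients from the right singular vector of the augmented data matrix associated with the smallest singular value. The sketch below is illustrative only; the function name `tls_fit` and the synthetic data are ours, not from the paper, and the full SVD it computes is exactly the per-fit cost that motivates an online alternative.

```python
import numpy as np

def tls_fit(X, y):
    """Total least squares (orthogonal regression) via SVD.

    Fits y ~ X @ beta while allowing errors in both X and y,
    minimizing the sum of squared orthogonal distances.
    """
    n, d = X.shape
    # Stack predictors and response into one augmented matrix.
    Z = np.hstack([X, y.reshape(-1, 1)])
    # The TLS solution lies in the right singular vector
    # associated with the smallest singular value of Z.
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]                       # shape (d + 1,)
    # Recover the coefficients from this null-space direction
    # (assumes the last component is nonzero).
    return -v[:d] / v[d]

# Illustrative example: noise in both the inputs and the outputs.
rng = np.random.default_rng(0)
x_true = np.linspace(0, 10, 200)
X = (x_true + rng.normal(scale=0.3, size=200)).reshape(-1, 1)
y = 2.0 * x_true + rng.normal(scale=0.3, size=200)
print(tls_fit(X, y))  # slope estimate close to [2.0]
```

Note that each fit requires a fresh SVD of the full augmented matrix, so the estimate cannot be updated cheaply as new samples arrive; this is the limitation an online, regularized-loss formulation is designed to avoid.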
