Abstract

Least squares regression (LSR) is widely used in statistics because it admits a closed-form solution, and it can be applied to supervised, semisupervised, and multiclass learning. However, LSR begins to fail and its discriminative ability cannot be guaranteed when the original data are corrupted by noise. In practice, noise is unavoidable and can greatly distort the error measurement in LSR. To cope with this problem, a robust supervised LSR (RSLSR) is proposed to eliminate the effect of noise and outliers. The loss function adopts the ℓ2,p-norm (0 < p ≤ 2) instead of the square loss. In addition, a probability weight is assigned to each sample to determine whether the sample is a normal point or not. Its physical meaning is clear: if the point is normal, the probability weight is 1; otherwise, the weight is 0. To effectively solve the concave problem, an iterative algorithm is introduced, in which additional weights are added to penalize normal samples with large errors. We also extend RSLSR to robust semisupervised LSR (RSSLSR) to fully exploit the limited labeled samples. Extensive classification experiments on corrupted data illustrate the robustness of the proposed methods.
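To make the optimization described above concrete, here is a minimal sketch in Python/NumPy of how such an iteratively reweighted scheme could look. It is an illustration under stated assumptions, not the paper's exact algorithm: it assumes a multiclass setup with one-hot labels, absorbs the bias into the weight matrix, uses the standard IRLS weight d_i = (p/2)·e_i^(p−2) for an ℓ2,p loss, and sets the 0/1 probability weights by simply keeping the fraction keep_ratio of samples with the smallest residuals. The names rslsr_fit and keep_ratio, and this hard-thresholding rule for the probability weights, are hypothetical.

```python
import numpy as np

def rslsr_fit(X, Y, p=1.0, keep_ratio=0.9, n_iter=30, eps=1e-8):
    """Illustrative IRLS-style sketch of robust LSR with an l_{2,p} loss.

    X: (n, d) feature matrix; Y: (n, c) one-hot label matrix.
    keep_ratio: fraction of samples treated as normal (weight 1);
    this simple thresholding rule is an assumption, not the paper's
    exact update for the probability weights.
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])        # absorb bias b into W
    W = np.linalg.lstsq(Xb, Y, rcond=None)[0]   # plain LSR initialization
    s = np.ones(n)                              # 0/1 probability weights
    for _ in range(n_iter):
        resid = np.linalg.norm(Xb @ W - Y, axis=1) + eps
        # Probability weights: samples with the largest residuals are
        # flagged as outliers (s_i = 0), the rest are normal (s_i = 1).
        k = int(keep_ratio * n)
        s = np.zeros(n)
        s[np.argsort(resid)[:k]] = 1.0
        # IRLS weights that reweight the square loss toward l_{2,p}:
        # minimizing sum_i d_i * e_i^2 with d_i = (p/2) * e_i^(p-2)
        # matches the stationarity condition of sum_i e_i^p.
        d_w = s * (p / 2.0) * resid ** (p - 2.0)
        # Weighted least squares update for W (small ridge term keeps
        # the system well-posed when many samples are dropped).
        A = Xb.T @ (d_w[:, None] * Xb) + 1e-6 * np.eye(d + 1)
        W = np.linalg.solve(A, Xb.T @ (d_w[:, None] * Y))
    return W, s
```

In the RSSLSR extension, unlabeled samples would additionally enter the objective through estimated labels; that semisupervised step is beyond this sketch.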
