Abstract

Discriminative least squares regression (DLSR) aims to learn relaxed regression labels in place of strict zero-one labels. However, the ε-draggings technique, which forces the labels of different classes to move in opposite directions, can also enlarge the distances between labels of the same class, and aggressively pursuing relaxed labels may lead to overfitting. To address these problems, we propose a low-rank discriminative least squares regression model (LRDLSR) for multi-class image classification. Specifically, LRDLSR imposes a class-wise low-rank constraint on the relaxed labels obtained via a non-negative relaxation matrix, improving their within-class compactness and similarity. Moreover, LRDLSR introduces an additional regularization term on the learned labels to avoid overfitting. We show that these two improvements help to learn a more discriminative projection for regression, thus achieving better classification performance. Experimental results over a range of image datasets demonstrate the effectiveness of the proposed LRDLSR method. The Matlab code of the proposed method is available at https://github.com/chenzhe207/LRDLSR.
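To make the ε-draggings idea concrete, the sketch below shows the basic DLSR-style alternation that LRDLSR builds on: relaxed targets T = Y + B ⊙ M, where B has +1 entries where the one-hot label Y is 1 and −1 elsewhere, and the non-negative matrix M drags targets of different classes apart. This is a minimal illustration with hypothetical toy data, not the authors' implementation; the full LRDLSR model would additionally add a class-wise nuclear-norm (low-rank) penalty on T and a regularizer on T to curb overfitting, which are omitted here for brevity.

```python
import numpy as np

def epsilon_dragging_targets(Y, M):
    """Relaxed targets T = Y + B ⊙ M (ε-draggings).

    B_ij = +1 if Y_ij == 1 else -1, so positive entries of M push
    correct-class scores above 1 and wrong-class scores below 0.
    """
    B = np.where(Y == 1, 1.0, -1.0)
    return Y + B * M

# Toy data (hypothetical): 4 samples, 2 features, 2 classes.
X = np.array([[1.0, 0.2], [0.9, 0.1], [0.1, 1.0], [0.2, 0.9]])
Y = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)

lam = 0.1                      # ridge regularization weight (assumed)
M = np.zeros_like(Y)           # non-negative dragging matrix
B = np.where(Y == 1, 1.0, -1.0)

for _ in range(10):            # alternate between W and M
    T = epsilon_dragging_targets(Y, M)
    # Closed-form ridge regression: W = (X^T X + lam I)^{-1} X^T T
    W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ T)
    # Update M: drag only in the allowed direction, clipped at zero
    R = X @ W - Y
    M = np.maximum(B * R, 0.0)

preds = (X @ W).argmax(axis=1)  # classify by largest regression score
```

In the full LRDLSR objective, each class block of T would also be encouraged to be low-rank so that same-class labels stay compact rather than being scattered by the dragging step.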
