Abstract
The original support vector machine (SVM) uses the hinge loss, which is non-differentiable and therefore makes the optimization problem difficult to solve, particularly for regularized SVMs such as the ℓ1-regularized variant. Moreover, the hinge loss is sensitive to noise. To circumvent these drawbacks, a huberized pinball loss function is proposed. Like the pinball loss, which is related to the quantile distance, it is less sensitive to noise. The proposed loss function is differentiable everywhere, and this differentiability can significantly reduce the computational cost of the SVM algorithm. By applying the elastic net penalty, a support vector machine classifier with the huberized pinball loss (HPSVM) is proposed. Owing to the continuous differentiability of the huberized pinball loss, the proximal gradient method is used to solve the proposed model. Numerical experiments on synthetic data and real-world datasets confirm the robustness and effectiveness of the proposed method, and a statistical comparison shows the significant difference between the proposed method and the other compared ones.
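The abstract does not give the paper's exact formulas, but the ingredients it names (a smoothed pinball loss, an elastic net penalty, and proximal gradient iterations) can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: the particular quadratic smoothing `huberized_pinball`, the parameter names `tau`, `delta`, `lam1`, `lam2`, and the fixed-step solver `hpsvm_fit` are all assumptions chosen to make the pieces fit together.

```python
import numpy as np

def huberized_pinball(u, tau=0.5, delta=0.1):
    """One plausible 'huberized' pinball loss (hypothetical form):
    equals the pinball loss tau*u for u >= delta and (tau-1)*u for
    u <= -delta, with a quadratic bridge on (-delta, delta) chosen so
    that the loss and its first derivative are continuous everywhere."""
    u = np.asarray(u, dtype=float)
    quad = u**2 / (4 * delta) + (tau - 0.5) * u + delta / 4
    return np.where(u >= delta, tau * u,
                    np.where(u <= -delta, (tau - 1) * u, quad))

def huberized_pinball_grad(u, tau=0.5, delta=0.1):
    """Derivative of the smoothed loss; continuous, unlike hinge/pinball."""
    u = np.asarray(u, dtype=float)
    return np.where(u >= delta, tau,
                    np.where(u <= -delta, tau - 1,
                             u / (2 * delta) + tau - 0.5))

def prox_elastic_net(v, step, lam1, lam2):
    """Proximal operator of step*(lam1*||w||_1 + (lam2/2)*||w||^2):
    soft-thresholding followed by a multiplicative shrinkage."""
    return np.sign(v) * np.maximum(np.abs(v) - step * lam1, 0.0) / (1.0 + step * lam2)

def hpsvm_fit(X, y, tau=0.5, delta=0.1, lam1=1e-3, lam2=1e-3,
              step=0.1, iters=500):
    """Proximal gradient descent for
    min_w  mean_i loss(1 - y_i * x_i @ w) + lam1*||w||_1 + (lam2/2)*||w||^2,
    alternating a gradient step on the smooth loss with the penalty's prox."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        u = 1.0 - y * (X @ w)  # margin residuals
        # gradient of the average loss w.r.t. w: -mean_i h'(u_i) * y_i * x_i
        g = -(X * (y * huberized_pinball_grad(u, tau, delta))[:, None]).mean(axis=0)
        w = prox_elastic_net(w - step * g, step, lam1, lam2)
    return w
```

Because the smoothed loss has a continuous gradient, the plain proximal gradient iteration above applies directly; with the non-smooth hinge loss one would instead need subgradient or splitting schemes, which is the computational advantage the abstract points to.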
Published in: Engineering Applications of Artificial Intelligence