Abstract

Symmetric loss functions are widely used in regression algorithms to estimate conditional means. The Huber loss, a smooth symmetric loss, can be optimized efficiently and offers a degree of robustness. However, mean estimators can perform poorly when the noise distribution is asymmetric or heavy-tailed (for example, in the presence of outliers), and estimators beyond the mean become necessary. In such cases, quantile regression is a natural choice: it estimates quantiles rather than means by way of asymmetric loss functions. In this paper, an asymmetric Huber loss function is proposed that penalizes overestimation and underestimation differently, so as to handle more general noise. Moreover, a smooth truncated version of the proposed loss is introduced to strengthen robustness to outliers. A concave-convex procedure (CCCP) is developed in the primal space, with a proof of convergence, to handle the non-convexity of the truncated objective. Experiments on both artificial and benchmark datasets verify the robustness of the proposed methods.
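
To make the construction concrete, one standard way to obtain such a loss is to weight the Huber loss by the quantile level and then truncate it for outlier robustness. The sketch below is illustrative only: the symbols (threshold κ, quantile level τ, truncation level s) and the hard truncation are assumptions, and the paper's exact parameterization, including its smooth truncation, may differ.

```latex
% Huber loss with threshold \kappa > 0 (assumed notation)
L_\kappa(u) =
  \begin{cases}
    \tfrac{1}{2}u^2,                              & |u| \le \kappa, \\
    \kappa\bigl(|u| - \tfrac{1}{2}\kappa\bigr),   & |u| > \kappa,
  \end{cases}
% Asymmetric (quantile-weighted) variant at level \tau \in (0, 1):
% overestimation (u < 0) and underestimation (u > 0) get different weights.
\qquad
\rho_\tau^\kappa(u) = \bigl|\tau - \mathbb{1}\{u < 0\}\bigr|\, L_\kappa(u).

% Hard truncation at s > 0 (the paper uses a smooth variant),
% written as a difference of two convex functions:
\bar{\rho}_{\tau}^{\kappa}(u)
  = \min\bigl(\rho_\tau^\kappa(u),\, s\bigr)
  = \rho_\tau^\kappa(u) - \max\bigl(\rho_\tau^\kappa(u) - s,\, 0\bigr).
```

The last line exhibits the convex-minus-convex structure that a concave-convex procedure exploits: at each iteration, the subtracted convex term is linearized and the resulting convex surrogate is minimized, which is consistent with the primal-space CCCP described in the abstract.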
