Abstract

Ordinal regression is one of the most important tasks in supervised learning. Support vector ordinal regression (SVOR) is an appealing method for tackling ordinal regression problems. However, owing to the complexity of the SVOR formulation and the high cost of kernel computation, traditional SVOR solvers are inefficient for large-scale training. To address this problem, we first highlight a special SVOR formulation whose thresholds are described implicitly, so that its concise dual formulation admits state-of-the-art asynchronous parallel coordinate descent algorithms such as AsyGCD. To further accelerate SVOR training, we propose two novel asynchronous parallel coordinate descent algorithms, AsyACGD and AsyORGCD. AsyACGD accelerates AsyGCD with an active set strategy. AsyORGCD is designed specifically for SVOR: it keeps the thresholds ordered during training, and thus attains good performance in less time. Experimental results on several large-scale ordinal regression datasets demonstrate the superiority of the proposed algorithms.
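The computational idea behind the abstract is asynchronous (lock-free) parallel coordinate descent on a dual quadratic program. The following is a minimal illustrative sketch only, assuming a generic box-constrained SVM-style dual min_a ½aᵀQa − eᵀa with 0 ≤ a_i ≤ C; the toy matrix Q, the value of C, the thread count, and the random-coordinate selection rule are all assumptions made for this example and are not the paper's AsyGCD, AsyACGD, or AsyORGCD updates (which, e.g., use greedy coordinate selection and handle SVOR's ordered thresholds).

```python
# Illustrative sketch: Hogwild-style asynchronous coordinate descent on a
# generic box-constrained dual  min_a 0.5*a'Qa - e'a,  0 <= a_i <= C.
# All problem data below are synthetic assumptions for demonstration.
import numpy as np
import threading

n, C, n_threads, iters_per_thread = 200, 1.0, 4, 5000

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 10))
Q = X @ X.T + 1e-3 * np.eye(n)        # toy positive-definite "kernel" matrix

alpha = np.zeros(n)                   # shared dual variables, updated without locks

def worker(seed):
    local_rng = np.random.default_rng(seed)
    for _ in range(iters_per_thread):
        i = local_rng.integers(n)                  # pick a coordinate at random
        g = Q[i] @ alpha - 1.0                     # partial gradient w.r.t. alpha_i
        # exact one-dimensional minimization along coordinate i, projected onto [0, C];
        # the read of alpha may be stale because other threads write concurrently
        alpha[i] = np.clip(alpha[i] - g / Q[i, i], 0.0, C)

threads = [threading.Thread(target=worker, args=(s,)) for s in range(n_threads)]
for t in threads:
    t.start()
for t in threads:
    t.join()

obj = 0.5 * alpha @ Q @ alpha - alpha.sum()
print(f"dual objective after asynchronous updates: {obj:.4f}")
```

Note that in CPython the global interpreter lock limits true parallelism, so this sketch only demonstrates the lock-free update pattern; practical asynchronous solvers of this kind are typically implemented in C/C++ with shared-memory threads.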
