Abstract

Although least-squares regression (LSR) has achieved great success in regression tasks, its discriminating ability is limited because the margins between classes are not explicitly preserved. To mitigate this issue, dragging techniques have been introduced to remodel the regression targets of LSR. Such variants have gained some performance improvement, but their generalization ability remains unsatisfactory on real data because the structure-related information typically contained in the data is not exploited. To overcome this shortcoming, in this article we construct a multioutput regression model that exploits intraclass correlations and input-output relationships via a structure matrix. We also discriminatively enlarge the regression margins by embedding a metric that is guided automatically by the training data. To better handle such structured data with ordinal labels, we encode the model output as cumulative attributes and thereby obtain our proposed model, termed structure-exploiting discriminative ordinal multioutput regression (SEDOMOR). To further enhance its distinguishing ability, we extend SEDOMOR to nonlinear counterparts with kernel functions and deep architectures. We also derive the corresponding optimization algorithms for solving these models and prove their convergence. Finally, extensive experiments verify the effectiveness and superiority of the proposed methods.
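The cumulative-attribute encoding mentioned in the abstract can be illustrated with a minimal sketch. Note this is a generic illustration of the standard cumulative-attribute idea (each ordinal rank "accumulates" the indicator bits of all lower ranks), paired with a plain multioutput least-squares fit; SEDOMOR's structure matrix, learned metric, and discriminative margin terms are not shown, and the function names, label range, and 0.5 decoding threshold here are our own choices, not the paper's.

```python
import numpy as np

def cumulative_encode(y, num_levels):
    """Encode ordinal labels as cumulative attribute vectors.

    For a label y in {0, ..., num_levels-1}, position i is 1 iff i <= y,
    so neighbouring ranks share most of their target vector.
    """
    y = np.asarray(y)
    return (np.arange(num_levels)[None, :] <= y[:, None]).astype(float)

# Ordinal labels (e.g. severity grades 0..3)
y = np.array([0, 2, 3, 1])
A = cumulative_encode(y, num_levels=4)
# A[1] is [1, 1, 1, 0]: rank 2 accumulates the indicators of ranks 0..2

# A plain multioutput least-squares fit against the cumulative targets
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 5))                 # toy feature matrix
W, *_ = np.linalg.lstsq(X, A, rcond=None)   # multioutput LSR weights

# Decode a prediction back to a rank by counting activated attributes
pred_rank = np.sum(X @ W > 0.5, axis=1) - 1
```

Because adjacent ranks differ in only one attribute bit, the squared error between a prediction and a cumulative target grows with ordinal distance, which is what makes this encoding a natural fit for ordinal multioutput regression.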
