Abstract

This work proposes an approach to the linear regression problem that maximizes the dependence between the predicted values and the response variable. The proposed algorithm uses the Hilbert-Schmidt independence criterion (HSIC) as a generic measure of dependence and can therefore maximize both linear and nonlinear dependencies. The algorithm is particularly relevant in applications such as continuous analysis of affective speech, where linear dependence, i.e., correlation, is commonly used as the measure of goodness of fit. The applicability of the proposed algorithm is verified on two synthetic datasets, one affective speech dataset, and one affective bodily posture dataset. Experimental results show that the proposed algorithm outperforms support vector regression (SVR) in 84% (264/314) of the studied cases and is noticeably faster than SVR, by a factor of roughly 25 on average.
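For readers unfamiliar with the criterion, the following minimal sketch (not taken from the paper, and not the authors' algorithm) shows one common way to compute a biased empirical HSIC estimate between a vector of predictions and a vector of responses using Gaussian kernels; the function names, kernel choice, and bandwidth are illustrative assumptions.

import numpy as np

def rbf_kernel(x, sigma=1.0):
    # Gaussian kernel matrix for a 1-D sample: k(a, b) = exp(-(a - b)^2 / (2 sigma^2)).
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def empirical_hsic(y_pred, y_true, sigma=1.0):
    # Biased empirical HSIC estimate: (1 / n^2) * trace(K H L H),
    # where K and L are kernel matrices of the two samples and H centres them.
    y_pred = np.asarray(y_pred, dtype=float)
    y_true = np.asarray(y_true, dtype=float)
    n = len(y_pred)
    K = rbf_kernel(y_pred, sigma)
    L = rbf_kernel(y_true, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2

A larger value of this estimate indicates stronger (possibly nonlinear) dependence between predictions and responses; a dependence-maximizing regression approach of the kind described in the abstract would choose model parameters so as to increase such a measure.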
