Abstract

Nuisance attribute projection (NAP) is an effective method for reducing session variability in SVM-based speaker verification systems. Because the expanded feature space of a nonlinear kernel is usually high- or infinite-dimensional, it is difficult to find nuisance directions via conventional eigenvalue analysis and to perform the projection directly in the feature space. In this paper, two different approaches to nonlinear kernel NAP are investigated and compared. In the first approach, the NAP projection is formulated in the expanded feature space, and kernel PCA is employed for the kernel eigenvalue analysis. In the second approach, a gradient descent algorithm is proposed to find a projection over the input variables. Experimental results on the 2006 NIST SRE corpus show that both kinds of NAP reduce unwanted variability in nonlinear kernels and improve verification performance, and that NAP performed in the expanded feature space using kernel PCA obtains slightly better performance than NAP over the input variables.
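To make the baseline technique concrete, below is a minimal NumPy sketch of standard linear-kernel NAP, which the paper extends to nonlinear kernels. The function name, variable names, and the use of the within-session scatter matrix to define nuisance directions are our own illustrative choices, not taken from the paper; nuisance directions are taken as the leading eigenvectors U of the within-session scatter, and the projection P = I - U U^T removes them.

```python
import numpy as np

def nap_projection(vectors, session_labels, k):
    """Estimate a rank-k NAP projection P = I - U U^T, where U spans the
    top-k eigenvectors of the within-session scatter matrix.
    Illustrative sketch only; names and details are assumptions."""
    X = np.asarray(vectors, dtype=float)
    labels = np.asarray(session_labels)
    d = X.shape[1]
    # Within-class scatter: subtract each speaker's mean over its sessions,
    # so the remaining variability is the unwanted session/channel effect.
    W = np.zeros((d, d))
    for lab in np.unique(labels):
        G = X[labels == lab]
        Gc = G - G.mean(axis=0)
        W += Gc.T @ Gc
    # Nuisance directions = leading eigenvectors of W (eigh sorts ascending).
    eigvals, eigvecs = np.linalg.eigh(W)
    U = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
    return np.eye(d) - U @ U.T

# Usage: apply P to SVM input vectors before training and scoring.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 5))           # 8 sessions, 5-dim toy vectors
labels = [0, 0, 0, 0, 1, 1, 1, 1]     # two speakers, four sessions each
P = nap_projection(X, labels, k=1)
X_clean = X @ P.T                     # session-compensated vectors
```

The resulting P is a symmetric, idempotent projection of rank d - k; the paper's contribution is doing the analogous eigenvalue analysis when the feature map is only accessible through a kernel.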
