Abstract

Kernel sliced inverse regression (KSIR) is a natural framework for nonlinear dimension reduction using the mapping induced by kernels. However, there are numerical, algorithmic, and conceptual subtleties in making the method robust and consistent. We apply two types of regularization in this framework to address computational stability and generalization performance. We also provide an interpretation of the algorithm and prove consistency. The utility of this approach is illustrated on simulated and real data.

Highlights

  • The goal of dimension reduction in the standard regression/classification setting is to summarize the information in the p-dimensional predictor variable X relevant to predicting the univariate response variable Y

  • We prove the asymptotic consistency of the effective dimension reduction (e.d.r.) directions estimated by regularized kernel sliced inverse regression (RKSIR) and provide conditions under which the rate of convergence is O_p(n^{-1/4})

  • We compare the effectiveness of the linear dimension reduction methods sliced inverse regression (SIR), sliced average variance estimation (SAVE), and principal Hessian directions (pHd) with regularized kernel sliced inverse regression (RKSIR) by examining the predictive accuracy of a nonlinear kernel regression model on the reduced space (a hypothetical harness for this comparison is sketched below)
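
Purely as an illustration of this comparison protocol, the sketch below wires an arbitrary reduction method into a downstream nonlinear kernel regressor, assuming scikit-learn is available; `reduce_dimension` is a hypothetical stand-in for SIR, SAVE, pHd, or RKSIR, none of which is implemented here.

```python
# Hypothetical harness for the comparison described above: reduce the
# predictors, fit a nonlinear kernel regression on the reduced space, and
# score predictive accuracy on held-out data.
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

def evaluate_reduction(X, y, reduce_dimension, d=2, seed=0):
    # reduce_dimension(X_train, y_train, d) -> callable mapping R^p to R^d;
    # it stands in for SIR, SAVE, pHd, or RKSIR.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=seed)
    project = reduce_dimension(X_tr, y_tr, d)      # learn the e.d.r. directions
    model = KernelRidge(kernel="rbf", alpha=1.0)   # nonlinear regressor
    model.fit(project(X_tr), y_tr)                 # regress on the reduced space
    return mean_squared_error(y_te, model.predict(project(X_te)))
```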

Summary

Introduction

The goal of dimension reduction in the standard regression/classification setting is to summarize the information in the p-dimensional predictor variable X relevant to predicting the univariate response variable Y. The basic idea in applying kernel methods is the application of a linear algorithm to the data mapped into a feature space induced by a kernel function. Nonlinear extensions of some classical linear dimension reduction methods using this approach include kernel principal component analysis [6], kernel Fisher discriminant analysis [8], and kernel canonical correlation analysis [9]. This idea was applied to sliced inverse regression (SIR) in [10, 11], resulting in the kernel sliced inverse regression (KSIR) method, which allows for the estimation of nonlinear e.d.r. directions. In KSIR, the p-dimensional data are projected into a Hilbert space H through a feature map φ : X → H, and the nonlinear features are recovered as eigenfunctions of the operator Σ_φ⁻¹Γ_φ, where Σ_φ = Cov(φ(X)) is the covariance operator of the mapped predictors and Γ_φ = Cov(E[φ(X) | Y]) is the covariance operator of their conditional expectation given Y. A minimal illustrative sketch of a regularized sample version of this eigenproblem is given after this paragraph.
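
To make this concrete, the following sketch solves a regularized sample version of this eigenproblem in its dual (kernel-matrix) form, under assumptions of ours rather than the paper's: an RBF kernel, equal-frequency slices of Y, and Tikhonov regularization added to the covariance term; all parameter names and values are illustrative.

```python
# A minimal RKSIR sketch (illustrative, not the paper's specification).
# Directions are expanded over training points, beta = sum_i a_i * phi(x_i),
# turning the operator eigenproblem into an n x n generalized eigenproblem.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def rksir(X, y, n_slices=10, d=2, gamma=1.0, s=1e-3):
    n = len(X)
    K = np.exp(-gamma * cdist(X, X, "sqeuclidean"))  # Gram matrix k(x_i, x_j)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H                                   # center in feature space

    # Between-slice covariance in dual coordinates: sum_h p_h m_h m_h^T,
    # where m_h is the slice mean of the centered kernel columns.
    Gamma = np.zeros((n, n))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Kc[:, idx].mean(axis=1)
        Gamma += (len(idx) / n) * np.outer(m, m)

    # Tikhonov-regularized generalized eigenproblem:
    #   Gamma a = lam * (Kc^2/n + s*Kc) a,  with s > 0 for numerical stability.
    B = Kc @ Kc / n + s * Kc + 1e-10 * np.eye(n)
    A = eigh(Gamma, B)[1][:, ::-1][:, :d]            # top-d dual coefficients

    def project(Xnew):                               # nonlinear e.d.r. features
        Knew = np.exp(-gamma * cdist(Xnew, X, "sqeuclidean"))
        Knew_c = (Knew - np.ones((len(Xnew), n)) @ K / n) @ H  # consistent centering
        return Knew_c @ A
    return project
```

Under these assumptions, the returned `project` plugs directly into a comparison harness like the one sketched in the Highlights, e.g. `reduce_dimension = lambda X, y, d: rksir(X, y, d=d)`.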

Regularized Kernel Sliced Inverse Regression
Application to Simulated and Real Data
Discussion
Proof of Theorem 3
Proof of Proposition 7
Proof of Consistency