In a moderate- or high-dimensional setting, the sliced inverse regression (SIR) method requires inverting the empirical covariance matrix, which causes numerical problems in estimating the central subspace. To improve SIR, several methods based on regularizing the covariance matrix or on principal component analysis (PCA) have been proposed. Yet most of them select the eigen-directions with the largest eigenvalues without clear justification. This article circumvents these difficulties by proposing a new regularization of SIR based on the singular value decomposition (SVD) of the data matrix of the predictors, called sliced inverse regression via natural canonical thresholding (SIR-NCT). SIR-NCT relates the vector of new canonical regression coefficients in the reduced dimension to the vector of initial regression coefficients in the original high dimension. Moreover, thresholding is used to discard the components associated with insignificant directions. Some theoretical results are presented for SIR-SVD. Experiments on simulated and real data show that SIR-NCT outperforms its competitors.