Abstract

Sufficient dimension reduction (SDR) using distance covariance (DCOV) was recently proposed as an approach to dimension-reduction problems. Compared with other SDR methods, it is model-free, requires no estimation of the link function, and imposes no particular distributional assumptions on the predictors. However, the DCOV-based SDR method involves optimizing a nonsmooth and nonconvex objective function over the Stiefel manifold. To tackle this numerical challenge, the original objective function is equivalently reformulated as a DC (Difference of Convex functions) program, and an iterative algorithm based on the majorization–minimization (MM) principle is constructed. At each step of the MM algorithm, one iteration of Riemannian Newton's method is taken to solve the quadratic subproblem on the Stiefel manifold inexactly. In addition, the algorithm extends readily to sufficient variable selection (SVS) using distance covariance. Finally, the convergence of the proposed algorithm is established under some regularity conditions. Simulations and real data analyses show that our algorithm drastically improves computational efficiency and is robust across various settings compared with the existing method. MATLAB code implementing our methods and scripts for reproducing the numerical results are available at https://github.com/runxiong-wu/MMRN.
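
The following is a minimal, illustrative sketch (not the authors' MATLAB implementation) of the MM/DC iteration structure described above: the concave part of the DC objective is linearized at the current iterate to form a convex surrogate, which is then decreased inexactly before re-linearizing. For simplicity, the inexact Riemannian Newton step of the paper is replaced here by a single projected-gradient step with a QR retraction onto the Stiefel manifold; the objective pair (g, h), the step size, and all other names are hypothetical placeholders.

import numpy as np

def qr_retraction(V):
    """Retract an arbitrary p x d matrix onto the Stiefel manifold via thin QR."""
    Q, R = np.linalg.qr(V)
    # Fix column signs so the retraction is well defined.
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def mm_dc_stiefel(grad_g, grad_h, V0, step=1e-2, n_iter=200):
    """MM/DC iteration: linearize the concave part -h at the current iterate,
    take one inexact descent step on the surrogate, and retract onto St(p, d)."""
    V = V0
    for _ in range(n_iter):
        # Euclidean gradient of the surrogate g(V) - <grad_h(V_k), V> at V = V_k.
        G = grad_g(V) - grad_h(V)
        # Project onto the tangent space of the Stiefel manifold at V.
        sym = (V.T @ G + G.T @ V) / 2
        riem_grad = G - V @ sym
        # One inexact descent step plus retraction (stand-in for Riemannian Newton).
        V = qr_retraction(V - step * riem_grad)
    return V

if __name__ == "__main__":
    # Toy usage with a made-up smooth DC pair g, h on St(10, 2).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((10, 10)); A = A @ A.T   # PSD matrix for g
    B = rng.standard_normal((10, 10)); B = B @ B.T   # PSD matrix for h
    grad_g = lambda V: 2 * A @ V                     # gradient of g(V) = tr(V' A V)
    grad_h = lambda V: 2 * B @ V                     # gradient of h(V) = tr(V' B V)
    V0 = qr_retraction(rng.standard_normal((10, 2)))
    V_hat = mm_dc_stiefel(grad_g, grad_h, V0)
    print(np.allclose(V_hat.T @ V_hat, np.eye(2), atol=1e-8))  # iterate stays on St(10, 2)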
