Abstract

Modal regression is an attractive alternative to mean regression and likelihood-based methods because of its robustness and high efficiency. In this paper, we propose a robust, communication-efficient distributed modal regression method for massive data distributed across multiple machines. Specifically, the global modal regression objective function is approximated by a surrogate objective at the first machine that depends on the local datasets only through their gradients. The resulting estimator is then computed at the first machine, while the other machines only need to evaluate gradients, which significantly reduces the communication cost. Under mild conditions, we establish asymptotic properties showing that the proposed estimator is statistically as efficient as the global modal regression estimator. Moreover, as a specific application, we develop a penalized communication-efficient distributed modal regression procedure for variable selection. Simulation studies and a real data analysis validate the proposed method.
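The surrogate-objective idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the Gaussian kernel, bandwidth, step sizes, least-squares warm start, number of rounds, and all variable names are illustrative assumptions. Each communication round, every machine sends only a p-dimensional gradient; the first machine then optimizes its local modal objective shifted by the global-minus-local gradient correction.

```python
import numpy as np

rng = np.random.default_rng(0)

def modal_grad(X, y, beta, h):
    """Gradient (up to a constant absorbed into the step size) of the local
    modal objective (1/n) * sum_i phi_h(y_i - x_i' beta), Gaussian kernel."""
    r = y - X @ beta
    w = np.exp(-r ** 2 / (2 * h ** 2))      # Gaussian kernel weights
    return (X * (w * r)[:, None]).mean(axis=0) / h ** 2

# Simulate i.i.d. data spread over m machines (true coefficients are assumed).
m, n, p, h = 5, 2000, 3, 1.0
beta_true = np.array([1.0, -2.0, 0.5])
Xs, ys = [], []
for _ in range(m):
    X = rng.normal(size=(n, p))
    Xs.append(X)
    ys.append(X @ beta_true + rng.normal(scale=0.5, size=n))

# Warm start: local least-squares fit on the first machine (an assumption).
beta = np.linalg.lstsq(Xs[0], ys[0], rcond=None)[0]

for _ in range(10):                          # outer communication rounds
    # One p-dimensional gradient per machine is all that is communicated.
    g_global = np.mean([modal_grad(X, y, beta, h)
                        for X, y in zip(Xs, ys)], axis=0)
    correction = g_global - modal_grad(Xs[0], ys[0], beta, h)
    # Ascend the surrogate on the first machine: its gradient is the local
    # gradient shifted by the fixed global-minus-local correction.
    b = beta.copy()
    for _ in range(25):
        b = b + 0.5 * (modal_grad(Xs[0], ys[0], b, h) + correction)
    beta = b

print(np.round(beta, 2))                     # should be close to beta_true
```

With symmetric noise the conditional mode coincides with the conditional mean, so the modal estimate should track `beta_true`; the surrogate correction is what lets the first machine mimic ascent on the global objective without ever receiving the other machines' raw data.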

