Abstract

Distributed estimation has recently attracted significant attention for its computational efficiency and its ability to preserve data privacy. In this article, we focus on quantile regression over a decentralized network. Without a coordinating central node, a decentralized network improves system stability and increases efficiency, since each node communicates with only a few neighbors per round. However, existing methods for decentralized quantile regression converge slowly, at a sub-linear rate. We propose a novel method for decentralized quantile regression built upon the smoothed quantile loss. We argue that the smoothed loss proposed in the existing literature, which uses a single smoothing bandwidth parameter, fails to achieve fast convergence and statistical efficiency simultaneously in the decentralized setting. Instead, we propose a novel quadratic approximation of the quantile loss that uses a large bandwidth for the Hessian and a small bandwidth for the gradient. Our method enjoys a linear convergence rate and attains optimal statistical efficiency. Numerical experiments and real data analysis demonstrate the effectiveness of our method.
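To make the two-bandwidth idea concrete, the following is a minimal single-machine sketch of one Newton-type update for convolution-smoothed quantile regression with a Gaussian kernel. It is an illustration under our own assumptions, not the paper's decentralized algorithm: the function name `smoothed_qr_step` and the specific bandwidth values are hypothetical, and the decentralized averaging across network nodes is omitted. The gradient of the smoothed check loss is evaluated with a small bandwidth `h_grad` (for low bias, hence statistical efficiency), while the kernel-smoothed Hessian uses a larger bandwidth `h_hess` (for a stable curvature estimate, hence fast convergence).

```python
import numpy as np
from scipy.stats import norm


def smoothed_qr_step(X, y, beta, tau, h_grad, h_hess):
    """One Newton-type step for convolution-smoothed quantile regression.

    Illustrative sketch only (not the paper's decentralized method):
    - gradient uses a small bandwidth h_grad,
    - Hessian uses a large bandwidth h_hess.
    """
    n = len(y)
    r = X @ beta - y  # residuals x_i' beta - y_i
    # Gradient of the smoothed check loss; norm.cdf is the Gaussian kernel CDF.
    grad = X.T @ (norm.cdf(r / h_grad) - tau) / n
    # Kernel-smoothed Hessian; norm.pdf is the Gaussian kernel density.
    w = norm.pdf(r / h_hess) / h_hess
    H = (X * w[:, None]).T @ X / n
    # Newton update: beta - H^{-1} grad.
    return beta - np.linalg.solve(H, grad)
```

In the decentralized setting, each node would compute such local gradients and Hessians and combine them with neighbors' iterates; here the step is shown on the full data only to exhibit the two-bandwidth quadratic approximation.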
