Distributed estimation has recently attracted significant attention owing to its advantages in computational efficiency and data privacy preservation. In this article, we focus on quantile regression over a decentralized network. Without a coordinating central node, a decentralized network improves system stability and gains efficiency by requiring communication with fewer nodes per round. However, existing work on decentralized quantile regression suffers from slow (sub-linear) convergence. We propose a novel method for decentralized quantile regression built upon the smoothed quantile loss, but we argue that the smoothed loss proposed in the existing literature, which uses a single smoothing bandwidth, cannot achieve fast convergence and full statistical efficiency simultaneously in the decentralized setting. We therefore propose a novel quadratic approximation of the quantile loss that uses a large bandwidth for the Hessian and a small bandwidth for the gradient. Our method enjoys a linear convergence rate and attains optimal statistical efficiency. Numerical experiments and real data analysis demonstrate the effectiveness of our method.
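For intuition, here is a minimal sketch of the two-bandwidth construction; the convolution-smoothing form and the symbols $K$, $h_g$, $h_H$, and $L_h$ below are illustrative assumptions, not notation taken from the paper. With check loss $\rho_\tau$ and a kernel $K$, a smoothed empirical loss at bandwidth $h$ can be written as
\[
\rho_\tau(u) = u\,\{\tau - \mathbf{1}(u < 0)\}, \qquad
L_h(\beta) = \frac{1}{n}\sum_{i=1}^{n} \int \rho_\tau\!\big(y_i - x_i^\top \beta - h v\big)\, K(v)\, dv .
\]
A two-bandwidth quadratic surrogate of the kind the abstract describes would then update, at iteration $t$,
\[
\beta^{(t+1)} = \arg\min_{\beta}\;
\nabla L_{h_g}\big(\beta^{(t)}\big)^{\top}\big(\beta - \beta^{(t)}\big)
+ \tfrac{1}{2}\big(\beta - \beta^{(t)}\big)^{\top} \nabla^2 L_{h_H}\big(\beta^{(t)}\big)\big(\beta - \beta^{(t)}\big),
\qquad h_H \gg h_g ,
\]
where the large bandwidth $h_H$ yields a stable, well-conditioned Hessian (supporting the linear convergence rate), while the small bandwidth $h_g$ keeps the gradient close to that of the unsmoothed quantile loss (preserving statistical efficiency).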