Abstract

The growing size of modern data poses challenges for statistical learning, and many distributed algorithms have been proposed in response. However, most of them require a homogeneity assumption: that the distribution of each machine's local data is the same as that of the global data. This assumption seldom holds in practice, and learning performance deteriorates severely when it is violated. Moreover, these algorithms are designed for independent observations and cannot accommodate serial correlation in the data. To address these issues, we propose a novel distributed statistical learning framework for nonlinear regression with autoregressive errors, which achieves communication-efficient distributed optimization and dispenses with the homogeneity assumption. Theoretical results guarantee that the distributed estimator is equivalent to its global counterpart, and numerical experiments further illustrate the good performance of the new method.
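For concreteness, the model class named in the abstract can be sketched as a nonlinear regression with AR(p) errors; the regression function f, the autoregressive order p, and the symbols below are illustrative assumptions rather than the paper's exact specification:

\[
y_t = f(x_t; \theta) + \varepsilon_t, \qquad
\varepsilon_t = \sum_{j=1}^{p} \phi_j \, \varepsilon_{t-j} + u_t, \qquad
u_t \overset{\text{i.i.d.}}{\sim} (0, \sigma^2),
\]

where \(\theta\) is the regression parameter and \(\phi_1, \dots, \phi_p\) are the autoregressive coefficients of the error process. Under this reading, the homogeneity issue arises when the distribution of \((x_t, y_t)\) differs across machines, and the serial correlation the abstract refers to enters through \(\varepsilon_t\).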
