Abstract

This study considers a constrained huge-scale optimization problem over networks, where the objective is to minimize the sum of nonsmooth local loss functions. Many algorithms for this problem employ (sub)gradient descent methods to handle high-dimensional data, but computing the entire (sub)gradient becomes a computational bottleneck. To reduce the computational burden on each agent and to preserve data privacy in time-varying networks, we propose a privacy-preserving decentralized randomized block-coordinate subgradient projection algorithm, in which a randomly chosen block of coordinates of the subgradient vector is used to update the optimization variable, and partially homomorphic cryptography is used to protect data privacy. We prove that the algorithm converges asymptotically. Moreover, convergence rates are established under appropriate step sizes: O(log K/K) under local strong convexity and O(log K/√K) under local convexity, where K denotes the number of iterations. We also show that the proposed algorithm protects data privacy. Experiments on two real-world datasets demonstrate the computational benefit of our algorithm and verify the theoretical results.
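To make the update concrete, the following is a minimal single-agent sketch of one randomized block-coordinate subgradient projection step, written in Python with NumPy. The hinge loss, the L2-ball constraint set, and all names here are illustrative assumptions, not the paper's exact setup; the consensus step over the time-varying network and the encryption layer of the proposed algorithm are omitted.

    import numpy as np

    def block_subgradient_step(x, A, b, step, block_size, radius, rng):
        """One randomized block-coordinate subgradient projection update.

        Illustrative setup: nonsmooth hinge loss sum_i max(0, 1 - b_i * <a_i, x>)
        minimized over the L2 ball ||x||_2 <= radius.
        """
        # Sample a random block of coordinates; only these entries are updated,
        # so the per-iteration subgradient cost scales with the block size.
        block = rng.choice(x.size, size=block_size, replace=False)
        active = b * (A @ x) < 1.0                 # samples with an active hinge
        # Subgradient of the hinge loss restricted to the sampled block:
        # -sum over active samples of b_i * a_i, taken only at the block entries.
        g_block = -(b[active][:, None] * A[active][:, block]).sum(axis=0)
        x = x.copy()
        x[block] -= step * g_block
        # Euclidean projection back onto the constraint set (the L2 ball).
        norm = np.linalg.norm(x)
        if norm > radius:
            x *= radius / norm
        return x

    # Toy usage: 50 features, 200 samples, 5 coordinates updated per iteration.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 50))
    b = np.sign(rng.standard_normal(200))
    x = np.zeros(50)
    for k in range(1, 1001):
        x = block_subgradient_step(x, A, b, step=0.5 / np.sqrt(k),
                                   block_size=5, radius=10.0, rng=rng)

Because only block_size coordinates of the subgradient are formed per iteration, the per-step cost is roughly block_size/d of a full subgradient update, which is the computational saving the abstract refers to.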

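The privacy mechanism rests on the additive homomorphism of the cryptosystem: an agent can aggregate its neighbors' encrypted states without ever decrypting them. Below is a minimal sketch using the python-paillier package (phe), assuming a Paillier scheme stands in for the paper's partially homomorphic cryptography; the weights and values are made up.

    # pip install phe  (python-paillier, an additively homomorphic Paillier scheme)
    from phe import paillier

    public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

    # Two neighbors encrypt their local states under the receiver's public key.
    enc_x1 = public_key.encrypt(0.37)
    enc_x2 = public_key.encrypt(-1.25)

    # Additive homomorphism: ciphertexts can be summed and scaled by plaintext
    # weights, so a weighted consensus average is computable on encrypted data.
    enc_avg = 0.5 * enc_x1 + 0.5 * enc_x2

    # Only the key holder recovers the aggregate; individual states stay hidden.
    print(private_key.decrypt(enc_avg))   # ≈ -0.44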