Abstract
In federated linear regression (FLR), the simplest form of federated learning, the computation at each participating company is slowed down either by large data volumes or by time-consuming homomorphic encryption. To accelerate FLR training, we propose a novel coded FLR framework that incorporates edge-computing-aided coded distributed computing (CDC) into the most intensive computation, matrix multiplication, so that several edge nodes assist the computing of one company. Two schemes are designed: linear combination (LC)-based vertical FLR and MatDot-based vertical FLR, both of which perform computation and homomorphic encryption in parallel at the edge nodes. Because the workload at each edge node is reduced substantially, the training runtime of both schemes can be shortened accordingly. Numerical studies show that the proposed coded schemes significantly outperform traditional uncoded schemes in overall training runtime (the sum of the encoding, computing, and decoding phases). Moreover, each of the two coded schemes has scenarios in which it is advantageous, consistent with the analysis.
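To illustrate the CDC primitive underlying the MatDot-based scheme, the following minimal NumPy sketch shows standard MatDot coded matrix multiplication: the input matrices are split into blocks, coded evaluations are handed to workers (here simulated in a loop, standing in for edge nodes), and the product is decoded from any 2m-1 worker results. The block count m, evaluation points, and worker simulation are illustrative assumptions, not the paper's actual parameters, and homomorphic encryption is omitted.

import numpy as np

def matdot_multiply(A, B, m=2, num_workers=None):
    """Compute A @ B from products of coded blocks; any 2m-1 worker
    results suffice to decode (the MatDot recovery threshold)."""
    k = A.shape[1]
    assert k % m == 0 and B.shape[0] == k
    # Split A column-wise and B row-wise so that A @ B = sum_i A_i @ B_i.
    A_blocks = np.split(A, m, axis=1)
    B_blocks = np.split(B, m, axis=0)

    recovery = 2 * m - 1                              # recovery threshold
    num_workers = num_workers or recovery
    xs = np.arange(1, num_workers + 1, dtype=float)   # distinct evaluation points

    # Encoding: p_A(x) = sum_i A_i x^i and p_B(x) = sum_j B_j x^(m-1-j);
    # the coefficient of x^(m-1) in p_A(x) p_B(x) equals A @ B.
    worker_results = []
    for x in xs:
        pA = sum(A_blocks[i] * x**i for i in range(m))
        pB = sum(B_blocks[j] * x**(m - 1 - j) for j in range(m))
        worker_results.append(pA @ pB)                # each worker multiplies small coded blocks

    # Decoding: interpolate the degree-(2m-2) matrix polynomial from the
    # first 2m-1 results and read off the coefficient of x^(m-1).
    V = np.vander(xs[:recovery], N=recovery, increasing=True)
    coeffs = np.linalg.solve(V, np.stack([r.ravel() for r in worker_results[:recovery]]))
    return coeffs[m - 1].reshape(A.shape[0], B.shape[1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, B = rng.standard_normal((4, 6)), rng.standard_normal((6, 5))
    assert np.allclose(matdot_multiply(A, B, m=2), A @ B)

Each simulated worker handles only a 1/m-sized slice of the multiplication, which is the source of the per-edge-node workload reduction claimed in the abstract.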