Abstract

Federated Learning (FL) has become a prominent research topic in recent years because of its advantages in privacy protection. With FL, user data no longer needs to be transferred to a data center; instead, models are trained on edge devices and only the parameters of the trained model are sent to the server. FL also leverages the computing power of each device while protecting data from being compromised. On the other hand, in the mobile edge computing scenario data arrives continuously: smartphones, wearables, and other IoT devices cannot collect all of their data at the outset, so a device would have to waste resources and time until all data is collected before training its local model. Furthermore, in some scenarios the model needs to be deployed within a short period of time. Thus, incremental learning (IL) is needed, which allows continuous training on continuously collected data. In this paper, we first design a new birth-epoch computing scheme and a new batch-oriented aggregation method. Then, around these, we reformulate the Federated Incremental Learning (FIL) framework, which combines FL and IL in the IoT scenario. In addition, we propose an algorithm called Latest-Compensation Federated Incremental Learning (LCFIL) that emphasizes new arrivals by adjusting the loss of each batch of data. A carefully designed loss function improves convergence speed and learning performance. Extensive experiments show that LCFIL is effective across different data, model, and data-distribution settings.

Index Terms-Federated Learning, Internet of Things, Incremental Learning
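
The abstract does not spell out the compensation term, but the general idea of emphasizing new arrivals by reweighting the per-batch loss on a client can be sketched as follows. This is a minimal illustration under assumed names and weights (latest_compensated_loss, latest_weight are hypothetical), not the paper's actual LCFIL loss.

import torch
import torch.nn.functional as F

def latest_compensated_loss(model, batches, latest_weight=2.0):
    """Weighted local loss over a client's arrived data batches.

    batches: list of (inputs, labels) tensor pairs ordered by arrival time.
    The most recently arrived batch receives a larger weight so that new
    arrivals contribute more to the local update, while earlier batches
    keep a unit weight. The specific weighting here is a placeholder, not
    the paper's actual compensation term.
    """
    total, weight_sum = 0.0, 0.0
    for i, (x, y) in enumerate(batches):
        w = latest_weight if i == len(batches) - 1 else 1.0
        total = total + w * F.cross_entropy(model(x), y)
        weight_sum += w
    return total / weight_sum

Under this reading, a client would minimize such a weighted loss on its locally buffered batches each round before sending the updated parameters to the server for the batch-oriented aggregation.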
