Abstract

Mobile edge computing is an emerging research area that aims to push computation from the cloud to edge devices. Most current machine learning (ML) algorithms, such as federated learning, are designed for homogeneous mobile networks in which all devices collect the same type of data. In this paper, we address the distributed training of ML algorithms in heterogeneous mobile networks, where the features, rather than the samples, are distributed across multiple heterogeneous mobile devices. Training ML models in heterogeneous mobile networks incurs a large communication cost because local data must be delivered to a central server. Inspired by active learning, which is traditionally used to reduce the labeling cost of training ML models, we propose an active sampling method that reduces the communication cost of learning in heterogeneous mobile networks. Instead of sending all the local data, the proposed method identifies and sends only informative data from each device to the central server. Extensive experiments on four real datasets, both in numerical simulation and on a networked mobile system, show that the proposed method can reduce communication cost by up to 53% and energy consumption by up to 67% without accuracy degradation compared with conventional approaches.
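To make the idea concrete, the following is a minimal, illustrative sketch (not the paper's exact algorithm) of entropy-based active sampling: each device scores its local samples by an informativeness proxy, here the predictive entropy of a locally available probabilistic model, and uploads only the top-k most uncertain ones instead of its full dataset. The function name and the entropy criterion are assumptions for illustration.

```python
import numpy as np

def select_informative(probs: np.ndarray, k: int) -> np.ndarray:
    """Return indices of the k samples with the highest predictive entropy.

    probs: (n_samples, n_classes) class-probability estimates from a
    local model; high entropy is used here as a proxy for informativeness.
    """
    eps = 1e-12  # avoid log(0)
    entropy = -np.sum(probs * np.log(probs + eps), axis=1)
    # Sort by entropy, descending, and keep the k most uncertain samples.
    return np.argsort(entropy)[::-1][:k]

# Toy usage: 4 samples, 2 classes; only the 2 most uncertain are uploaded.
probs = np.array([[0.90, 0.10],
                  [0.50, 0.50],
                  [0.60, 0.40],
                  [0.99, 0.01]])
print(select_informative(probs, 2))  # → [1 2]
```

With this selection, a device uploads 2 of its 4 samples, halving its upload in this toy case; the paper's reported savings (up to 53% communication, 67% energy) come from its own selection criterion and system design.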
