Abstract

With the rapid development of Internet-of-Things (IoT) devices and mobile communication technologies, Multi-access Edge Computing (MEC) has emerged as a promising paradigm that extends cloud computing and storage capabilities to the edge of cellular networks, close to IoT devices. MEC enables IoT devices with limited battery capacity and constrained computation/storage capabilities to execute their computation-intensive and latency-sensitive applications at the network edge. However, to execute these applications efficiently in MEC systems, each task must be properly offloaded and scheduled onto the MEC servers. In addition, MEC servers must intelligently balance and share their computing resources to satisfy application quality-of-service (QoS) and quality-of-experience (QoE) requirements. Effective resource allocation (RA) mechanisms are therefore vital for realizing the anticipated benefits of MEC. Recently, Machine Learning (ML) and Deep Learning (DL) have emerged as key methods for tackling many challenging aspects of MEC, and they play a particularly important role in addressing the challenges of RA. This paper presents a comprehensive survey of ML/DL-based RA mechanisms in MEC. We first present tutorials that demonstrate the advantages of applying ML and DL in MEC. Then, we present enabling technologies for efficiently running ML/DL training and inference in MEC. Afterward, we provide an in-depth survey of recent works that apply ML/DL methods to RA in MEC from three aspects: (1) ML/DL-based methods for task offloading; (2) ML/DL-based methods for task scheduling; and (3) ML/DL-based methods for joint resource allocation. Finally, we discuss key challenges and future research directions for applying ML/DL to resource allocation in MEC networks.
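
To make the task-offloading aspect concrete, the sketch below (not taken from any surveyed work) shows how a simple tabular Q-learning agent could learn a binary offloading policy. The state space, latency model, and all parameter values are illustrative assumptions, not a definitive implementation of any method discussed in the survey.

```python
# Minimal sketch: tabular Q-learning for binary task offloading.
# States, rewards, and the latency model are illustrative assumptions only.
import random
import numpy as np

N_CHANNEL_LEVELS = 4      # discretized wireless channel quality (assumed)
N_LOAD_LEVELS = 4         # discretized MEC server load (assumed)
ACTIONS = [0, 1]          # 0 = execute locally, 1 = offload to the MEC server

Q = np.zeros((N_CHANNEL_LEVELS, N_LOAD_LEVELS, len(ACTIONS)))
alpha, gamma, epsilon = 0.1, 0.9, 0.1

def latency(state, action):
    """Toy latency model: local execution has a fixed delay; offloading
    delay grows as channel quality degrades and server load increases."""
    channel, load = state
    if action == 0:
        return 10.0                                   # local execution delay
    return 2.0 + 3.0 * (N_CHANNEL_LEVELS - 1 - channel) + 2.0 * load

def step(state, action):
    """Reward is negative latency; the next state is drawn at random for brevity."""
    reward = -latency(state, action)
    next_state = (random.randrange(N_CHANNEL_LEVELS),
                  random.randrange(N_LOAD_LEVELS))
    return reward, next_state

state = (random.randrange(N_CHANNEL_LEVELS), random.randrange(N_LOAD_LEVELS))
for _ in range(20000):
    # epsilon-greedy action selection
    if random.random() < epsilon:
        action = random.choice(ACTIONS)
    else:
        action = int(np.argmax(Q[state[0], state[1]]))
    reward, next_state = step(state, action)
    # standard Q-learning update
    best_next = np.max(Q[next_state[0], next_state[1]])
    Q[state[0], state[1], action] += alpha * (
        reward + gamma * best_next - Q[state[0], state[1], action])
    state = next_state

# The learned greedy policy offloads when the channel is good and the
# server is lightly loaded, and executes locally otherwise.
print(np.argmax(Q, axis=-1))
```

Deep RL approaches covered in the survey follow the same decision structure but replace the Q-table with a neural network to handle continuous or high-dimensional state spaces.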
