Abstract

Wireless backhaul is considered a readily deployable and cost-effective solution for small cell-assisted communications in mobile Internet of Everything (IoE) networks. In this system, the wireless backhaul between a central gNodeB (gNB) and small cell base stations (SBSs) shares the frequency spectrum with the wireless fronthaul between the SBSs and user devices. Under these conditions, efficient edge computing deployment requires joint optimization of wireless resource harmonization and partial offloading scheduling to accommodate heterogeneous IoE services. Therefore, in this study, we formulate the system model as a Markov decision process (MDP) to minimize the weighted sum of computation overheads, in terms of the latency and energy costs of all user equipment (UE), with network dynamics captured in the system state. We then propose an actor–critic reinforcement learning algorithm, namely a deep deterministic policy gradient (DDPG)-based algorithm with a replay memory, to realize an adaptive offloading decision scheme and optimal wireless resource harmonization between the backhaul and fronthaul. Extensive simulation results reveal that the proposed algorithm reliably converges and achieves approximately 70%, 55%, 36%, and 11% lower total computation overhead than the UE-execution, random, MEC-execution, and DQN-based schemes, respectively.
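To make the learning loop concrete, the following is a minimal sketch of a DDPG agent with a replay memory of the kind the abstract describes. The state and action dimensions, network widths, and reward signal are illustrative assumptions, not the paper's specification; in this setting the reward would plausibly be the negative weighted overhead (e.g., a per-UE combination of latency and energy cost), and the sigmoid-bounded actions would stand in for offloading ratios and the backhaul/fronthaul spectrum share.

```python
# A minimal DDPG-with-replay-memory sketch (PyTorch). All dimensions,
# hyperparameters, and the reward are illustrative assumptions; the
# paper's actual MDP (channel states, offloading ratios, spectrum
# harmonization variables) is not reproduced here.
import random
from collections import deque

import torch
import torch.nn as nn

STATE_DIM, ACTION_DIM = 8, 3      # assumed sizes, for illustration only
GAMMA, TAU, BATCH = 0.99, 0.005, 64

def mlp(inp, out, act=nn.Identity):
    return nn.Sequential(nn.Linear(inp, 128), nn.ReLU(),
                         nn.Linear(128, out), act())

# Actor outputs actions in [0, 1] (e.g., offloading ratio, spectrum share).
actor = mlp(STATE_DIM, ACTION_DIM, nn.Sigmoid)
critic = mlp(STATE_DIM + ACTION_DIM, 1)
actor_t = mlp(STATE_DIM, ACTION_DIM, nn.Sigmoid)   # target networks
critic_t = mlp(STATE_DIM + ACTION_DIM, 1)
actor_t.load_state_dict(actor.state_dict())
critic_t.load_state_dict(critic.state_dict())
opt_a = torch.optim.Adam(actor.parameters(), lr=1e-4)
opt_c = torch.optim.Adam(critic.parameters(), lr=1e-3)
memory = deque(maxlen=100_000)                     # replay memory

def train_step():
    if len(memory) < BATCH:
        return
    s, a, r, s2 = map(torch.stack, zip(*random.sample(memory, BATCH)))
    with torch.no_grad():                          # bootstrapped target Q
        q_next = critic_t(torch.cat([s2, actor_t(s2)], dim=1))
        y = r.unsqueeze(1) + GAMMA * q_next
    q = critic(torch.cat([s, a], dim=1))
    loss_c = nn.functional.mse_loss(q, y)          # critic: TD regression
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()
    # Actor: deterministic policy gradient, ascend the critic's Q-value.
    loss_a = -critic(torch.cat([s, actor(s)], dim=1)).mean()
    opt_a.zero_grad(); loss_a.backward(); opt_a.step()
    for net, tgt in ((actor, actor_t), (critic, critic_t)):
        for p, pt in zip(net.parameters(), tgt.parameters()):
            pt.data.mul_(1 - TAU).add_(TAU * p.data)  # Polyak soft update

# Usage: store a (state, action, reward, next_state) transition, where the
# reward is the negative weighted latency/energy overhead, then train.
s = torch.randn(STATE_DIM)
a = actor(s).detach()
memory.append((s, a, torch.tensor(-1.2), torch.randn(STATE_DIM)))
train_step()
```

The replay memory breaks the temporal correlation of consecutive network states by sampling minibatches uniformly, and the slowly tracking target networks stabilize the bootstrapped Q-targets; both are standard ingredients of DDPG rather than specifics of this paper.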
