Abstract

Decentralized computation offloading and caching in Multi-Access Edge Computing (MEC) is a promising approach for evolving the forthcoming network generation. MEC is an emerging technology that provides adaptive micro-cloud services at the network edge to resource-constrained smart communication and Internet of Everything (IoE) devices of cellular subscribers. Massive numbers of IoE devices are now connecting to the global ecosystem; as a result, backhaul network traffic grows enormously, and guaranteeing users' ultra-reliable low-latency communication becomes challenging. In this paper, we explore a decentralized, adaptive, resource-aware communication, computing, and caching framework that orchestrates dynamic network environments using Deep Reinforcement Learning (DRL). The framework provides augmented decision-making capabilities to meet users' connectivity and resource-utilization requirements. Since every IoE device user attempts to maximize its own utility, we formulate the problem as a non-cooperative game, which is NP-hard to solve given the structural properties of MEC networks. We analyze the game and show that it admits a Nash equilibrium. Moreover, we introduce a decentralized cognitive scheduling algorithm that exploits DRL to improve the utility of IoE and smart communication devices. Numerical results and theoretical analysis show that the proposed algorithm outperforms the baseline schemes in reliability, latency, and scalability.
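To make the abstract's non-cooperative formulation concrete, the following is a minimal toy sketch of a decentralized offloading game solved by best-response dynamics, whose fixed point is a pure Nash equilibrium. All cost functions, parameter values, and function names here are illustrative assumptions, not the paper's actual model.

```python
# Toy non-cooperative offloading game (illustrative only; costs and
# parameters are assumptions, not taken from the paper's model).
# Each device picks 0 (compute locally) or 1 (offload to the MEC server).

def cost(i, actions, local_cost, offload_base, congestion):
    """Cost for device i: a fixed local cost, or an offloading cost that
    grows with the number of devices sharing the edge server."""
    if actions[i] == 0:                       # compute locally
        return local_cost[i]
    n_off = sum(actions)                      # devices currently offloading
    return offload_base + congestion * n_off

def best_response_dynamics(n, local_cost, offload_base, congestion, rounds=50):
    """Let devices take turns switching to their best response.
    A round with no change is a fixed point, i.e. a pure Nash equilibrium:
    no device can lower its cost by deviating unilaterally."""
    actions = [0] * n
    for _ in range(rounds):
        changed = False
        for i in range(n):
            for a in (0, 1):
                trial = actions.copy()
                trial[i] = a
                if (cost(i, trial, local_cost, offload_base, congestion)
                        < cost(i, actions, local_cost, offload_base, congestion)):
                    actions = trial
                    changed = True
        if not changed:
            break
    return actions

# Two devices with expensive local computation offload; two cheap ones stay
# local, because congestion makes further offloading unattractive.
eq = best_response_dynamics(4, [5.0, 5.0, 1.0, 1.0], 1.0, 1.0)  # → [1, 1, 0, 0]
```

In the decentralized setting the abstract describes, each device would run such a best-response (or DRL-learned) policy locally rather than relying on a central scheduler.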

