Abstract

In the era of sixth‐generation mobile networks (6G), industrial big data is generated rapidly by the growing number of data‐driven applications in the Industrial Internet of Things (IIoT). Effectively processing such data, for example through knowledge learning, on resource‐limited IIoT devices is a challenge. To this end, we introduce a cloud‐edge‐end collaboration architecture in which computing, communication, and storage resources are flexibly coordinated to alleviate resource constraints. To achieve a hyper‐connected experience, real‐time communication, and sustainable computing, we construct a novel architecture that combines digital twin (DT)‐enabled IIoT with edge networks. Considering the energy‐consumption and delay issues in distributed learning, we further propose a deep reinforcement learning method, deep deterministic policy gradient with double actors and double critics (D4PG), to manage multi‐dimensional resources, that is, CPU cycles, DT models, and communication bandwidth; the double actors enhance the exploration ability of agents, while the double critics mitigate inaccurate value estimation in continuous action spaces. In addition, we introduce a synchronization threshold in the distributed learning framework to avoid the synchronization latency caused by stragglers. Extensive experimental results show that the proposed architecture conducts knowledge learning efficiently and that the intelligent scheme improves system efficiency by managing multi‐dimensional resources.
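The double-actor/double-critic idea can be illustrated with a minimal sketch. The abstract does not specify the network architecture or update rules, so the following uses hypothetical linear stand-ins for the actor and critic networks; the selection rule shown (each actor proposes an action, the pessimistic min-over-critics Q-value picks the better proposal) is one common way such a design both widens exploration and curbs value overestimation, not necessarily the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions and linear stand-ins for neural networks.
STATE_DIM, ACTION_DIM = 4, 2
W_actor1 = rng.normal(size=(ACTION_DIM, STATE_DIM))
W_actor2 = rng.normal(size=(ACTION_DIM, STATE_DIM))
W_critic1 = rng.normal(size=(STATE_DIM + ACTION_DIM,))
W_critic2 = rng.normal(size=(STATE_DIM + ACTION_DIM,))

def actor(W, state):
    # Bounded continuous action, as in DDPG-style policies.
    return np.tanh(W @ state)

def critic(W, state, action):
    # Scalar Q-value estimate for a state-action pair.
    return float(W @ np.concatenate([state, action]))

def select_action(state):
    """Double-actor selection: each actor proposes an action; the
    pessimistic (min over the two critics) Q-value chooses between
    the two proposals."""
    candidates = [actor(W_actor1, state), actor(W_actor2, state)]
    scores = [min(critic(W_critic1, state, a),
                  critic(W_critic2, state, a))
              for a in candidates]
    return candidates[int(np.argmax(scores))]

state = rng.normal(size=STATE_DIM)
action = select_action(state)
```

Taking the minimum over critics follows the pessimism principle popularized by double-critic methods: a single critic's overestimate cannot dominate action selection, which matters in continuous action spaces where errors are easily exploited.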
