Abstract

Edge–cloud collaboration is critical in the Industrial Internet of Things (IIoT) for serving computation-intensive tasks (e.g., bearing fault monitoring) that require low response delay, low energy consumption, and high processing accuracy. In this article, an energy-efficient resource management framework for IIoT with closed-loop control over end devices, edge servers, and the cloud center is studied. In the considered model, each edge server aggregates the data collected by industrial sensors (i.e., end devices) and forms computation tasks for the corresponding data analysis. In order to minimize the system-wide energy consumption, while maintaining a guaranteed service delay and a satisfactory data processing accuracy for each IIoT application, a joint optimization of: 1) sensors’ sampling rate adaptation; 2) edge servers’ preprocessing mode selection; and 3) edge–cloud communication and computing resource allocation is formulated. Further taking into account the time-varying channel conditions and the randomness of data arrivals, we propose a low-complexity online algorithm, which solves the problem in a dynamic manner. In particular, the Lyapunov optimization method is first utilized to decompose the long-term problem into a series of instant ones [mixed-integer nonlinear programming (MINLP) problems], and then a Markov approximation algorithm is applied to solve these instant problems to near optimum while accounting for future impacts. Performance analyses and simulation results show that the proposed algorithm is feasible under long-term service satisfaction constraints, and its energy consumption and service delay are approximately 20% and 28% lower than those of the benchmark schemes, respectively.
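The Lyapunov decomposition mentioned above can be illustrated with a minimal sketch: a virtual queue tracks accumulated violation of a long-term delay budget, and each time slot greedily minimizes a drift-plus-penalty objective (V times energy plus queue-weighted delay). All names, the per-slot action set, and the parameter values (`D_MAX`, `V`) are hypothetical placeholders, not the paper's actual model or MINLP formulation.

```python
import random

random.seed(0)

D_MAX = 1.0   # hypothetical long-term per-slot delay budget
V = 10.0      # Lyapunov trade-off parameter (larger V weights energy more)
T = 1000      # number of time slots in the simulation

def per_slot_choice(Q, actions):
    """Drift-plus-penalty rule for one slot: among candidate actions,
    pick the (energy, delay) pair minimizing V*energy + Q*delay."""
    return min(actions, key=lambda a: V * a[0] + Q * a[1])

Q = 0.0            # virtual queue: accumulated delay-budget violation
total_energy = 0.0
for t in range(T):
    # stand-in for the per-slot decision space (sampling rate,
    # preprocessing mode, resource allocation): random trade-off points
    actions = [(random.uniform(0.5, 2.0), random.uniform(0.2, 2.0))
               for _ in range(8)]
    energy, delay = per_slot_choice(Q, actions)
    total_energy += energy
    # queue update: grows on delay overshoot, drains when under budget
    Q = max(Q + delay - D_MAX, 0.0)

print("avg energy per slot:", round(total_energy / T, 3))
print("final virtual queue:", round(Q, 3))
```

A small final queue relative to T indicates the long-term delay constraint is met on average; sweeping `V` trades energy savings against constraint slack, mirroring the energy–delay trade-off the abstract describes.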
