In meteorological and electric-power Internet of Things scenarios, intelligent inspection swarms are deployed to assist with monitoring tasks, extending the service life of the relevant facilities and reducing emergency-repair costs. The swarm collects and processes scene data through a variety of sensors and cameras and completes tasks such as emergency handling and fault inspection. However, the limited computing resources and battery life of inspection devices lead to slow emergency response and long fault-location times. Mobile Edge Computing is a promising technology that can improve the swarm's quality of service by offloading the computing tasks of inspection devices to edge servers at the network edge. In this paper, we study the multi-device, multi-task, multi-server computation offloading problem for an intelligent inspection swarm under a dynamic network environment and limited server and device resources. We propose an effective adaptive learning offloading strategy based on distributed reinforcement learning and multi-classification to reduce the task-processing delay and energy consumption of the swarm and improve its quality of service. Numerical experimental results demonstrate that the proposed strategy outperforms other offloading strategies in terms of delay, energy consumption, and quality of service.
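The delay/energy trade-off that motivates offloading can be illustrated with a minimal sketch. This is not the paper's distributed reinforcement-learning strategy; it is a hypothetical single-device, single-server cost comparison, with all function names, parameters, and values invented for illustration.

```python
# Illustrative sketch (not the paper's algorithm): a device decides whether to
# offload a task by comparing a weighted delay/energy cost for local execution
# versus edge execution. All parameter names and values are hypothetical.

def local_cost(cycles, f_local, power_local, w_delay=0.5, w_energy=0.5):
    """Cost of executing the task on the inspection device itself."""
    delay = cycles / f_local              # seconds of local computation
    energy = power_local * delay          # joules spent computing
    return w_delay * delay + w_energy * energy

def edge_cost(cycles, data_bits, rate, f_edge, power_tx,
              w_delay=0.5, w_energy=0.5):
    """Cost of uploading the task and executing it on an edge server."""
    t_up = data_bits / rate               # transmission delay
    t_exec = cycles / f_edge              # execution delay on the edge server
    energy = power_tx * t_up              # device spends energy only on transmission
    return w_delay * (t_up + t_exec) + w_energy * energy

def decide(cycles, data_bits, rate, f_local, f_edge, power_local, power_tx):
    """Pick the cheaper option; returns the choice and both costs."""
    c_local = local_cost(cycles, f_local, power_local)
    c_edge = edge_cost(cycles, data_bits, rate, f_edge, power_tx)
    return ("edge" if c_edge < c_local else "local", c_local, c_edge)

# Example: a 1e9-cycle task with 2 Mb of input data; the edge server is
# 10x faster, so offloading wins despite the upload delay.
choice, c_l, c_e = decide(cycles=1e9, data_bits=2e6, rate=1e7,
                          f_local=1e9, f_edge=1e10,
                          power_local=2.0, power_tx=0.5)
print(choice, round(c_l, 3), round(c_e, 3))  # → edge 1.5 0.2
```

The proposed strategy generalizes this idea to many devices, tasks, and servers, learning the decision policy under a dynamic network rather than computing costs from fixed parameters.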