Abstract
In this paper, we propose a novel multitask scheduling and distributed collaborative computing method for quality of service (QoS) guaranteed delay-sensitive services in the Internet of Things (IoT). First, we propose a multilevel scheduling framework combining process and thread scheduling to reduce the processing delay of multitype services on a single edge node in IoT, where a preemptive static priority process scheduling algorithm is adopted for different types of services and a dynamic priority-based thread scheduling algorithm is proposed for the same type of services with high concurrency. Furthermore, to reduce the processing delay of computation-intensive services, we propose a distributed task offloading algorithm based on a multiple 0-1 knapsack model with value limitation, in which multiple edge nodes collaborate to minimize the processing delay. Simulation results show that the proposed method significantly reduces not only the scheduling delay of a large number of time-sensitive services on a single edge node but also the processing delay of computation-intensive services handled collaboratively by multiple edge nodes.
Highlights
With the rapid development of Internet of Things (IoT) applications, the number of devices connected to the network has increased dramatically and data volumes have grown explosively, placing ever higher computing and storage demands on centralized task processing approaches such as cloud computing. Edge computing, by contrast, is a distributed architecture with decentralized processing and storage capacity: the central node and the edge nodes collaborate to complete the computing process, dispersing the storage and computation tasks of the central node to the edge of the network, making full use of device resources and relieving the computing and storage pressure on the central node.
Edge computing is becoming increasingly popular for IoT applications such as the Internet of Vehicles (IoV), where many edge nodes (ENs) for data processing are deployed on both sides of the highway to minimize the data processing delay.
This paper first designs a multilevel scheduling framework that combines process scheduling and thread scheduling, and proposes a preemptive static priority process scheduling algorithm for different types of services and a dynamic priority-based thread scheduling algorithm for the same type of service.
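To make this two-level decision concrete, the Python sketch below illustrates how such a framework might pick the next task: service types carry fixed (static) priorities and a higher-priority type preempts lower ones, while tasks of the same type are ordered by a dynamic urgency factor. The service types, priority values, and the exact urgency formula (remaining computation divided by the time left before the deadline) are assumptions made only for illustration and are not taken from the paper.

```python
import time
from dataclasses import dataclass

# Hypothetical static priorities per service type (lower value = higher priority);
# the actual service types and priority values are not specified here.
STATIC_PRIORITY = {"control": 0, "video": 1, "sensing": 2}

@dataclass
class Task:
    service_type: str
    remaining_work: float  # remaining computation, e.g. CPU cycles still needed
    deadline: float        # absolute deadline in seconds (time.time() scale)

    def urgency(self, now):
        # Assumed urgency factor: remaining computation per unit of time left
        # before the deadline; a larger value means the task is more urgent.
        slack = max(self.deadline - now, 1e-6)
        return self.remaining_work / slack

def pick_next(ready, now=None):
    # Level 1 (process level): choose the service type with the highest static
    # priority; a newly arrived task of a higher-priority type preempts others.
    # Level 2 (thread level): among tasks of that type, run the most urgent one.
    now = time.time() if now is None else now
    top_type = min({t.service_type for t in ready}, key=STATIC_PRIORITY.__getitem__)
    candidates = [t for t in ready if t.service_type == top_type]
    return max(candidates, key=lambda t: t.urgency(now))

# Example: the "control" task is chosen over the "video" tasks because its
# service type has the highest static priority.
now = time.time()
ready = [Task("video", remaining_work=5.0, deadline=now + 2.0),
         Task("control", remaining_work=1.0, deadline=now + 1.0),
         Task("video", remaining_work=3.0, deadline=now + 0.5)]
print(pick_next(ready, now).service_type)  # -> control
```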
Summary
With the rapid development of IoT applications, the number of devices connected to the network has increased dramatically, data volumes have grown explosively, and centralized task processing approaches such as cloud computing face ever higher computing and storage requirements. Edge computing is a distributed architecture with decentralized processing and storage capacity, in which the central node and the edge nodes collaborate to complete the computing process, dispersing the storage and computation tasks of the central node to the edge of the network, making full use of device resources and relieving the computing and storage pressure on the central node. (1) A multilevel scheduling framework combining process scheduling and thread scheduling is proposed to reduce the processing delay of multitype services on a single edge node. The process scheduling uses a preemptive static priority scheduling algorithm for different kinds of services, and the thread scheduling is based on dynamic priority for the same kind of service with high concurrency. This algorithm is based on the task execution urgency factor, which is determined by the remaining computation within the task deadline. (2) A distributed collaborative computing method is proposed in the multiple edge node (MEN) scenario to address the problem of reducing the processing delay of computation-intensive tasks in IoT. This method first designs a distributed collaborative computing framework based on an asynchronous message queue, which enables multiple nodes to serve the same task collaboratively by splitting it. A task offloading decision algorithm based on a multiple 0-1 knapsack model with value limitation is then designed to find the optimal offloading scheme that minimizes the task processing delay.
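As a rough illustration of the offloading decision, the sketch below treats the collaborating edge nodes as knapsacks with a limited acceptable workload and the subtasks of a split task as items, assigning each subtask to the node that yields the smallest finish time. The node parameters, the greedy heuristic, and the reading of the "value limitation" as a per-node workload cap are assumptions for illustration only; the paper formulates and solves the decision as a multiple 0-1 knapsack problem rather than with this heuristic.

```python
from dataclasses import dataclass, field

@dataclass
class EdgeNode:
    name: str
    cpu_rate: float      # computation capacity (workload units per second)
    max_load: float      # workload limit the node may accept (assumed "value limitation")
    assigned: float = 0.0
    subtasks: list = field(default_factory=list)

def offload(subtask_loads, nodes):
    # Greedy sketch: place each subtask (largest first) on the feasible node whose
    # finish time after accepting it would be smallest, keeping the overall
    # completion time, and hence the task processing delay, low.
    for load in sorted(subtask_loads, reverse=True):
        feasible = [n for n in nodes if n.assigned + load <= n.max_load]
        if not feasible:
            raise ValueError("no edge node can accept this subtask within its limit")
        best = min(feasible, key=lambda n: (n.assigned + load) / n.cpu_rate)
        best.assigned += load
        best.subtasks.append(load)
    return {n.name: n.subtasks for n in nodes}

# Example: a 12-unit computation-intensive task split into six subtasks,
# offloaded across three collaborating edge nodes.
nodes = [EdgeNode("en1", cpu_rate=4.0, max_load=6.0),
         EdgeNode("en2", cpu_rate=2.0, max_load=6.0),
         EdgeNode("en3", cpu_rate=3.0, max_load=6.0)]
print(offload([3.0, 3.0, 2.0, 2.0, 1.0, 1.0], nodes))
```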