The computational power of Internet of Things (IoT) devices is typically low, making it necessary to process data and extract relevant information on devices with higher processing capacity. Edge computing emerged as a complementary solution to cloud computing, providing devices at the network edge with computational resources to handle the data processing and analysis that constrained IoT devices may be unable to perform. This solution allows data to be processed closer to the IoT devices, reducing latency for IoT applications. However, edge nodes have less computational power than cloud nodes, which makes resource allocation and the processing of massive volumes of requests challenging. This study proposes an edge resource allocation mechanism based on task priority and machine learning. The proposed approach allocates resources to IoT requests according to their task priorities while monitoring the resource consumption of edge nodes. The study evaluates the performance of different classification algorithms using well-known classification metrics; the most efficient classifier achieved an accuracy of 92% and a precision of 90%, indicating good performance when this classifier is used in the proposed approach. Compared with an allocation method based only on distance, the proposed mechanism managed resources more efficiently, with significantly lower resource utilization. The study tested scenarios varying the number of requests and edge nodes, and evaluated a proposed failure-handling mechanism that reschedules the tasks of failed nodes to functional nodes; this failure control mechanism is a significant contribution of the proposal. Therefore, the proposed method can become a valuable tool for efficient resource allocation and management with reduced computational cost.