Abstract

IoT-Edge-Fog computing presents a three-tier model for decentralized, time-sensitive computing. However, as the need for real-time information processing and decision modeling grows, task allocation among dispersed Edge computing nodes remains a major challenge. State-of-the-art task allocation techniques such as Min-Max, Minimum Completion Time, and Round Robin perform the allocation, but several limitations persist, including high energy consumption, delay, and error rate. Hence, the current work provides a Quantum Computing-inspired optimization technique for efficient task allocation in an Edge computing environment for real-time IoT applications. Furthermore, a QC-Neural Network model is employed to predict the optimal computing nodes for delivering real-time services. To assess the performance enhancement, simulations were performed with 6, 10, 14, and 20 Edge nodes at different times to schedule more than 600 heterogeneous tasks. Empirical results show an average improvement of 5.02% in prediction efficiency and an error reduction of 2.03% compared with state-of-the-art techniques.
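
For context, the sketch below illustrates the Minimum Completion Time (MCT) baseline named in the abstract, not the proposed QC-inspired allocator. The task sizes, node speeds, and the greedy earliest-finish assignment rule are illustrative assumptions.

# Minimal sketch of an MCT-style task allocator across heterogeneous
# edge nodes. Task sizes and node speeds are hypothetical values chosen
# only to demonstrate the greedy assignment rule.

def mct_allocate(task_sizes, node_speeds):
    """Assign each task to the node that would complete it earliest."""
    ready = [0.0] * len(node_speeds)  # ready[i] = time node i becomes free
    assignment = []
    for size in task_sizes:
        # completion time on node i = current backlog + execution time
        completions = [ready[i] + size / node_speeds[i]
                       for i in range(len(node_speeds))]
        best = min(range(len(node_speeds)), key=lambda i: completions[i])
        ready[best] = completions[best]
        assignment.append(best)
    return assignment, max(ready)  # per-task node index, overall makespan

# Example: 6 heterogeneous tasks over 3 edge nodes of differing speed
tasks = [4.0, 2.5, 7.0, 1.0, 3.5, 5.0]   # hypothetical task sizes
speeds = [1.0, 1.5, 0.8]                  # hypothetical node speeds
print(mct_allocate(tasks, speeds))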
