Abstract

Embedded devices are gaining popularity day by day due to the expanded use of Internet of Things (IoT) applications. However, these devices have limited power and memory, so applications must be tailored to perform their tasks within the constrained resources without sacrificing accuracy. In real-time task scheduling, one challenging factor is modelling the input tasks intelligently so that they not only produce logically correct output within the deadline but also consume minimum CPU power. Algorithms such as Rate Monotonic and Earliest Deadline First compute the hyper-period of the input tasks so that the same set of tasks can be repeated periodically on the CPU. When the tasks are not adequately modelled, however, the hyper-period can grow enormously large, resulting in more CPU cycles and higher power consumption. Many state-of-the-art solutions have been presented in this regard, but they restrict tasks from taking all possible period values; with the vision of Industry 4.0, where most tasks will be performing critical manufacturing activities, it is highly undesirable to forbid tasks from having a particular period. In this paper, we present a resource-aware approach that minimises the hyper-period of input tasks based on device profiles while admitting tasks of every possible period value. The proposed work is compared with similar existing techniques, and the results indicate significant improvements in power consumption.
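The hyper-period discussed above is the least common multiple (LCM) of the task periods. The sketch below (illustrative only, not taken from the paper) shows why poorly modelled periods are costly: harmonic periods keep the hyper-period small, while slightly perturbed ones inflate it by orders of magnitude.

```python
from functools import reduce
from math import gcd

def hyper_period(periods):
    """Hyper-period of a periodic task set: the LCM of all task periods."""
    return reduce(lambda a, b: a * b // gcd(a, b), periods)

# Harmonic periods (each divides the next) yield a small hyper-period:
print(hyper_period([10, 20, 40]))   # 40

# Slightly perturbed, non-harmonic periods blow it up:
print(hyper_period([10, 19, 41]))   # 7790
```

A scheduler that repeats the schedule every hyper-period therefore pays for the perturbation with far more pre-computed schedule entries and CPU activity per repetition cycle.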

Highlights

  • With the rapid revolution of organisations from the physical space to a cyber world [1], the concept of the Internet of Things (IoT) is earning further acclaim as the cyber world demands interconnecting things via IoT [2,3,4]

  • We discussed a novel approach to deal with the power consumption issues that arise due to the high value of the hyper-period in real-time IoT systems

  • We proposed an adaptive strategy for finding the optimal threshold


Introduction

With the rapid revolution of organisations from the physical space to a cyber world [1], the concept of the Internet of Things (IoT) is earning further acclaim as the cyber world demands interconnecting things via IoT [2,3,4]. Research on scheduling for embedded systems can be leveraged in the context of IoT applications, but most algorithms that run well on general-purpose systems do not scale well on these embedded devices because of their limited capabilities in terms of power and energy management. This issue becomes even more challenging for periodic real-time tasks. Our approach allows every possible value of the period within a specific bound, unlike the traditional approach of harmonization. The method is general in the sense that it covers priority-driven systems, including both fixed-priority and dynamic-priority systems, and all algorithms that rely on a hyper-period to schedule input tasks on the CPU.

Related Work
RT-IoT System Model
Proposed Work
Objective
Optimal Threshold
Complexity Analysis
Interaction Model
Implementation Setup
Experimental Results
Input Tasks Load
Period Range
Execution and Arrival Time Range
CPU Utilization Bound
Tasks Acceptance Ratio
Periods’ Histogram
Threshold
Performance Measure on Raspberry Pi
Conclusions and Future Directions

