Abstract

A well-known system-level strategy to reduce the energy consumption of microprocessors or microcontrollers is to organize the scheduling of the executed tasks so that it is aware of the main battery nonidealities. In the Internet-of-Things (IoT) domain, devices rely on simpler microcontrollers, workloads are less rich, and batteries are typically sized to guarantee lifetimes that are orders of magnitude longer (e.g., days rather than hours). The load currents drawn by these IoT devices are therefore relatively small compared to those of more powerful devices and rarely trigger the conditions under which battery nonidealities become significant. In this work, we carry out a measurement-based assessment of whether task scheduling is actually relevant for extending the lifetime of IoT devices. We run experiments both on a physical commercial IoT device hosting four sensors, an MCU, and a wireless radio, and on a "synthetic" device emulated with a programmable load generator. We use both secondary lithium-ion and primary alkaline batteries to further explore the impact of battery chemistry. Results show that the impact of different schedules is essentially irrelevant, with a maximum difference of only 3.98% in battery lifetime between the optimal and worst schedules.
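
To make the idea of battery-nonideality-aware scheduling concrete, the following minimal Python sketch illustrates one common approach: interleaving high- and low-current tasks so that heavy discharge pulses are separated by lighter activity, which exploits the well-known recovery and rate-capacity effects. This is an illustrative assumption, not the scheduling policy evaluated in the paper; the task names and current values are hypothetical.

```python
# Illustrative sketch (not the paper's method): alternate high- and
# low-current tasks so that heavy discharge pulses are separated by
# lighter activity, giving the battery time to recover and avoiding
# sustained high-rate discharge. Task names and currents are hypothetical.

from dataclasses import dataclass


@dataclass
class Task:
    name: str
    current_ma: float   # average load current while the task runs
    duration_s: float


def battery_aware_order(tasks: list[Task]) -> list[Task]:
    """Return an ordering that alternates the heaviest and lightest
    remaining tasks, spreading high-current pulses apart in time."""
    by_current = sorted(tasks, key=lambda t: t.current_ma)
    low = by_current[: len(by_current) // 2]
    high = by_current[len(by_current) // 2:]
    order: list[Task] = []
    while low or high:
        if high:
            order.append(high.pop())    # heaviest remaining task
        if low:
            order.append(low.pop(0))    # lightest remaining task
    return order


if __name__ == "__main__":
    workload = [
        Task("radio_tx", 45.0, 0.8),
        Task("sensor_read", 3.0, 0.2),
        Task("mcu_compute", 12.0, 0.5),
        Task("sensor_read_2", 2.5, 0.2),
    ]
    for t in battery_aware_order(workload):
        print(f"{t.name}: {t.current_ma} mA for {t.duration_s} s")
```

The paper's measurements suggest that, at the small load currents typical of IoT devices, ordering choices of this kind make little difference to battery lifetime.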
