Recent advancements in low-power electronics and machine-learning techniques have paved the way for innovative wearable Internet of Things (IoT) devices. However, these devices suffer from limited battery capacity and computational power. Hence, energy harvesting from ambient sources has emerged as a promising solution for powering low-energy wearables. Optimal management of the harvested energy is crucial for achieving energy-neutral operation and eliminating the need for frequent recharging. This task is challenging due to the dynamic nature of harvested energy and battery energy constraints. To tackle this challenge, we propose tinyMAN, a reinforcement learning-based energy management framework for resource-constrained wearable IoT devices. tinyMAN maximizes the target device utilization under battery energy constraints without relying on a harvested energy forecast, making it a prediction-free approach. It achieves up to 17% higher utility while reducing battery constraint violations by 80% compared to prior work. We also introduce tinyMAN-MO, a multi-objective extension of tinyMAN for applications with time-varying energy demands. It learns the tradeoff between meeting the application's energy demand and maintaining the battery energy level. We deployed our framework on a wearable device prototype using TensorFlow Lite for Micro, leveraging its small (less than 120 KB) memory footprint. Evaluations show that tinyMAN-MO operates within 10% of the Pareto-optimal solutions with only 1.98 ms execution time and 23.17 μJ energy consumption overhead.