The rapid advancement of Internet of Things (IoT) networks has revolutionized modern connectivity by integrating large numbers of low-power devices into diverse applications. As IoT networks expand, the demand for energy-efficient, batteryless devices becomes increasingly critical for sustainable future networks. Such devices play a pivotal role in next-generation IoT applications by reducing dependence on conventional batteries and enabling continuous operation through energy harvesting. However, several challenges hinder the widespread adoption of batteryless IoT devices, including limited transmission range, constrained energy resources, and low spectral efficiency at IoT receivers. To address these limitations, reconfigurable intelligent surfaces (RISs) offer a promising solution by dynamically manipulating the wireless propagation environment to enhance signal strength and improve energy harvesting. In this paper, we propose a novel deep reinforcement learning (DRL) algorithm that optimizes RIS phase shifts to maximize the network's achievable rate while satisfying the energy harvesting constraints of IoT devices. Our DRL framework leverages a six-dimensional chimp optimization algorithm (6DChOA) to fine-tune its hyper-parameters, ensuring efficient and adaptive learning. The proposed 6DChOA-DRL algorithm tunes RIS phase shifts to enhance the power received by IoT devices while mitigating interference from the direct and RIS-cascaded links. Simulation results demonstrate that the optimized RIS design significantly improves energy harvesting and achievable data rates under various system configurations. Compared to benchmark algorithms, our approach achieves higher harvested power, a higher data rate at a transmit power of 20 dBm, and a substantially lower root mean square error (RMSE) of 0.13, versus 3.34 for standard RL and 6.91 for a deep neural network (DNN) baseline, indicating more precise optimization of the RIS phase shifts.
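The abstract's core objective, steering RIS phase shifts so the cascaded base-station–RIS–device paths add constructively with the direct link, can be illustrated with a minimal sketch. Everything below (32 RIS elements, Rayleigh-style channel draws, the normalized noise power) is an illustrative assumption rather than the paper's system model, and the DRL/6DChOA machinery is omitted in favor of the closed-form co-phasing rule such an agent would, in effect, learn to approximate:

```python
import cmath
import math
import random

random.seed(0)
N = 32               # number of RIS elements (assumed)
noise_power = 1e-3   # normalized noise power (assumed)

def rayleigh():
    """One complex Gaussian channel coefficient (illustrative model)."""
    return complex(random.gauss(0, 1), random.gauss(0, 1)) / math.sqrt(2)

h = [rayleigh() for _ in range(N)]   # base station -> RIS element n
g = [rayleigh() for _ in range(N)]   # RIS element n -> IoT device
d = rayleigh()                       # direct base station -> device link

def rate(theta):
    """Achievable rate (bps/Hz) for a given RIS phase-shift vector."""
    eff = d + sum(g[n] * cmath.exp(1j * theta[n]) * h[n] for n in range(N))
    return math.log2(1 + abs(eff) ** 2 / noise_power)

# Co-phasing: rotate each cascaded path to align with the direct link,
# so all N + 1 paths add coherently at the receiver.
aligned = [cmath.phase(d) - cmath.phase(h[n] * g[n]) for n in range(N)]
randomized = [random.uniform(0, 2 * math.pi) for n in range(N)]

print(f"co-phased rate:    {rate(aligned):.2f} bps/Hz")
print(f"random-phase rate: {rate(randomized):.2f} bps/Hz")
```

With the aligned phases, the effective channel magnitude reaches its maximum |d| + Σ|h_n g_n|, which is why random or unoptimized phase configurations leave a large rate (and harvested-power) gap that the learned controller is meant to close.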