Routing protocols, as a crucial component of the Internet of Things (IoT), play a significant role in data collection and environmental monitoring tasks. However, existing clustering routing protocols suffer from uneven network energy consumption, high communication delays, and poor adaptation to topology changes. To address these issues, this study proposes EDRP-GTDQN, an adaptive routing algorithm that balances energy consumption and delay using game theory and a deep Q-network (DQN). Specifically, EDRP-GTDQN evaluates the positional importance of nodes using node centrality and applies a game-theoretic approach to select optimal cluster heads based on node centrality and residual energy. Moreover, a graph convolutional network (GCN) and DQN are combined to construct transmission paths between cluster heads, adapt to network topology changes, and balance energy consumption against performance. Furthermore, a cluster rotation mechanism optimizes overall network energy consumption and prevents the formation of energy hotspots. Experimental results show that EDRP-GTDQN achieves average performance improvements of 19.76%, 30.04%, 44.2%, and 61.42% in average energy consumption, network lifetime, and average end-to-end delay over the conventional routing protocols EECRAIFA, MRP-GTCO, DEEC, and MH-LEACH, respectively. These results indicate that EDRP-GTDQN is an effective solution for reducing energy consumption and enhancing service quality in wireless sensor networks.
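For intuition, the sketch below illustrates one way the centrality-and-energy cluster-head scoring described in the abstract could look in code. The closeness-centrality measure, the weighted utility, and the top-k election rule are illustrative assumptions only; the paper's actual game-theoretic formulation, GCN/DQN routing stage, and cluster rotation mechanism are not reproduced here.

```python
import math
import random

# Hypothetical sketch: each node is scored by node centrality and normalized
# residual energy, and the highest-scoring nodes are elected cluster heads.
# Weights, the centrality definition, and the election rule are assumptions,
# not the paper's exact game-theoretic procedure.

def closeness_centrality(node, nodes):
    """Inverse of the mean Euclidean distance from `node` to all other nodes."""
    dists = [math.dist(node, other) for other in nodes if other != node]
    return len(dists) / sum(dists) if dists else 0.0

def utility(centrality, residual_energy, max_energy, w_c=0.5, w_e=0.5):
    """Weighted score combining centrality and normalized residual energy."""
    return w_c * centrality + w_e * (residual_energy / max_energy)

def elect_cluster_heads(positions, energies, max_energy, num_heads=3):
    """Pick the `num_heads` nodes with the highest utility as cluster heads."""
    scores = [
        (utility(closeness_centrality(p, positions), e, max_energy), i)
        for i, (p, e) in enumerate(zip(positions, energies))
    ]
    return [i for _, i in sorted(scores, reverse=True)[:num_heads]]

if __name__ == "__main__":
    random.seed(0)
    positions = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(20)]
    energies = [random.uniform(0.2, 2.0) for _ in range(20)]  # residual energy (J)
    print("Elected cluster heads:", elect_cluster_heads(positions, energies, 2.0))
```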