Abstract
It is well known that clustering partitions a network into logical groups of nodes in order to achieve energy efficiency and to enhance dynamic channel access in cognitive radio through cooperative sensing. While energy efficiency has been well investigated in conventional wireless sensor networks, dynamic channel access through cooperative sensing has not been explored as extensively. In this paper, we propose a reinforcement learning-based spectrum-aware clustering algorithm that allows a member node to learn the energy and cooperative sensing costs of neighboring clusters in order to reach an optimal solution. Through an exploration technique, each member node selects an optimal cluster that satisfies pairwise constraints, minimizes network energy consumption, and enhances channel sensing performance. We first model the network energy consumption and then determine the optimal number of clusters for the network. The cluster selection problem is formulated as a Markov Decision Process (MDP), and the simulation results show the convergence, learning, and adaptability of the algorithm to a dynamic environment as it moves towards an optimal solution. Performance comparisons of our algorithm with the Groupwise Spectrum Aware (GWSA)-based algorithm in terms of Sum of Square Error (SSE), complexity, network energy consumption, and probability of detection indicate improved performance for the proposed approach. The results further reveal that the proposed approach achieves energy savings of 9% and a significant improvement in Primary User (PU) detection.
Highlights
Technological advances in microelectronics have led to the widespread application of wireless sensor networks (WSNs) in a variety of areas
We propose an Energy Efficient Spectrum Aware clustering algorithm based on Reinforcement Learning (EESA-RLC) to enhance spectrum hole detection and minimize network energy consumption in Cognitive Radio (CR)-WSNs
We propose a novel spectrum-aware clustering algorithm based on reinforcement learning to minimize network energy consumption and enhance channel sensing in cognitive radio sensor networks
Summary
Technological advances in microelectronics have led to the widespread application of wireless sensor networks (WSNs) in a variety of areas. We propose an Energy Efficient Spectrum Aware clustering algorithm based on Reinforcement Learning (EESA-RLC) to enhance spectrum hole detection and minimize network energy consumption in CR-WSNs. Reinforcement learning is a machine learning technique that allows an agent to interact with its operating environment and learn an optimal policy that maximizes cumulative rewards [27]. The agent, which in this case is the Secondary User (SU), detects vacant licensed channels through channel sensing, imposes pairwise constraints to select a clusterhead among the neighboring clusterheads, cooperates with other member nodes in the cluster to determine channel availability, and chooses an optimal policy that enhances spectrum hole detection and minimizes network energy consumption. We propose a novel energy-efficient clustering algorithm that is aware of the dynamic radio environment and allows member nodes to learn an optimal policy for choosing optimal clusters based on local decision accuracy and energy consumption for cooperative sensing and data communication. The algorithm eliminates network instability due to re-clustering when the SUs detect the PUs' arrival.
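The learning process described above can be illustrated with a minimal sketch. The following Q-learning loop lets a member node choose among neighboring clusters, with a reward that trades sensing performance against energy cost, as in the EESA-RLC formulation. All cluster names, cost values, and hyperparameters below are made-up placeholders for illustration, not values from the paper, and the single-state (bandit-style) update is a deliberate simplification of the full MDP.

```python
import random

# Hyperparameters (assumed, not from the paper):
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration rate

# Hypothetical neighboring clusters with assumed per-round energy costs
# and local PU-detection accuracies.
clusters = ["C1", "C2", "C3"]
energy_cost = {"C1": 0.8, "C2": 0.5, "C3": 0.9}        # normalized energy (assumed)
sense_accuracy = {"C1": 0.70, "C2": 0.95, "C3": 0.60}  # detection probability (assumed)

Q = {c: 0.0 for c in clusters}  # one Q-value per cluster choice (single-state MDP)

def reward(c):
    # Reward favors clusters with high sensing accuracy and low energy cost.
    return sense_accuracy[c] - energy_cost[c]

random.seed(1)
for episode in range(2000):
    # Epsilon-greedy exploration over the neighboring clusters.
    if random.random() < EPSILON:
        c = random.choice(clusters)
    else:
        c = max(Q, key=Q.get)
    # Stateless Q-learning update toward the discounted best estimate.
    Q[c] += ALPHA * (reward(c) + GAMMA * max(Q.values()) - Q[c])

best = max(Q, key=Q.get)
print(best)  # the node converges to the cluster with the best trade-off ("C2" here)
```

With these placeholder values, cluster C2 offers the highest net reward (high detection accuracy at moderate energy cost), so the epsilon-greedy policy converges to selecting it; the same mechanism lets a node adapt if the radio environment, and hence the rewards, change over time.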