An improved quantum-inspired particle swarm optimisation approach to reduce energy consumption in IoT networks

Abstract

An improved quantum-inspired particle swarm optimisation approach to reduce energy consumption in IoT networks
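
The paper's own abstract is not captured on this page. As background, the quantum-behaved PSO update that quantum-inspired variants typically build on (a random attractor between personal and global bests, plus a contraction step around the swarm's mean-best position) can be sketched as follows; `beta`, the swarm size, and the sphere test function are illustrative choices, not the paper's improved variant:

```python
import math
import random

def qpso(fitness, dim, n_particles=20, iters=200, beta=0.75, lo=-10.0, hi=10.0, seed=1):
    """Minimise `fitness` with a quantum-behaved PSO update (illustrative sketch)."""
    rng = random.Random(seed)
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [row[:] for row in x]
    pval = [fitness(p) for p in pbest]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        # mean-best position over all personal bests
        mbest = [sum(p[d] for p in pbest) / n_particles for d in range(dim)]
        for i in range(n_particles):
            for d in range(dim):
                phi = rng.random()
                u = 1.0 - rng.random()          # u in (0, 1], keeps log finite
                attractor = phi * pbest[i][d] + (1.0 - phi) * gbest[d]
                step = beta * abs(mbest[d] - x[i][d]) * math.log(1.0 / u)
                x[i][d] = attractor + step if rng.random() < 0.5 else attractor - step
            fi = fitness(x[i])
            if fi < pval[i]:
                pbest[i], pval[i] = x[i][:], fi
                if fi < gval:
                    gbest, gval = x[i][:], fi
    return gbest, gval

# usage: minimise the 2-D sphere function
best, val = qpso(lambda v: sum(c * c for c in v), dim=2)
```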

Similar Papers
  • Research Article
  • Cited by: 17
  • 10.3390/s22103910
Green Communication in Internet of Things: A Hybrid Bio-Inspired Intelligent Approach.
  • May 21, 2022
  • Sensors
  • Manoj Kumar + 6 more

Clustering is a promising technique for optimizing energy consumption in sensor-enabled Internet of Things (IoT) networks. Uneven distribution of cluster heads (CHs) across the network, repeatedly choosing the same IoT nodes as CHs, and selecting cluster heads within the communication range of other CHs are the major problems leading to higher energy consumption in IoT networks. In this paper, using fuzzy logic, bio-inspired chicken swarm optimization (CSO), and a genetic algorithm, an optimal cluster formation is presented as a Hybrid Intelligent Optimization Algorithm (HIOA) to minimize overall energy consumption in an IoT network. In HIOA, the key idea for forming IoT nodes into clusters is finding chromosomes with a minimum-value fitness function over the relevant network parameters. The fitness function minimizes inter- and intra-cluster distance to reduce interference, together with the energy consumed in communication per round. The hierarchical order classification of CSO utilizes the crossover and mutation operations of the genetic approach to increase population diversity, which ultimately resolves the uneven distribution of CHs and yields a balanced network load. The proposed HIOA algorithm is simulated in MATLAB 2019a and its performance over the CSO parameters is analyzed; the best fitness value of HIOA is obtained by tuning the number of roosters, the number of hens, and the swarm updating frequency. Further, comparative results show that HIOA is more effective than traditional bio-inspired algorithms in terms of node death percentage, average residual energy, and network lifetime by 12%, 19%, and 23%, respectively.
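
The chromosome evaluation described above can be made concrete with a small sketch; the weights, the energy term, and the omission of the inter-cluster distance term are illustrative simplifications, not the paper's exact fitness function:

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def cluster_fitness(nodes, ch_ids, w_intra=0.5, w_energy=0.5, e_max=1.0):
    """Illustrative chromosome fitness (lower is better): an intra-cluster
    distance term plus a residual-energy penalty on the chosen CHs.
    nodes: {id: (x, y, residual_energy)}; ch_ids: the chosen cluster heads."""
    chs = [nodes[i] for i in ch_ids]
    # intra-cluster term: each member's distance to its nearest CH
    intra = sum(min(dist(n, c) for c in chs)
                for i, n in nodes.items() if i not in ch_ids)
    # energy term: penalise choosing CHs with low residual energy
    energy = sum(e_max - nodes[i][2] for i in ch_ids)
    return w_intra * intra + w_energy * energy
```

A genetic or swarm search would then rank candidate CH sets by this value, preferring well-placed, well-charged heads.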

  • Conference Article
  • 10.1109/gucon50781.2021.9573937
Fuzzy K-means clustering (FKmC) to maximize the Energy Efficiency in Sensor-Enabled Internet of Things
  • Sep 24, 2021
  • Pankaj Kashyap + 2 more

With the recent growth of smart devices, wireless green communication is a promising way to improve performance, stability, and robustness in Internet of Things (IoT) networks. Clustering is one of the best candidates for network partitioning: it enables energy-efficient data gathering and transmission from sensor-enabled IoT nodes and improves the quality of service of the underlying heterogeneous systems in IoT networks. Most existing clustering approaches decide only whether a node belongs to a cluster or not; in real scenarios, nodes have a fuzzy relationship with clusters. We therefore present a Fuzzy K-means clustering (FKmC) algorithm that forms balanced clusters, dividing the network into disjoint clusters with a focus on efficient energy consumption in the IoT network. The work uses soft clustering (FKmC) to build better clusters from the membership parameter and the residual energy of the sensor nodes. Simulation is done in three phases: the first shows the effective formation of clusters by the proposed FKmC against state-of-the-art algorithms; the second and third show that FKmC outperforms them in terms of network lifetime and energy consumption.
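
The soft-membership idea behind FKmC is that of classic fuzzy c-means; a minimal sketch follows (the paper's residual-energy weighting is omitted, and the deterministic initialisation is an assumption for reproducibility):

```python
import math

def fcm(points, k=2, m=2.0, iters=30):
    """Minimal fuzzy c-means: returns centers and soft memberships u[i][j]."""
    n, dim = len(points), len(points[0])
    # assumption: simple deterministic init spreading centers across the list
    centers = [list(points[(j * n) // k]) for j in range(k)]
    u = []
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_c (d_ij / d_ic)^(2/(m-1))
        u = []
        for p in points:
            d = [max(math.dist(p, c), 1e-12) for c in centers]
            u.append([1.0 / sum((d[j] / d[c]) ** (2.0 / (m - 1.0)) for c in range(k))
                      for j in range(k)])
        # center update: membership-weighted mean with weights u_ij^m
        for j in range(k):
            w = [u[i][j] ** m for i in range(n)]
            tot = sum(w)
            centers[j] = [sum(w[i] * points[i][d] for i in range(n)) / tot
                          for d in range(dim)]
    return centers, u

# usage: two well-separated groups of sensor positions
pts = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (10.0, 10.0), (10.0, 11.0), (11.0, 10.0)]
centers, u = fcm(pts, k=2)
```

Each row of `u` sums to one, so a node can belong partially to several clusters, which is exactly the fuzzy relationship the abstract argues for.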

  • Research Article
  • 10.53759/7669/jmc202505038
Blockchain-Enabled Security Enhancement for IoT Networks: Integrating LEACH Algorithm and Distributed Ledger Technology
  • Jan 5, 2025
  • Journal of Machine and Computing
  • Taeyeon Oh

The rapid proliferation of Internet of Things (IoT) networks has significantly advanced various sectors such as smart cities, healthcare, and industrial automation, but it has also introduced substantial security challenges. Protecting data integrity, confidentiality, and availability in these networks is critical, yet traditional security measures often fall short due to the decentralized and resource-constrained nature of IoT devices. The Low-Energy Adaptive Clustering Hierarchy (LEACH) protocol, designed to optimize energy consumption in sensor networks, lacks intrinsic security features. To address these challenges, this paper proposes a novel approach that integrates LEACH with Distributed Ledger Technology (DLT), specifically blockchain. Blockchain’s decentralized and immutable ledger can enhance data security and integrity within IoT networks. The methodology involves modifying LEACH to incorporate blockchain for secure data transmission. In the clustering phase, LEACH forms clusters and designates a cluster head (CH) for data aggregation and transmission. Each CH maintains a local blockchain to log and verify data transactions within its cluster, using a consensus mechanism to ensure data integrity. Smart contracts are implemented to automate security policies and detect anomalies, while data encryption and digital signatures provide additional security layers. Simulations using the NS-3 simulator showed promising results: energy consumption was reduced by 18% compared to traditional LEACH, latency increased by 5% due to blockchain processing overhead, throughput improved by 12%, and security metrics indicated a 25% improvement in data integrity and a 30% reduction in successful attack attempts. In conclusion, integrating the LEACH algorithm with blockchain significantly enhances the security and efficiency of IoT networks. 
This approach leverages the energy optimization of LEACH and the robust security framework of blockchain, offering a scalable and secure solution for diverse IoT applications. Future research will focus on optimizing blockchain operations to reduce latency further and exploring the model's applicability in various IoT scenarios.
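
The clustering phase above inherits LEACH's standard rotation of the CH role. A minimal sketch of that probabilistic self-election follows; the blockchain, consensus, and smart-contract layers are omitted, and the helper names are illustrative:

```python
import random

def leach_threshold(p, r, eligible):
    """Classic LEACH CH-election threshold T(n) for round r.
    p: desired fraction of CHs; eligible: node has not been CH this epoch."""
    if not eligible:
        return 0.0
    epoch = int(round(1.0 / p))            # epoch length in rounds
    return p / (1.0 - p * (r % epoch))

def elect_chs(node_ids, p, r, eligible, rng):
    """Each eligible node self-elects when its uniform draw falls below T(n)."""
    return [n for n in node_ids if rng.random() < leach_threshold(p, r, eligible[n])]
```

The threshold grows over an epoch so that every node serves as CH roughly once per 1/p rounds, spreading the energy cost of aggregation and (here) of maintaining the per-cluster ledger.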

  • Research Article
  • Cited by: 112
  • 10.1007/s40860-020-00126-x
A smart anomaly-based intrusion detection system for the Internet of Things (IoT) network using GWO–PSO–RF model
  • Jan 18, 2021
  • Journal of Reliable Intelligent Environments
  • Pankaj Kumar Keserwani + 3 more

The Internet of Things (IoT) advances technology for creating smart environments that assist humans in various tasks. These technological developments provide comfort and opportunities to businesses, but they also open the doors for intruders and attackers to explore and exploit various attacks that evade IoT networks' security. Hence, security and privacy are key concerns for the IoT network model, and protecting computer and IoT networks from various types of attacks and threats is necessary. Traditional intrusion detection systems (IDS) collect and use massive data with unnecessary, irrelevant, and inappropriate features, which causes high detection time and low accuracy. This paper proposes an IDS to identify various attacks on IoT networks. A combination of Grey Wolf Optimization (GWO) and Particle Swarm Optimization (PSO) is used to extract relevant IoT network features. The extracted features are fed to a random forest (RF) classifier to achieve high attack detection accuracy. Experiments are conducted in a Python programming environment to evaluate the proposed model on the KDDCup99, NSL-KDD, and CICIDS-2017 datasets. The proposed GWO–PSO–RF NIDS model achieved an average accuracy of 99.66% for multiclass classification. The accuracy of the proposed model has been compared with other similar approaches to show its effectiveness. The work presented here also addresses the issue of data imbalance.
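
The feature-extraction stage above is a wrapper-style search over binary feature masks. A plain binary PSO sketch of that idea follows; the GWO hybridisation and the random forest classifier are omitted, and `toy_fitness` is a hypothetical stand-in for the real cross-validated accuracy:

```python
import math
import random

def binary_pso_select(n_features, fitness, n_particles=12, iters=40,
                      w=0.7, c1=1.5, c2=1.5, seed=3):
    """Wrapper-style feature selection with plain binary PSO: each particle is a
    0/1 mask over features and `fitness` scores a mask (higher is better)."""
    rng = random.Random(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    x = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(n_particles)]
    vel = [[0.0] * n_features for _ in range(n_particles)]
    pbest = [row[:] for row in x]
    pval = [fitness(m) for m in pbest]
    g = max(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_features):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - x[i][d])
                             + c2 * rng.random() * (gbest[d] - x[i][d]))
                # sigmoid transfer turns the velocity into a bit-set probability
                x[i][d] = 1 if rng.random() < sig(vel[i][d]) else 0
            fi = fitness(x[i])
            if fi > pval[i]:
                pbest[i], pval[i] = x[i][:], fi
                if fi > gval:
                    gbest, gval = x[i][:], fi
    return gbest, gval

def toy_fitness(mask):
    # stand-in for cross-validated accuracy: features 0 and 1 are informative,
    # every selected noise feature costs a small penalty (purely illustrative)
    return mask[0] + mask[1] - 0.1 * sum(mask[2:])

best_mask, best_score = binary_pso_select(8, toy_fitness)
```

In the paper's setting, the mask would index columns of the intrusion dataset and the score would come from an RF classifier's validation accuracy.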

  • Research Article
  • Cited by: 14
  • 10.1109/tvt.2022.3199677
A Deep Reinforcement Learning-Based Caching Strategy for IoT Networks With Transient Data
  • Dec 1, 2022
  • IEEE Transactions on Vehicular Technology
  • Hongda Wu + 2 more

The Internet of Things (IoT) has been rising continuously in the past few years, and its potential is now more apparent. However, transient data generation and limited energy resources are the major bottlenecks of these networks. Besides, minimum delay and other conventional quality-of-service measurements are still valid requirements to meet. An efficient caching policy can help meet standard quality-of-service requirements while bypassing IoT networks' specific limitations. Adopting deep reinforcement learning (DRL) algorithms enables us to develop an effective caching scheme without needing prior knowledge or contextual information. In this work, we propose a DRL-based caching scheme that improves the cache hit rate and reduces the energy consumption of IoT networks, while taking data freshness and the limited lifetime of IoT data into account. To better capture regionally different popularity distributions, we adopt a hierarchical architecture to deploy edge caching nodes in IoT networks. The results of comprehensive experiments show that our proposed method outperforms well-known conventional caching policies and an existing DRL-based solution in terms of cache hit rate and energy consumption of the IoT networks by considerable margins.

  • Research Article
  • Cited by: 4
  • 10.3390/s18082735
Green Compressive Sampling Reconstruction in IoT Networks
  • Aug 20, 2018
  • Sensors (Basel, Switzerland)
  • Stefania Colonnese + 5 more

In this paper, we address the problem of green Compressed Sensing (CS) reconstruction within Internet of Things (IoT) networks, both in terms of computing architecture and reconstruction algorithms. The approach is novel since, unlike most of the literature dealing with energy efficient gathering of the CS measurements, we focus on the energy efficiency of the signal reconstruction stage given the CS measurements. As a first novel contribution, we present an analysis of the energy consumption within the IoT network under two computing architectures. In the first one, reconstruction takes place within the IoT network and the reconstructed data are encoded and transmitted out of the IoT network; in the second one, all the CS measurements are forwarded to off-network devices for reconstruction and storage, i.e., reconstruction is off-loaded. Our analysis shows that the two architectures significantly differ in terms of consumed energy, and it outlines a theoretically motivated criterion to select a green CS reconstruction computing architecture. Specifically, we present a suitable decision function to determine which architecture outperforms the other in terms of energy efficiency. The presented decision function depends on a few IoT network features, such as the network size, the sink connectivity, and other systems’ parameters. As a second novel contribution, we show how to overcome classical performance comparison of different CS reconstruction algorithms usually carried out w.r.t. the achieved accuracy. Specifically, we consider the consumed energy and analyze the energy vs. accuracy trade-off. The herein presented approach, jointly considering signal processing and IoT network issues, is a relevant contribution for designing green compressive sampling architectures in IoT networks.

  • Research Article
  • Cited by: 18
  • 10.1109/jiot.2019.2907871
Energy Efficient Designs of Ultra-Dense IoT Networks With Nonideal Optical Front-Hauls
  • Oct 1, 2019
  • IEEE Internet of Things Journal
  • Lisu Yu + 2 more

We study the optimum designs of the downlink of user-centric ultra-dense Internet of Things (IoT) networks with fiber-wireless communications (FWCs). A large number of low-power radio access points (RAPs) are densely deployed in the network to provide service to spatially distributed IoT physical devices (PDs). The RAPs are connected to a central unit (CU) through optical fiber (OF) front-hauls. Radio-frequency-over-fiber (RFoF) is employed in the optical front-hauls to reduce RAP complexity, cost, and energy consumption. With RFoF front-hauls, wireless signals received by PDs are subject to distortions accumulated through the optical and wireless links, including optical loss, optical chromatic distortion, optical and thermal noises, wireless pathloss, and small-scale fading. The optimum designs are performed across the optical and wireless domains with the help of a newly developed model that quantifies the combined effects of the optical and wireless links. One of the main challenges faced by the design of an ultra-dense IoT network is the high energy consumption due to dense RAP deployment. The objective of this paper is to minimize the total energy consumption of the entire IoT network, including both optical and wireless links, by jointly optimizing RAP power allocation and RAP-PD association, subject to quality-of-service (QoS) constraints for each PD. We propose a low-complexity suboptimum binary forcing gradient search (BFGS) algorithm, which performs a gradient-based search based on the unique structure of the problem. Simulation results show that the optical front-hauls have significant impacts on the performance and design of ultra-dense IoT networks.

  • Research Article
  • 10.1186/s13638-025-02537-x
HSG-AGTO: A hybrid heuristic optimization approach for energy-efficient cluster head selection for green communication in IoT networks
  • Nov 28, 2025
  • EURASIP Journal on Wireless Communications and Networking
  • Asha Aiyappan + 3 more

The environmental impacts associated with the interconnected sensor devices in an Internet of Things (IoT) network are minimized through green communication. As IoT devices grow in number, the IoT system becomes more prone to issues of resource utilization, sustainability, and energy consumption within the communication system. Tuning the communication protocols, lowering the IoT devices' energy usage, and establishing an effective data-transmission technique are achieved by the green communication technique, which enables significant energy savings in the IoT network. For devices with limited power sources, green communication technology offers extended battery life. Despite the enormous potential of IoT technology, a number of obstacles need to be overcome, including load balancing, security, storage, privacy, energy management, and device heterogeneity. In response to the challenges posed by conventional models, an innovative solution for Cluster Head (CH) selection using a hybrid approach is devised here. The energy consumption of sensor nodes is influenced by various factors, such as CH load, temperature, distance, residual energy, the number of alive nodes, and delay. To address these challenges in IoT networks, a hybrid approach that combines the Squid Game Optimizer (SGO) and the Artificial Gorilla Troops Optimizer (AGTO) for optimal CH selection is suggested, named the Hybrid Squid Game with Artificial Gorilla Troops Optimizer (HSG-AGTO). This approach aims to optimize the above-mentioned factors and overcome energy consumption challenges in IoT networks. The effectiveness of the model is validated, and the results demonstrate the superior performance of the suggested model.

  • Research Article
  • Cited by: 38
  • 10.1186/s13677-020-00166-x
Energy-efficient sensory data gathering based on compressed sensing in IoT networks
  • Apr 5, 2020
  • Journal of Cloud Computing
  • Xinxin Du + 3 more

The Internet of Things (IoT) networks have become the infrastructure enabling the detection of, and reaction to, anomalies in various domains, where an efficient sensory-data-gathering mechanism is fundamental since IoT nodes are typically constrained in their energy and computational capacities. Besides, anomalies may occur only occasionally in most applications, while the majority of time reflects a healthy situation. In this setting, the range, rather than an accurate value, of sensory data is of more interest to domain applications, and the range is represented by the category of the sensory data. To decrease the energy consumption of IoT networks, this paper proposes an energy-efficient sensory data gathering mechanism, where the category of sensory data is processed using a compressed sensing algorithm. Sensory data are forecasted through a data prediction model in the cloud, and the sensory data of an IoT node need to be routed to the cloud for synchronization only when the category provided by that IoT node differs from the category of the forecasted one in the cloud. Experiments are conducted, and evaluation results demonstrate that our approach performs better than state-of-the-art techniques in terms of network traffic and energy consumption.
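
The synchronisation rule described above (upload only on a category mismatch with the cloud forecast) reduces to a simple filter. The node names and categories below are assumed for illustration, and the compressed-sensing and prediction stages are omitted:

```python
def sync_uploads(observed, forecast):
    """Route a node's reading to the cloud only when its observed category
    differs from the category the cloud-side prediction model forecasts."""
    return {node: cat for node, cat in observed.items() if forecast.get(node) != cat}

# three nodes, one mismatch: only n2 needs to transmit this round
observed = {"n1": "normal", "n2": "high", "n3": "normal"}
forecast = {"n1": "normal", "n2": "normal", "n3": "normal"}
uploads = sync_uploads(observed, forecast)
```

The traffic saving is the fraction of rounds in which a node's category matches the forecast, which is large when most readings reflect a healthy situation.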

  • Conference Article
  • Cited by: 5
  • 10.1109/icnc57223.2023.10074277
Blockchain-enabled Efficient and Secure Federated Learning in IoT and Edge Computing Networks
  • Feb 20, 2023
  • Ranwa Al Mallah + 2 more

Federated learning (FL) has proven to be a promising solution to enable on-device machine learning over massive data generated by Internet of Things (IoT) devices at the network edge. However, the wide vulnerability space of the IoT network increases the risk of model poisoning attacks carried out by malicious or compromised IoT devices against FL model training. This paper proposes to exploit the use of blockchain technology to perform optimized monitoring of the behavior of IoT devices and select only reliable ones to provide model updates to the global FL model while preserving network performance. We formulate our worker device monitoring problem as an optimization problem and solve it to produce the optimal number of monitoring miners in the blockchain network in order to reduce the latency, bandwidth and energy consumption in the overall IoT network. Our results show that the optimal monitoring solution was able to reduce by 75% the total delay incurred by the IoT devices during training.

  • Conference Article
  • Cited by: 7
  • 10.1109/wcsp.2019.8927946
Deep Reinforcement Learning for Dynamic Access Control with Battery Prediction for Mobile-Edge Computing in Green IoT Networks
  • Oct 1, 2019
  • Lijuan Xu + 3 more

Mobile Edge Computing (MEC) technology has emerged as a promising paradigm to reduce energy consumption in resource-constrained, energy-limited Internet of Things (IoT) networks. In this paper, benefiting from energy harvesting (EH) techniques, we study the dynamic MEC access-control problem of maximizing the long-term average uplink transmission rate while minimizing transmission energy consumption for green IoT networks, in which the IoT device is powered by a rechargeable battery that can harvest energy from the surrounding environment. In particular, this problem is formulated as a Markov decision process with unknown system dynamics. Accounting for the dynamics of the wireless channel state, the energy arrival, and the mobility of the IoT device, a Long Short-Term Memory (LSTM)-enhanced Deep Q-Network (DQN) based (LSDQN) access-control algorithm is proposed for the IoT network. In the proposed algorithm, the LSTM model predicts the battery status to help the IoT device determine the optimal access-control decision via DQN, with the target of maximizing the average uplink rate while minimizing energy consumption. Finally, extensive simulation results verify the performance of the proposed algorithm.

  • Research Article
  • Cited by: 14
  • 10.3390/app131810366
A Lightweight Mitigation Approach against a New Inundation Attack in RPL-Based IoT Networks
  • Sep 16, 2023
  • Applied Sciences
  • Mehdi Rouissat + 3 more

Internet of Things (IoT) networks are being widely deployed for a broad range of critical applications. Without effective security support, such a trend would open the doors to notable security challenges. Due to their inherent constrained characteristics, IoT networks are highly vulnerable to the adverse impacts of a wide scope of IoT attacks. Among these, flooding attacks would cause great damage given the limited computational and energy capacity of IoT devices. However, IETF-standardized IoT routing protocols, such as the IPv6 Routing Protocol for Low Power and Lossy Networks (RPL), have no relevant security-provision mechanism. Different variants of the flooding attack can be easily initiated in RPL networks to exhaust network resources and degrade overall network performance. In this paper, a novel variant referred to as the Destination Information Object Flooding (DIOF) attack is introduced. The DIOF attack involves an internal malicious node disseminating falsified information to instigate excessive transmissions of DIO control messages. The results of the experimental evaluation demonstrated the significant adverse impact of DIOF attacks on control overhead and energy consumption, which increased by more than 500% and 210%, respectively. A reduction of more than 32% in Packet Delivery Ratio (PDR) and an increase of more than 192% in latency were also experienced. These were more evident in cases in which the malicious node was in close proximity to the sink node. To effectively address the DIOF attack, we propose a new lightweight approach based on a collaborative and distributed security scheme referred to as DIOF-Secure RPL (DSRPL). It provides an effective solution, enhancing RPL network resilience against DIOF attacks with only simple in-protocol modifications. As the experimental results indicated, DSRPL guaranteed responsive detection and mitigation of the DIOF attacks in a matter of a few seconds. 
Compared to RPL attack scenarios, it also succeeded in reducing network overhead and energy consumption by more than 80% while maintaining QoS performance at satisfactory levels.
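
The responsive detection reported above can be illustrated with a generic sliding-window rate check on incoming DIO control messages. This is a hypothetical stand-in, not the paper's DSRPL scheme, and `limit` and `window` are assumed parameters:

```python
from collections import defaultdict, deque

class DioRateGuard:
    """Flag a neighbour whose DIO control messages exceed `limit` within a
    sliding `window` of seconds (illustrative detector, not DSRPL itself)."""
    def __init__(self, limit=10, window=5.0):
        self.limit, self.window = limit, window
        self.times = defaultdict(deque)

    def observe(self, neighbour, t):
        q = self.times[neighbour]
        q.append(t)
        while q and t - q[0] > self.window:   # drop arrivals outside the window
            q.popleft()
        return len(q) > self.limit            # True: suspected DIOF flooding
```

A node running such a check could ignore or report a flagged neighbour, which is the general shape of lightweight in-protocol mitigation the paper argues for.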

  • Research Article
  • Cited by: 134
  • 10.1109/tnsm.2020.3035315
A Survey and Future Directions on Clustering: From WSNs to IoT and Modern Networking Paradigms
  • Nov 2, 2020
  • IEEE Transactions on Network and Service Management
  • Amin Shahraki + 3 more

Many Internet of Things (IoT) networks are created as an overlay over traditional ad-hoc networks such as Zigbee. Moreover, IoT networks can resemble ad-hoc networks over networks that support device-to-device (D2D) communication, e.g., D2D-enabled cellular networks and WiFi-Direct. In these ad-hoc types of IoT networks, efficient topology management is a crucial requirement, in particular in massive-scale deployments. Traditionally, clustering has been recognized as a common approach for topology management in ad-hoc networks, e.g., in Wireless Sensor Networks (WSNs). Topology management in WSNs and ad-hoc IoT networks has many design commonalities, as both need to transfer data to the destination hop by hop. Thus, WSN clustering techniques can presumably be applied for topology management in ad-hoc IoT networks. This requires a comprehensive study of WSN clustering techniques and an investigation of their applicability to ad-hoc IoT networks. In this article, we conduct a survey of this field based on the objectives for clustering, such as reducing energy consumption and load balancing, as well as the network properties relevant for efficient clustering in IoT, such as network heterogeneity and mobility. Beyond that, we investigate the advantages and challenges of clustering when IoT is integrated with modern computing and communication technologies such as Blockchain, Fog/Edge computing, and 5G.
This survey provides useful insights into research on IoT clustering, allows broader understanding of its design challenges for IoT networks, and sheds light on its future applications in modern technologies integrated with IoT.

  • Research Article
  • Cited by: 2
  • 10.1007/s10586-024-04450-2
ARPMEC: an adaptive mobile edge computing-based routing protocol for IoT networks
  • Apr 23, 2024
  • Cluster Computing
  • Miguel Landry Foko Sindjoung + 2 more

Internet of Things (IoT) networks come with many challenges, especially in network architecture design. IoT is populated by several kinds of autonomously managed devices with different characteristics. These devices do not have enough resources, yet they need to process data in real time. Hence, there is a need to design architectures for IoT networks that are as efficient as possible. Previously, Cloud Computing (CC) seemed to provide a good solution for processing data from IoT networks; recently, Mobile Edge Computing (MEC) appears to offer a better one by ensuring better Quality of Service (QoS) provisioning. As a result, many MEC solutions have emerged for QoS improvement in IoT networks. These solutions mainly focus on device resource management without considering data routing from one end-user device to another, especially when the devices are mobile and need to communicate with each other. In this paper, we design an adaptive routing protocol for a MEC-based network to efficiently manage end-user devices' energy consumption during data routing. The proposed adaptive Mobile Edge Computing-based protocol consists of two main phases: first, we subdivide the network's objects into clusters by exploiting a link-quality prediction algorithm; second, we route the data to their destination adaptively by considering the objects' movement during the routing process. As the simulation results show, our protocol outperforms other existing routing protocols for IoT networks in terms of energy consumption. We therefore propose our solution for data routing in IoT networks that require heavy data processing and forwarding.

  • Research Article
  • Cited by: 9
  • 10.1016/j.iot.2022.100622
A new quantum-inspired clustering method for reducing energy consumption in IOT networks
  • Sep 30, 2022
  • Internet of Things
  • Yousra Mahmoudi + 2 more

