Development of novel intrusion detection in Internet of Things using improved dart game optimizer‐derived optimal cascaded ensemble learning

Abstract. Background of the Study: The Internet of Things (IoT) industry has accelerated its development with the support of advanced information technology and economic expansion. The gradual improvement of the IoT industry system has formed a complete industrial foundation that includes software, chips, electronic components, IoT services, integrated systems, machinery, and telecom operators. As the number of IoT devices grows exponentially, the attack surface available to cybercriminals expands, enabling potentially more damaging operations. As a result, the security sector has witnessed a rise in cyberattacks. Hackers use several methods to copy and modify information in the IoT environment. Intrusion detection (ID) models use machine learning techniques to detect and categorize attacks in IoT networks. Objectives: This study therefore explores an ID system based on heuristic-assisted deep learning approaches to detect attacks in the IoT effectively. First, the IoT data are gathered from benchmark resources. The gathered data are then preprocessed to perform data cleaning. Next, the data are transformed and fed to the feature extraction stage. Feature extraction is performed with a one-dimensional convolutional neural network (1D-CNN), where features are extracted from the target-based pooling layer. These deep features are then fed to the ID phase, where a cascaded ensemble learning (CEL) approach is adopted for detecting intrusions. Hyperparameter tuning is performed with the newly suggested improved darts game optimizer (IDGO) algorithm, whose main objective is to maximize ID accuracy. Findings: In the experimental evaluation, the developed model achieves 86% accuracy, together with lower detection time and higher detection efficiency.
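
For readers unfamiliar with the cascading step, the following sketch illustrates a generic two-stage (stacked) ensemble over pre-extracted deep features. The actual CEL architecture, the 1D-CNN extractor, and the IDGO tuning are specific to the paper and are not reproduced here; all data, model choices, and hyperparameters below are illustrative assumptions.

```python
# Hypothetical sketch of a cascaded (stacked) ensemble intrusion detector.
# The paper's CEL and IDGO tuning are not reproduced; names and
# hyperparameters below are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))          # stand-in for 1D-CNN deep features
y = rng.integers(0, 2, size=1000)        # 0 = normal traffic, 1 = intrusion

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Stage 1: base learners produce intrusion probabilities.
base = [RandomForestClassifier(n_estimators=100, random_state=0),
        GradientBoostingClassifier(random_state=0)]
stage1 = np.column_stack([m.fit(X_tr, y_tr).predict_proba(X_tr)[:, 1] for m in base])

# Stage 2: a meta-learner is cascaded on the stage-1 outputs.
meta = LogisticRegression().fit(stage1, y_tr)

stage1_te = np.column_stack([m.predict_proba(X_te)[:, 1] for m in base])
print("held-out accuracy:", meta.score(stage1_te, y_te))
```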

Relevant
Multi‐graph representation spatio‐temporal attention networks for traffic forecasting in the cinematic metaverse

Abstract: The cinematic metaverse aims to create a virtual space built around the context of a film. Users can enter this space as avatars and experience the cinematic plot firsthand in an immersive manner. This requires a rational computation resource allocation and synchronization algorithm that meets the demands of multi-objective joint optimization, such as low latency and high throughput, so that users can seamlessly switch between virtual and real worlds and acquire immersive experiences. Unfortunately, the explosive growth in the number of users makes it difficult to jointly optimize multiple objectives. Predicting the traffic generated by users' avatars in the cinematic metaverse is therefore significant for the optimization process. Although graph neural network-based traffic prediction models achieve superior prediction accuracy, these methods rely only on topological graph information based on physical distances and fail to comprehensively reflect the real relationships between avatars in the cinematic metaverse. To address this issue, we present novel Multi-Graph Representation Spatio-Temporal Attention Networks (MGRSTANet) for traffic prediction in the cinematic metaverse. Specifically, based on multiple kinds of topological graph information (e.g., physical distance, centrality, and similarity), we first design a Multi-Graph Embedding (MGE) module to generate multiple graph representations, thus reflecting the real relationships between avatars more comprehensively. A Spatio-Temporal Attention (STAtt) module is then proposed to extract spatio-temporal correlations in each graph representation, thus improving prediction accuracy. We conduct simulation experiments to evaluate the effectiveness of MGRSTANet. The experimental results demonstrate that our proposed model outperforms state-of-the-art baselines in terms of prediction accuracy, making it appropriate for traffic forecasting in the cinematic metaverse.
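
The abstract's multi-graph idea can be illustrated with a small, hypothetical sketch: three adjacency matrices (distance, centrality, similarity) are built from synthetic avatar data and fused with a simple softmax attention. This is not the MGRSTANet architecture; every quantity below is an assumed stand-in.

```python
# Illustrative sketch of multi-graph construction and a simple attention fusion,
# assuming synthetic avatar positions and traffic histories; it is not the
# MGRSTANet architecture itself.
import numpy as np

rng = np.random.default_rng(1)
n = 8                                   # avatars
pos = rng.uniform(size=(n, 2))          # positions in the virtual scene
traffic = rng.uniform(size=(n, 12))     # 12 past traffic observations per avatar

# Graph 1: physical-distance graph (closer avatars -> larger weight).
dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
A_dist = np.exp(-dist)

# Graph 2: centrality-style graph derived from a thresholded distance graph.
deg = (dist < 0.5).sum(axis=1).astype(float)
A_cent = np.minimum.outer(deg, deg) / deg.max()

# Graph 3: traffic-similarity graph (cosine similarity of histories).
norm = traffic / np.linalg.norm(traffic, axis=1, keepdims=True)
A_sim = norm @ norm.T

graphs = np.stack([A_dist, A_cent, A_sim])          # (3, n, n)
reps = graphs @ traffic                              # one representation per graph

# Attention over the three graph representations (softmax of a scalar score).
scores = reps.mean(axis=(1, 2))
weights = np.exp(scores) / np.exp(scores).sum()
fused = np.tensordot(weights, reps, axes=1)          # (n, 12) fused representation
print("attention weights over graphs:", np.round(weights, 3))
```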

Relevant
GIJA: Enhanced geyser-inspired Jaya algorithm for task scheduling optimization in cloud computing

Abstract: Task scheduling optimization plays a pivotal role in enhancing the efficiency and performance of cloud computing systems. In this article, we introduce GIJA (Geyser-Inspired Jaya Algorithm), a novel optimization approach tailored for task scheduling in cloud computing environments. GIJA integrates the principles of the geyser-inspired algorithm with the Jaya algorithm, augmented by a Levy flight mechanism, to address the complexities of task scheduling optimization. The motivation for this research stems from the increasing demand for efficient resource utilization and task management in cloud computing, driven by the proliferation of Internet of Things (IoT) devices and the growing reliance on cloud-based services. Traditional task scheduling algorithms often struggle with dynamic workloads, heterogeneous resources, and varying performance objectives, necessitating innovative optimization techniques. GIJA leverages the eruptive dynamics of geysers, inspired by nature's efficiency in channeling resources, to guide task scheduling decisions. By combining this geyser-inspired approach with the simplicity and effectiveness of the Jaya algorithm, GIJA offers a robust optimization framework capable of adapting to diverse cloud computing environments. Additionally, the integration of the Levy flight mechanism introduces stochasticity into the optimization process, enabling broader exploration of the solution space and accelerating convergence. To evaluate the efficacy of GIJA, extensive experiments are conducted using synthetic and real-world datasets representative of cloud computing workloads. Comparative analyses against existing task scheduling algorithms, including AOA, RSA, DMOA, PDOA, LPO, SCO, GIA, and GIAA, demonstrate the superior performance of GIJA in terms of solution quality, convergence rate, diversity, and robustness. The findings show that GIJA delivers promising solution quality (95%) for addressing the complexities of task scheduling in cloud environments, with implications for enhancing system performance, scalability, and resource utilization.
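
A minimal sketch of the underlying ingredients, the Jaya update rule plus a Levy-flight perturbation applied to a toy makespan-minimization problem, is given below. The schedule decoding, population size, and step scaling are assumptions and do not reproduce GIJA itself.

```python
# Minimal sketch of a Jaya-style search with a Levy-flight perturbation applied
# to a toy task-scheduling problem (minimise makespan). Parameter choices and
# the mapping from positions to schedules are assumptions, not the GIJA method.
import numpy as np

rng = np.random.default_rng(2)
n_tasks, n_vms = 20, 4
task_len = rng.uniform(10, 100, n_tasks)      # task lengths (MI)
vm_speed = rng.uniform(1, 4, n_vms)           # VM speeds (MIPS)

def makespan(pos):
    vm = np.clip(pos.round().astype(int), 0, n_vms - 1)   # decode schedule
    loads = np.zeros(n_vms)
    np.add.at(loads, vm, task_len)
    return (loads / vm_speed).max()

def levy(size, beta=1.5):
    # Mantegna's algorithm for Levy-distributed step lengths.
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return rng.normal(0, sigma, size) / np.abs(rng.normal(0, 1, size)) ** (1 / beta)

pop = rng.uniform(0, n_vms - 1, (30, n_tasks))
for _ in range(200):
    fit = np.array([makespan(p) for p in pop])
    best, worst = pop[fit.argmin()], pop[fit.argmax()]
    r1, r2 = rng.uniform(size=pop.shape), rng.uniform(size=pop.shape)
    cand = pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop))   # Jaya move
    cand += 0.01 * levy(pop.shape)                                        # Levy kick
    cand = np.clip(cand, 0, n_vms - 1)
    improved = np.array([makespan(c) for c in cand]) < fit
    pop[improved] = cand[improved]

print("best makespan:", min(makespan(p) for p in pop))
```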

Relevant
Quantum‐safe Lattice‐based mutual authentication and key‐exchange scheme for the smart grid

Abstract: The smart grid network (SGN) is expected to leverage advances in the Internet of Things (IoT) to enable effective delivery and monitoring of energy. By integrating communication, computing, and information tools such as smart sensors and meters to facilitate monitoring, prediction, and management of power usage, the SGN can improve the competence of the power-grid architecture. However, the effective deployment of IoT-powered SGNs hinges on strong security protocols. With the advent of quantum computers, classic cryptographic algorithms based on integer factorization and the Diffie-Hellman assumptions may no longer be suitable for securing the sensitive data of SGNs. Therefore, this paper proposes a secure quantum-safe mutual authentication and key-exchange (MAKe) mechanism for SGNs that makes use of the hardness assumptions of the small integer solution and inhomogeneous small integer solution problems of lattices. The proposed protocol is intended to offer confidentiality, anonymity, and hash-based mutual authentication with a key-exchange agreement. The scheme also allows the creation and validation of mutual trust between smart meters (SMs) and the neighbourhood-area network gateway over an insecure wireless channel. A random oracle model is then used to perform the formal security analysis of the proposed approach. A thorough formal analysis demonstrates the proposed algorithm's ability to withstand various known attacks. The performance analysis shows that the proposed approach outperforms comparable schemes, with at least 22.07% lower energy utilization, 51.48% lower storage and communication costs, and 76.28% lower computational costs, making it suitable for resource-constrained SGNs.
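
To make the protocol shape concrete, the sketch below shows a generic nonce-based mutual-authentication and key-derivation exchange using HMAC over a pre-shared secret. The lattice-based (SIS/ISIS) constructions that give the scheme its quantum safety are not reproduced, so this should be read only as an illustration of the challenge-response and key-agreement flow; identities, message layout, and the key-derivation function are assumptions.

```python
# High-level sketch of a nonce-based mutual authentication and key-derivation
# exchange between a smart meter and a gateway, assuming a pre-shared secret.
# The lattice (SIS/ISIS) machinery of the proposed MAKe scheme is not
# reproduced; this only illustrates the challenge-response/key-exchange shape.
import hmac, hashlib, os

def kdf(*parts):
    return hashlib.sha256(b"|".join(parts)).digest()

shared = os.urandom(32)                 # pre-provisioned secret between SM and gateway

# Smart meter -> gateway: identity pseudonym and a fresh nonce with a tag.
sm_nonce = os.urandom(16)
sm_tag = hmac.new(shared, b"SM-01" + sm_nonce, hashlib.sha256).digest()

# Gateway verifies the meter's tag, then answers with its own nonce and tag.
assert hmac.compare_digest(sm_tag, hmac.new(shared, b"SM-01" + sm_nonce, hashlib.sha256).digest())
gw_nonce = os.urandom(16)
gw_tag = hmac.new(shared, gw_nonce + sm_nonce, hashlib.sha256).digest()

# Smart meter verifies the gateway, then both derive the same session key.
assert hmac.compare_digest(gw_tag, hmac.new(shared, gw_nonce + sm_nonce, hashlib.sha256).digest())
session_key_sm = kdf(shared, sm_nonce, gw_nonce)
session_key_gw = kdf(shared, sm_nonce, gw_nonce)
print("keys match:", session_key_sm == session_key_gw)
```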

Relevant
Less sample‐cooperative spectrum sensing against large‐scale Byzantine attack in cognitive wireless sensor networks

Abstract: Cooperative spectrum sensing (CSS) has emerged as a promising strategy for identifying available spectrum resources by leveraging spatially distributed sensors in cognitive wireless sensor networks (CWSNs). Nevertheless, this open collaborative approach is susceptible to security threats posed by malicious sensors, specifically Byzantine attacks, which can significantly undermine CSS accuracy. Moreover, in extensive CWSNs, the CSS process imposes substantial communication overhead on the reporting channel, thereby considerably diminishing cooperative efficiency. To tackle these challenges, this article introduces a refined CSS approach termed weighted sequential detection (WSD). The method incorporates channel state information to validate the global decision made by the fusion center and to assess the trust value of sensors. A trust-value-based weight is assigned to sensing samples, which are then integrated into a sequential detection framework within a defined time window, prioritizing samples in descending order of trust value. Numerical simulation results reveal that the proposed WSD outperforms conventional fusion rules in terms of error probability, sample size, achievable throughput, and latency, even under varying degrees of Byzantine attack. This innovation signifies a substantial advancement in enhancing the reliability and efficiency of CSS.
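
The trust-weighted sequential idea can be sketched as follows: samples are processed in descending trust order and a trust-scaled log-likelihood ratio is compared against Wald-style thresholds. All numerical values are assumptions, not the WSD settings from the article.

```python
# Toy sketch of trust-weighted sequential detection for cooperative spectrum
# sensing: each sensor's energy statistic is weighted by its trust value and
# accumulated in a sequential log-likelihood-ratio test. Thresholds, trust
# values, and the Gaussian model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_sensors = 10
trust = rng.uniform(0.3, 1.0, n_sensors)        # higher = more reliable sensor
primary_user_present = True                     # ground truth for the simulation

mu0, mu1, sigma = 0.0, 1.0, 1.0                 # H0 / H1 means of the test statistic
A, B = np.log(99), np.log(1 / 99)               # Wald-style upper / lower thresholds

llr, decision = 0.0, None
# Process sensors in descending trust order (higher-trust samples first).
for idx in np.argsort(-trust):
    sample = rng.normal(mu1 if primary_user_present else mu0, sigma)
    # Closed-form Gaussian log-likelihood ratio for this sample.
    step = (mu1 - mu0) / sigma**2 * (sample - (mu0 + mu1) / 2)
    llr += trust[idx] * step                    # trust-weighted contribution
    if llr >= A:
        decision = "H1 (channel occupied)"; break
    if llr <= B:
        decision = "H0 (channel idle)"; break

print(decision or f"no decision in the window; fallback threshold test: {llr > 0}")
```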

Relevant
An optimal attention PLSTM-based classification model to enhance the performance of IoMT attack detection in healthcare application

Abstract: The Internet of Medical Things (IoMT) has revolutionized the healthcare industry by allowing remote monitoring of patients suffering from chronic diseases. However, security concerns arise due to the potentially life-threatening damage that can be caused by attacks on IoMT devices. To enhance the security of IoMT devices, researchers propose novel artificial intelligence-based intrusion detection techniques. This article presents a hybrid AlexNet model and an orthogonal opposition-based learning Yin-Yang-pair optimization (OOYO) optimized attention-based peephole long short-term memory (PLSTM) model to distinguish between malicious and normal network traffic in the IoMT environment. To improve the scalability of the model in handling the random and dynamic behavior of malicious attacks, the hyperparameters of the PLSTM framework are optimized using the OOYO algorithm. The proposed model is evaluated on different IoT benchmark datasets such as N-BaIoT and IoT healthcare security. Experimental results demonstrate that the proposed model achieves classification accuracies of 99% and 98% on the healthcare security and N-BaIoT datasets, respectively. Moreover, the proposed model exhibits high generalization ability for multi-class classification and is effective in reducing the false discovery rate. Overall, the proposed model achieves high accuracy, scalability, and generalization in identifying malicious traffic, which can help improve the security solutions of IoMT devices.
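
The outer metaheuristic-tuning loop described in the abstract can be sketched generically as below. The OOYO update rules and the attention-based PLSTM are not reproduced; a scikit-learn MLP serves only as a hypothetical stand-in scoring function for the hyperparameter search.

```python
# Sketch of the outer hyperparameter-tuning loop only: a simple population
# search over model hyperparameters scored by validation accuracy. The OOYO
# optimizer and the attention-based PLSTM are not reproduced; an sklearn MLP
# is used as a stand-in scoring function.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=30, n_informative=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(4)

def score(hidden, lr):
    clf = MLPClassifier(hidden_layer_sizes=(int(hidden),), learning_rate_init=lr,
                        max_iter=300, random_state=0)
    return clf.fit(X_tr, y_tr).score(X_val, y_val)

# Population of candidate hyperparameters; keep the best after a few rounds.
pop = [(rng.integers(16, 128), 10 ** rng.uniform(-4, -2)) for _ in range(6)]
best = max(pop, key=lambda h: score(*h))
for _ in range(3):
    # Perturb the current best (a crude stand-in for OOYO's update rules).
    cand = (int(np.clip(best[0] + rng.integers(-16, 17), 8, 256)),
            float(np.clip(best[1] * 10 ** rng.uniform(-0.3, 0.3), 1e-5, 1e-1)))
    if score(*cand) > score(*best):
        best = cand
print("selected hidden units and learning rate:", best)
```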

Relevant