Adaptation in IoT-Fog Data Transmission: SLR and Future Perspectives on Dynamic Frequency Control
The advancement of Internet of Things (IoT) and fog computing technologies has created significant opportunities for more efficient, faster, and proximity-based data management. However, IoT-Fog systems face considerable challenges related to device heterogeneity, traffic dynamics, and the complexity of network topologies that continuously change. This study conducts a Systematic Literature Review (SLR) of various research works covering dynamic scheduling, routing, context-aware data flow, offloading, and IoT-Fog systems without adaptive mechanisms. The findings indicate that most existing approaches still rely on relatively static topology assumptions, rendering them insufficiently adaptive to real-time changes in network conditions. One area identified as a research gap is dynamic frequency control, an adaptive mechanism capable of dynamically adjusting data transmission intensity based on network conditions. The main conclusion of this study emphasizes the necessity for developing adaptive systems that are topology-agnostic and supported by dynamic frequency control to maintain optimal performance even under significant topology changes. Such systems are anticipated to become a crucial foundation for future IoT-Fog applications, including smart cities, Industry 4.0, and intelligent healthcare services, which demand high reliability and low latency.
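The dynamic frequency control mechanism identified above as a research gap can be sketched in a few lines. The following is a minimal, hypothetical illustration (the class name, thresholds, and AIMD-style policy are illustrative assumptions, not drawn from any reviewed work): a device shortens its transmission interval additively while observed latency stays within budget and backs off multiplicatively when the network degrades.

```python
class FrequencyController:
    """AIMD-style sketch of dynamic frequency control for IoT-Fog transmission.

    Shortens the send interval additively while the network is healthy and
    lengthens it multiplicatively when latency exceeds a budget, so the
    transmission intensity tracks current network conditions.
    """

    def __init__(self, interval_s=1.0, min_s=0.1, max_s=30.0,
                 latency_budget_ms=100.0, step_s=0.1, backoff=2.0):
        self.interval_s = interval_s
        self.min_s, self.max_s = min_s, max_s
        self.latency_budget_ms = latency_budget_ms
        self.step_s, self.backoff = step_s, backoff

    def update(self, observed_latency_ms):
        """Return the next transmission interval given the latest latency sample."""
        if observed_latency_ms > self.latency_budget_ms:
            # Congested: transmit less often (multiplicative increase of interval).
            self.interval_s = min(self.max_s, self.interval_s * self.backoff)
        else:
            # Healthy: transmit more often (additive decrease of interval).
            self.interval_s = max(self.min_s, self.interval_s - self.step_s)
        return self.interval_s
```

For example, a controller starting at a 1 s interval backs off to 2 s after one over-budget latency sample; the same contraction/expansion idea applies regardless of the underlying topology, which is what makes the mechanism topology-agnostic.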
- Research Article
- 10.1108/ijicc-06-2024-0253
- Nov 26, 2024
- International Journal of Intelligent Computing and Cybernetics
Purpose: The rapid proliferation of Internet of Things (IoT) devices across various domains has created a demand for real-time computing resources that traditional cloud computing models struggle to meet. Fog computing, which brings computation resources closer to IoT devices, has emerged as a promising solution. An automatic service placement framework is needed to use fog computing resources efficiently. Design/methodology/approach: In this study, first a three-layer independent service framework is introduced to define relationships between IoT devices and fog layers, facilitating automatic application deployment. Next, an enhanced version of the equilibrium optimizer (EO) algorithm, inspired by physics, is designed for service placement in fog computing environments. Findings: Simulations reveal that the proposed approach surpasses existing methods, achieving a 99% success rate compared to the closest alternative’s 93%. The algorithm also significantly reduces waiting and planning times for service placement, proving its efficiency and effectiveness in optimizing IoT service deployment in fog computing. Research limitations/implications: One of the primary limitations is the computational complexity involved in dynamically adjusting to real-time changes in network conditions and IoT workloads. Although the improved EO offers gains in placement efficiency, it may not be fully optimized for highly fluctuating environments. Another important limitation is uncertainty in node resources: fog computing environments often face unpredictable changes in the availability and capacity of resources across nodes, which can affect the algorithm’s ability to consistently make optimal decisions for IoT service placement. Practical implications: From a practical perspective, the implementation of the proposed framework and the improved EO algorithm can drastically enhance the efficiency of IoT service deployment in fog computing systems.
Organizations that rely on IoT networks, particularly those with critical real-time requirements, can benefit from reduced service placement times and lower failure rates. This can lead to better resource utilization, reduced operational costs and improved overall performance of IoT systems. The commercial impact is evident in industries such as smart cities and healthcare, where fast data processing is crucial. Social implications: Our proposed framework has important implications for real-world IoT applications, particularly in areas requiring low-latency processing, such as healthcare and smart cities. By reducing service delays and optimizing resource allocation, the framework can significantly improve the quality and reliability of services. Additionally, improved resource management leads to cost savings and better system efficiency, making the technology accessible to a wider range of applications. Originality/value: Existing resource placement strategies have shown inadequate performance, highlighting the need for more advanced algorithms. This study introduces a three-layer automatic framework for enhancing application deployment in a fog system, alongside a novel improved EO algorithm, to offer a robust solution for assigning IoT applications to fog nodes.
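The equilibrium optimizer family this abstract builds on can be illustrated with a heavily simplified sketch. Assumptions here: a one-member equilibrium pool (the incumbent best solution), a toy continuous cost standing in for the fog placement cost, and simplified constants; the paper's improved EO is not reproduced.

```python
import math
import random


def equilibrium_optimizer(cost, dim, pop=20, iters=100, lo=-5.0, hi=5.0, seed=1):
    """Highly simplified equilibrium-optimizer sketch.

    Candidates are pulled toward the best solution found so far with an
    exponentially decaying step, mimicking the mass-balance-inspired
    update of the EO family.
    """
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=cost)[:]
    for it in range(iters):
        t = (1 - it / iters) ** 2              # decreasing "time" term
        for x in X:
            for d in range(dim):
                lam = rng.random() + 1e-9      # random turnover rate
                r = rng.random()
                # Exponential term: magnitude shrinks as t -> 0, so the
                # population contracts around the incumbent best.
                F = math.copysign(1.0, r - 0.5) * (math.exp(-lam * t) - 1)
                x[d] = best[d] + (x[d] - best[d]) * F
                x[d] = min(hi, max(lo, x[d]))  # keep within bounds
        cand = min(X, key=cost)
        if cost(cand) < cost(best):
            best = cand[:]
    return best


# Toy stand-in for a placement cost: distance from an "ideal" assignment.
best = equilibrium_optimizer(lambda v: sum((vi - 1.0) ** 2 for vi in v), dim=3)
```

On this convex toy cost the contracting search settles near the optimum; a real placement problem would replace the cost with, e.g., expected waiting time over a discrete service-to-node assignment.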
- Research Article
- 10.30574/ijsra.2024.13.1.1669
- Sep 30, 2024
- International Journal of Science and Research Archive
Integrating the Internet of Things (IoT) with big data analytics has created transformative opportunities across various domains, including smart cities, healthcare, and industrial automation. However, the challenges of extracting value from the vast and heterogeneous IoT data sets are significant. This study aims to explore methods for maximizing value extraction from IoT-generated big data and evaluate their impact on decision-making processes. A systematic literature review was conducted, and an exploratory qualitative methodology was employed to assess existing frameworks and propose improvements. The results highlight the importance of edge computing, machine learning algorithms, and data processing architectures in managing IoT data effectively. Additionally, the study identifies gaps in current research and suggests future directions to enhance the practical application of IoT big data analytics.
- Research Article
- 10.1109/jiot.2022.3187621
- Nov 15, 2022
- IEEE Internet of Things Journal
Cloud computing and edge computing models are popularly applied in emerging applications, such as smart homes, smart parks, and connected autonomous vehicles, for large-scale live video analytics. Cloud computing-based models transfer all data to the cloud for video analytics, which burdens network bandwidth and increases the data transmission overhead. The edge computing model enables video data to be processed at the edge node, thereby reducing the bandwidth overhead. Existing edge computing-based models optimize performance, but they still have defects in three respects: 1) enabling end users to control video content in real time; 2) efficiently locating and transferring the user region of interest (ROI) video data in the video stream; and 3) adapting to various network conditions. To tackle these challenges, we propose an intelligent elastic edge framework for live video analytics, known as ElasticEdge. ElasticEdge enables interaction between the end user and the edge node. Elasticity is reflected in two respects: 1) the dynamic changes of user requirements and 2) the dynamic changes in network conditions. In addition, ElasticEdge transmits the video stream to end users based on the tradeoff between the amount of video data and the users’ ROI to meet various network conditions. To validate ElasticEdge, we conducted experiments to study its performance in comparison to RTFace. The experimental results show that ElasticEdge has a significant edge over RTFace in terms of data transmission: using 1/16 reserved images, ElasticEdge saves 75% bandwidth and reduces latency by approximately 10% compared with RTFace. We also find that ElasticEdge adapts to various network conditions when streaming video, i.e., it can reliably obtain essential information with low latency even when the network condition is poor.
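ElasticEdge's tradeoff between the amount of video data and the user's ROI suggests a simple adaptive transmission policy. The sketch below is a hypothetical illustration (payload sizes, the latency budget, and the three-tier decision are assumptions, not ElasticEdge's actual algorithm): send the full frame when bandwidth allows, fall back to the ROI crop, and degrade quality as a last resort.

```python
def choose_transfer(bandwidth_kbps, full_kb, roi_kb, latency_budget_ms):
    """Pick which video payload to send so the estimated transfer time
    stays within the latency budget (illustrative policy only)."""
    def ms(kb):
        # Estimated transfer time in milliseconds for a payload of `kb` kilobytes.
        return kb * 8 / bandwidth_kbps * 1000

    if ms(full_kb) <= latency_budget_ms:
        return "full-frame"          # network is good: send everything
    if ms(roi_kb) <= latency_budget_ms:
        return "roi-only"            # send only the user's region of interest
    return "roi-downscaled"          # degrade quality to preserve low latency
```

For instance, a 200 KB frame over an 8 Mbps link takes about 200 ms, so under a 100 ms budget only the 40 KB ROI crop (about 40 ms) would be sent.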
- Conference Article
- 10.5220/0009517401610170
- Jan 1, 2020
Recent trends have caused a shift from services deployed solely in monolithic data centers in the cloud to services deployed in the fog (e.g. roadside units for smart highways, support services for IoT devices). Simultaneously, the variety and number of IoT devices have grown rapidly, along with their reliance on cloud services. Additionally, many of these devices are now themselves capable of running containers, allowing them to execute some services previously deployed in the fog. The combination of IoT devices and fog computing has many advantages in terms of efficiency and user experience, but the scale, volatile topology and heterogeneous network conditions of the fog and the edge also present problems for service deployment scheduling. Cloud service scheduling often takes a wide array of parameters into account to calculate optimal solutions. However, the algorithms used are generally not capable of handling the scale and volatility of the fog. This paper presents a scheduling algorithm, named Swirly, for large-scale fog and edge networks, which is capable of adapting to changes in network conditions and connected devices. The algorithm details are presented and implemented as a service using the Kubernetes API. This implementation is validated and benchmarked, showing that a single-threaded Swirly service is easily capable of managing service meshes for at least 300,000 devices in soft real time.
- Research Article
- 10.54216/jcim.140219
- Jan 1, 2024
- Journal of Cybersecurity and Information Management
The rapid expansion of Internet of Things (IoT) devices has produced a more extensive attack surface for cyber intrusions, calling for more stringent security protocols. This research introduces a new method for protecting IoT networks against intrusion attacks by combining Game Theory with Ant Colony Optimization (ACO). A variety of cyber threats are becoming more common as a result of the networked nature and frequently inadequate security measures of IoT devices; because these threats are dynamic and intricate, traditional security measures cannot keep up. ACO, inspired by the foraging behavior of ants, provides an effective optimization method for resource allocation and pathfinding, while Game Theory provides a strategic framework for modeling the interactions between attackers and defenders. In the proposed system, attackers and defenders are modeled as players in a game in which each seeks to maximize its payoff; the defender's task is to minimize damage by anticipating and mitigating attacks. ACO optimizes the monitoring paths and allocates resources effectively. In response to changes in network conditions, the system dynamically modifies defensive tactics by updating the game model in real time. Simulation results show that the proposed method successfully increases IoT security: detection accuracy increased to 95.8%, compared with 87.4% for conventional approaches; the average reaction time to detected intrusions fell from 10.5 seconds to 7.3 seconds, a reduction of roughly 30%; and resource utilization efficiency improved by 20%, ensuring that defensive and monitoring resources were allocated optimally. Combining Game Theory with Ant Colony Optimization thus greatly improves IoT network security.
In addition to enhancing detection accuracy and reaction times, this combined method guarantees resource efficiency. The results demonstrate the practicality of the approach, which offers a solid foundation for protecting IoT devices from evolving cyber threats.
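The ACO component of such a system can be illustrated with a minimal pheromone-based path search, in which monitoring effort gradually concentrates on the most critical route. This sketch uses hop count as the cost and illustrative parameters (ant count, evaporation rate); it is not the paper's exact model.

```python
import random


def aco_paths(graph, src, dst, ants=30, rounds=20, rho=0.5, seed=7):
    """Minimal ant-colony sketch: ants search src->dst paths; shorter paths
    receive more pheromone, so later ants favor the most critical route."""
    rng = random.Random(seed)
    # One pheromone value per directed edge, initialized uniformly.
    tau = {(n, m): 1.0 for n in graph for m in graph[n]}
    best = None
    for _ in range(rounds):
        for _ in range(ants):
            node, path, seen = src, [src], {src}
            while node != dst:
                nxt = [m for m in graph[node] if m not in seen]
                if not nxt:            # dead end: abandon this ant
                    path = None
                    break
                weights = [tau[(node, m)] for m in nxt]
                node = rng.choices(nxt, weights=weights)[0]
                path.append(node)
                seen.add(node)
            if path and (best is None or len(path) < len(best)):
                best = path
        for e in tau:                  # pheromone evaporation
            tau[e] *= (1 - rho)
        if best:                       # deposit on the best path found so far
            for e in zip(best, best[1:]):
                tau[e] += 1.0 / len(best)
    return best
```

On a toy four-node network, the returned path is a shortest route from the source to the monitored target; in the intrusion-defense setting the edge cost would instead encode attack likelihood or monitoring value.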
- Research Article
- 10.1016/j.heliyon.2024.e30357
- Apr 25, 2024
- Heliyon
As the number of Internet users grows, the increase in smart devices interconnected through the Internet of Things (IoT) has contributed to improvements in the functionality of everyday products and the enhancement of user experience. Yet these devices affect user privacy and render personal data more vulnerable. To foster a digital future fully aware of user privacy requirements, a line of design research has emerged that focuses on balancing product innovation with user data protection. This matter relates to sociocultural, economic, and technological aspects, and its core is a human-centered design strategy. Still, there is a gap in academic research oriented towards guiding product developers on how to consider personal data privacy concerns when designing honest IoT devices. To define this gap and delve deeper into this relevant topic, this paper presents a systematic literature review of recent academic research using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) method. This review focuses on prevalent research topics such as data privacy, personal data, data surveillance, and user behaviour in IoT. The result is a state-of-the-art compilation of 45 scientific studies mapping the most relevant concepts and approaches for product development in the last ten years of research, aligned with some central research questions. The Discussion and Conclusion sections provide a deep understanding of the complexity of the fast-changing landscape of privacy and personal data management using IoT products. Finally, this study proposes future academic research directions devoted to providing product designers with specific, specialised help from different (yet interconnected) scientific approaches.
- Research Article
- 10.1145/3442412
- Apr 23, 2021
- ACM Transactions on Software Engineering and Methodology
Blockchain offers a distributed ledger to record data collected from Internet of Things (IoT) devices as immutable and tamper-proof transactions, securely shared among authorized participants in a Peer-to-Peer (P2P) network. Despite the growing interest in using blockchain to secure IoT systems, there is a general lack of systematic research and comprehensive review of the design issues involved in integrating blockchain and IoT from the software architecture perspective. This article presents a catalog of architectural tactics for the design of IoT systems supported by blockchain, the result of a Systematic Literature Review (SLR) on IoT and blockchain that extracts the commonly reported quality attributes, design decisions, and relevant architectural tactics for the architectural design of this category of systems. Our findings are threefold: (i) identification of security, scalability, performance, and interoperability as the commonly reported quality attributes; (ii) a catalog of twelve architectural tactics for the design of IoT systems supported by blockchain; and (iii) gaps in research that include tradeoffs among quality attributes and identified tactics. These tactics might provide architects and designers with different options when searching for an optimal architectural design that meets the quality attributes of interest and the constraints of a system.
- Research Article
- 10.1016/j.sna.2023.114462
- May 27, 2023
- Sensors and Actuators A: Physical
Dynamic resonance frequency control for a resonant-type smooth impact drive mechanism actuator
- Research Article
- 10.52783/pmj.v35.i1s.2119
- Nov 13, 2024
- Panamerican Mathematical Journal
The proliferation of the Internet of Things (IoT) and its integration with wireless sensor networks (WSNs) necessitates advanced optimization techniques to enhance performance and resource allocation while ensuring reliability and energy efficiency. This study introduces a mathematical modeling approach utilizing a Multi-Objective Adaptive Neuro-Fuzzy Inference System (MO-ANFIS) designed for optimizing IoT-based WSNs. The proposed model synergistically combines the adaptive learning capabilities of neural networks with the reasoning prowess of fuzzy inference systems, encapsulated within a multi-objective framework to concurrently address key operational objectives such as minimizing energy consumption, maximizing network lifetime, and enhancing data accuracy and throughput. The mathematical model is formulated to dynamically adapt to changing network conditions and sensor inputs, enabling real-time tuning of fuzzy rules and membership functions through backpropagation neural training. This adaptability ensures optimal performance despite the variable nature of IoT environments. Simulation results demonstrate that the MO-ANFIS model significantly outperforms traditional optimization methods, offering a robust, scalable solution for complex, dynamic WSNs in the IoT landscape. The findings suggest promising applications in various domains, including smart cities, environmental monitoring, and healthcare, where IoT integration is pivotal. This research not only bridges the gap between theoretical fuzzy-neural frameworks and practical IoT applications but also sets a foundation for future explorations into intelligent, adaptive network management systems.
- Research Article
- 10.1155/2019/3674274
- Nov 18, 2019
- Security and Communication Networks
Blockchain mining should not be a game among power oligarchs. In this paper, we present the Multiple Winners Proof of Work protocol (MWPoW), a mining-pool-like decentralised blockchain consensus protocol. MWPoW enables disadvantaged nodes, which contribute only a small amount of computation resource to the mining game, to create blocks together and compete with power oligarchs without centralised representatives. A precise support rate of blocks can be determined through the mining process; the mechanism of mainchain determination is therefore changed and becomes faster and more straightforward. A method that periodically adjusts the block size and the block interval is introduced into MWPoW, which increases the system's flexibility under changes in network conditions and data flow. Experiments suggest that, without raising computation and bandwidth requirements, MWPoW is more attractive to disadvantaged nodes because of its greatly increased reward expectation for them. Transaction pending time is substantially shortened, and either the block interval or the block size can be adapted amid changes in overall network conditions.
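The periodic block size and block interval adjustment that MWPoW introduces can be sketched as a simple feedback rule. All thresholds and factors below are illustrative assumptions, not MWPoW's actual parameters: grow blocks when they are consistently full, shrink them when mostly empty, and stretch the interval when block propagation lags behind a target.

```python
def adjust_block_params(block_size_kb, interval_s, pending_tx_kb, propagation_ms,
                        target_ms=2000.0, util_high=0.9, util_low=0.3):
    """Periodic block-size / block-interval adjustment sketch.

    Utilization (how full recent blocks are relative to pending demand)
    drives block size; measured propagation delay drives the interval.
    """
    utilization = min(1.0, pending_tx_kb / block_size_kb)
    if utilization > util_high:
        block_size_kb *= 2                       # demand outstrips capacity
    elif utilization < util_low:
        block_size_kb = max(64, block_size_kb // 2)  # blocks mostly empty
    if propagation_ms > target_ms:
        interval_s *= 1.5                        # give blocks more time to spread
    elif propagation_ms < target_ms / 2:
        interval_s = max(1.0, interval_s / 1.5)  # network is fast: speed up
    return block_size_kb, interval_s
```

With 2 MB of pending transactions against a 1 MB block and 3 s propagation, the rule doubles the block size and stretches the interval, mirroring the abstract's claim that either parameter can adapt to overall network conditions.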
- Research Article
- 10.1109/tmc.2013.2296040
- Oct 1, 2014
- IEEE Transactions on Mobile Computing
The largest strength of contention-based MAC protocols is simultaneously the largest weakness of their scheduled counterparts: the ability to adapt to changes in network conditions. For scheduling to be competitive in mobile wireless networks, continuous adaptation must be addressed. We propose ATLAS, an Adaptive Topology- and Load-Aware Scheduling protocol, to address this problem. In ATLAS, each node employs a random schedule achieving its persistence, the fraction of time a node is permitted to transmit, which is computed in a topology- and load-dependent manner. A distributed auction (REACT) piggybacks offers and claims onto existing network traffic to compute a lexicographic max-min channel allocation. A node's persistence p is related to its allocation, and its schedule achieving p is updated where and when needed, without waiting for a frame boundary. We study how ATLAS adapts to controlled changes in topology and load. Our results show that ATLAS adapts to most network changes in less than 0.1 s, with about 20% relative error, scaling with network size. We further study ATLAS in more dynamic networks, showing that it keeps up with changes in topology and load sufficiently for TCP to sustain multi-hop flows, a struggle in IEEE 802.11 networks. The stable performance of ATLAS supports the design of higher-layer services that inform, and are informed by, the underlying communication network.
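Two ingredients of ATLAS, a random schedule achieving a persistence p and a lexicographic max-min channel allocation, can be sketched as follows. This is a simplified illustration using water-filling over sorted demands; REACT's distributed auction and the topology-dependent computation are not reproduced.

```python
import random


def persistence_schedule(p, slots, seed=3):
    """Random schedule achieving persistence p: the node transmits in each
    slot independently with probability p, so its long-run share of
    airtime approaches p without any frame boundary."""
    rng = random.Random(seed)
    return [rng.random() < p for _ in range(slots)]


def max_min_share(demands, capacity=1.0):
    """Lexicographic max-min allocation of channel share among contending
    nodes: water-filling over demands sorted ascending, so no node can
    gain without hurting a node with a smaller allocation."""
    alloc, remaining = {}, capacity
    ordered = sorted(demands.items(), key=lambda kv: kv[1])
    for i, (node, want) in enumerate(ordered):
        fair = remaining / (len(ordered) - i)   # equal split of what is left
        give = min(want, fair)                  # never exceed the demand
        alloc[node] = give
        remaining -= give
    return alloc
```

For three nodes demanding 0.2, 0.5, and 0.9 of the channel, the max-min allocation is 0.2, 0.4, and 0.4: the small demand is fully served and the remainder is split evenly, which is the persistence each node's random schedule would then target.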
- Research Article
- 10.1108/k-09-2019-0621
- Jan 4, 2020
- Kybernetes
Purpose: Fog computing (FC) is a new field of research that has emerged as a complement to the cloud, mitigating problems inherent to the cloud computing (CC) and Internet of Things (IoT) model such as unreliable latency, bandwidth constraints, security and mobility. Because there is no comprehensive study of FC techniques in health management processing systems, this paper aims to survey and analyze the existing techniques systematically and to offer some suggestions for upcoming work. Design/methodology/approach: The paper complies with the methodological requirements of systematic literature reviews (SLR). It investigates the newest systems and studies their practical techniques in detail. The applications of FC in health management systems have been categorized into three major groups: review articles, data analysis, and frameworks and models. Findings: The results indicate that despite the popularity of FC for its real-time processing, low latency, dynamic configuration, scalability, low reaction time (less than a second), high bandwidth, battery life and network traffic, a few issues remain unanswered, such as security. The most recent research has focused on improvements in remote monitoring of patients, such as lower latency and rapid response. The results also show the application of qualitative methodology and case studies in the use of FC in health management systems. While FC studies are growing in the clinical field, CC studies are decreasing. Research limitations/implications: This study aims to be comprehensive, but there are some limitations. It has only surveyed articles mined according to a keyword exploration of FC health, FC health care, FC health big data and FC health management system; fog-based applications in the health management system may not be published under these keywords.
Moreover, publications written in non-English languages have been ignored, and some important research studies may be published in a language other than English. Practical implications: The results of this survey will be valuable for academicians and can provide visions into future research areas in this domain. The survey helps hospitals and related industries identify FC needs. Moreover, the disadvantages and advantages of the surveyed systems have been studied, and their key issues have been emphasized, to help develop more effective FC-based health management processing mechanisms over IoT in the future. Originality/value: Previous literature reviews in this field have used a simple literature review to find the tasks and challenges. In this study, for the first time, FC in health management processing systems is examined in a systematic review focused on the mediating role of the IoT, thereby providing a novel contribution. An SLR is conducted to find more specific answers to the proposed research questions. SLR helps reduce implicit researcher bias: through the adoption of broad search strategies, predefined search strings, and uniform inclusion and exclusion criteria, it effectively forces researchers to search for studies beyond their subject areas and networks.
- Research Article
- 10.1109/tvt.2021.3135885
- Feb 1, 2022
- IEEE Transactions on Vehicular Technology
The ongoing growth of Internet of Things (IoT) traffic drives the emergence of Multi-Access Edge Computing (MEC), which can serve IoT devices by processing edge tasks properly. The power supply for IoT devices is addressed by energy harvesting (EH) technology; however, the energy arrival model follows an unknown energy arrival process. In addition, due to dynamic changes in the network state, it is crucial to study access control in the MEC-based green IoT system. In this paper, we propose an MEC-based access control strategy to maximize system utility for the green IoT system, considering spectral efficiency and the successful access ratio from the perspectives of operators and users, respectively. Specifically, we model the access control problem as a Constrained Markov Decision Process (CMDP) and develop a novel centralized reward-based experience replay deep convolutional Q-network algorithm (RCQN) to attain optimal access control for EH IoT devices. Additionally, to enrich the prior knowledge of the access control strategy, we design a Long Short-Term Memory (LSTM) based energy prediction module that predicts the energy state of IoT devices by learning historical energy arrival information; its output is taken as the state information of RCQN. Simulation results demonstrate that the proposed LSTM-based RCQN access control algorithm significantly improves system utility with a proper reward design and training mechanism.
- Research Article
- 10.47992/ijmts.2581.6012.0129
- Feb 13, 2021
- International Journal of Management, Technology, and Social Sciences
India is a country that depends on agriculture, where about half the population relies heavily on agriculture for their livelihood. However, many of the practices undertaken in the agricultural process are neither profitable nor high-yielding. Agriculture should be upgraded with current technologies to boost seed quality, monitor soil fertility, water levels and environmental changes, predict market prices, and achieve fault sensitivity and contextual understanding in farming. Technological advancement and development are seen as significant aspects of financial development and agricultural production growth. The Internet of Things (IoT), Wireless Sensor Networks (WSN), and data analytics accomplish these upgrades. These technologies help provide solutions to agricultural issues such as resource optimization, agricultural land monitoring, decision-making support, and awareness of crop, land, weather, and market conditions for farmers. Smart agriculture is based on data from sensors, data stored on cloud platforms, and data from databases; all three need to be implemented together. Data are collected from different sensors and stored in a cloud-based back end, analysed using appropriate analytics techniques, and the relevant information is then delivered to a user interface that supports decision-making. IoT applications mainly use sensors to monitor conditions, collecting large volumes of data continuously, so sensors are the main contributors in IoT applications. Data analytics requires data storage, data aggregation, data processing and data extraction. To retrieve data and information from databases, data mining techniques must be used; they play a significant role in the decision-making process on several agricultural issues.
The eventual objective of data mining is to acquire information from data and transform it into a human-comprehensible format for further use. Big data in agriculture offers the prospect of increasing farmers' economic gains through a digital revolution, an aspect that we examine in detail. This paper reviews a selection of conference papers, journal articles, and books on smart agriculture. The types of data required for a smart farming system are analysed, and the architecture and schematic diagram of a proposed intelligent farming system are included. The paper also covers implementing the different components of the smart farming system and integrating IoT and data analytics within it. Based on the review, the research gap and research agendas for further work are identified.
- Research Article
- 10.1016/j.compbiomed.2025.110142
- Jun 1, 2025
- Computers in biology and medicine
The Internet of Medical Things (IoMT) is a network of interconnected medical devices and applications aiming to facilitate real-time data sharing and personalised patient care. IoMT devices collect vast amounts of data, which are then analysed using advanced computational methods. Real-time patient monitoring is crucial, particularly for people with chronic diseases and older adults. Moreover, traditional in-person monitoring by healthcare providers can be resource-intensive and time-consuming. By leveraging IoMT technology for remote patient monitoring (RPM), healthcare providers can improve service quality, reduce costs and enhance patient care. To evaluate the current state of knowledge and address research gaps in IoMT adoption for RPM, we conducted a thorough systematic literature review (SLR). This SLR aims to provide a comprehensive overview of existing research, identify knowledge gaps, and analyse the factors that influence IoMT adoption. It follows the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) protocol. PRISMA guidelines allow us to systematically evaluate and synthesise the current state of relevant literature. After analysing the theoretical models used in previous studies on IoMT adoption for RPM, UTAUT2 was identified as the most effective framework for technology adoption in this area. Additionally, this SLR has identified the key factors influencing the adoption of IoMT technology, including privacy, trust, security, and perceived risk, and suggested their inclusion in future studies by analysing and integrating the findings of other studies. As much of the current research focuses solely on patient viewpoints, our SLR points to the necessity of giving equal weight to the opinions of both patients and healthcare professionals. To create IoMT systems that are more effective and inclusive, these deficiencies must be filled. 
This study will benefit researchers, healthcare professionals, policymakers and technology developers by offering insights to inform decision-making, guide future research and aid the development of effective IoMT solutions for RPM.