Retracted on May 31, 2024: Managing Heterogeneous and Time-Sensitive IoT Applications through Collaborative and Energy-Aware Resource Allocation

Abstract

In the Internet of Things (IoT) environment, the computing resources available in the cloud are often unable to meet the latency constraints of time-critical applications because of the large distance between the cloud and the data sources (IoT devices). The adoption of edge computing can help the cloud deliver services that meet time-critical application requirements. However, it is challenging to satisfy IoT application demands while using resources smartly to reduce energy consumption at the edge of the network. In this context, we propose a fully distributed resource allocation algorithm for the IoT-edge-cloud environment, which (i) increases infrastructure resource usage by promoting collaboration between edge nodes, (ii) supports the heterogeneity and generic requirements of applications, and (iii) reduces application latency while increasing the energy efficiency of the edge. We compare our algorithm with a non-collaborative vertical offloading approach and with a horizontal approach based on edge collaboration. Simulation results show that the proposed algorithm reduces the end-to-end latency of IoT application requests by 49.95%, increases edge node utilization by 95.35%, and improves energy efficiency in terms of edge node power consumption by 92.63% compared with the best performances of the vertical and collaborative approaches.
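The trade-off the abstract describes, offloading vertically to the distant cloud versus horizontally to a collaborating edge node, can be illustrated with a toy latency model. All node parameters and the selection rule below are illustrative assumptions, not the paper's actual algorithm:

```python
# Toy latency model for the three offloading options discussed above:
# local edge, a collaborating neighbor edge, or the distant cloud.
# All numbers are illustrative assumptions, not values from the paper.

def end_to_end_latency(task_size_mb, cpu_cycles_g, node):
    """Propagation + transmission + processing delay for one request."""
    tx = task_size_mb * 8 / node["bandwidth_mbps"]   # seconds to upload
    proc = cpu_cycles_g / node["cpu_ghz"]            # seconds to compute
    return node["rtt_s"] + tx + proc

NODES = {
    "local_edge":    {"bandwidth_mbps": 100, "cpu_ghz": 2.0,  "rtt_s": 0.002},
    "neighbor_edge": {"bandwidth_mbps": 50,  "cpu_ghz": 2.0,  "rtt_s": 0.005},
    "cloud":         {"bandwidth_mbps": 20,  "cpu_ghz": 16.0, "rtt_s": 0.060},
}

def best_target(task_size_mb, cpu_cycles_g, busy):
    """Pick the lowest-latency node that still has capacity.

    `busy` maps node name -> True if it has no free capacity. This is
    where horizontal collaboration helps: a saturated local edge can
    hand the task to a neighbor instead of falling back to the cloud.
    """
    candidates = {n: p for n, p in NODES.items() if not busy.get(n, False)}
    return min(candidates, key=lambda n: end_to_end_latency(
        task_size_mb, cpu_cycles_g, candidates[n]))
```

For a small, latency-sensitive task the local edge wins; if it is saturated, a collaborating neighbor still beats the cloud because the cloud's round-trip time dominates.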

Similar Papers
  • Research Article
  • Citations: 15
  • 10.1155/2021/9942950
A Cross‐Domain Authentication Optimization Scheme between Heterogeneous IoT Applications
  • Jan 1, 2021
  • Wireless Communications and Mobile Computing
  • Shichang Xuan + 4 more

With the continuous enrichment of Internet of Things (IoT) applications, the demand for value exchange and collaborative control between heterogeneous IoT applications is increasing. However, the user management space varies across IoT applications, the security domain being one example. Crossing the boundary of a security domain to verify the identity and authority of users in other security domains is a key technology for data sharing between heterogeneous IoT organizations. Existing authentication protocols perform authority authentication slowly during cross-domain access and, by ignoring the actual cross-domain situation, use the same cryptographic system parameters for all communication nodes in a cross-domain environment. To address these problems, this article proposes an authentication scheme for data access authority between heterogeneous IoT applications. Based on certificateless public key cryptography and smart contract technology, a certificateless cross-domain authentication scheme that supports parameter differentiation is designed and implemented. Theoretical and empirical analyses comparing communication volume and the computational cost of identity signing and verification validate that the proposed method improves cross-domain identity authorization and authentication and supports the use of differentiated cryptographic system parameters among different IoT applications.

  • Research Article
  • Citations: 4
  • 10.1016/j.jksuci.2022.02.011
Hub-OS: An interoperable IoT computing platform for resources utilization with real-time support
  • Feb 25, 2022
  • Journal of King Saud University - Computer and Information Sciences
  • Desoky Abdelqawy + 3 more


  • Conference Article
  • 10.1145/3479243.3494705
Tutorial: Edge Computing for Mobile Internet of Things
  • Nov 22, 2021
  • Rodolfo W L Coutinho

The Internet of Things (IoT) has emerged as the enabling technology for smart applications in different domains, such as transportation, healthcare, industry, smart homes and buildings, and education (e.g., [1-5]). IoT applications rely on the deployment of resource-constrained devices that collect data from the environment in which they are immersed and control events of interest through actuators. One of the daunting challenges in many IoT applications is the need for real-time processing of the large amounts of data produced. Such processing is often impractical to perform on the IoT devices themselves, due to their resource-constrained nature and the incurred energy cost. IoT data is therefore often offloaded to be processed on distant, powerful cloud servers, which return the results of the heavy computations to the IoT devices. This approach is well suited for computation-intensive tasks in IoT applications. However, offloading tasks to cloud servers incurs additional delays for the IoT application, in addition to network overhead. Therefore, edge computing has been proposed to provide computation, communication, and storage resources closer to IoT devices. The general idea is to place resources in the proximity of the IoT devices that will demand them. Thus, the latency of the IoT application is reduced, since computation-intensive tasks are processed on edge devices rather than on distant cloud servers. One of the critical challenges in edge-aided IoT applications is that edge devices have limited resource capabilities compared to cloud servers. Edge devices' resources must therefore be managed and allocated efficiently, with the aim of providing resources to IoT applications with guaranteed quality of service (QoS). This tutorial will motivate and explore the challenges, design principles, and goals of edge computing for IoT applications.
It presents the building blocks for the design of optimization models for IoT task offloading to edge nodes. In doing so, it discusses the communication challenges between IoT and edge devices and highlights the different mathematical formulations commonly used in the literature to model IoT-to-edge communication. Furthermore, this tutorial discusses optimization-based and machine learning (ML)-based solutions for tackling the task offloading decision problem. It also presents recent advancements in resource management solutions aimed at efficient resource allocation on edge devices. Finally, the tutorial concludes with a discussion of research opportunities and challenges in the edge-assisted Internet of Things.
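One of the mathematical formulations the tutorial alludes to, a Shannon-capacity link model paired with a binary offloading decision rule, can be sketched as follows. Function names and parameter values are illustrative, not taken from the tutorial:

```python
import math

def shannon_rate_bps(bandwidth_hz, tx_power_w, channel_gain, noise_w):
    """Achievable uplink rate from the Shannon capacity formula,
    the link model most commonly used for IoT-to-edge communication."""
    return bandwidth_hz * math.log2(1 + tx_power_w * channel_gain / noise_w)

def offload_decision(cycles, data_bits, f_local_hz, f_edge_hz, rate_bps):
    """Classic binary offloading rule: offload iff transmission delay
    plus edge execution delay beats local execution delay."""
    t_local = cycles / f_local_hz
    t_offload = data_bits / rate_bps + cycles / f_edge_hz
    return "offload" if t_offload < t_local else "local"
```

With this rule, compute-heavy tasks with small inputs go to the edge, while tasks whose input data dwarfs their compute cost stay on the device; richer formulations add energy terms and queueing delay on the same skeleton.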

  • Conference Article
  • Citations: 5
  • 10.1109/icc.2017.7996719
Secure multi-party data communications in cloud augmented IoT environment
  • May 1, 2017
  • Xueqing Huang + 1 more

In concert with advances in wireless technologies facilitating Internet connectivity for Internet of Things (IoT) devices, mobile edge computing can provision and distribute computing resources at cloudlets to efficiently process a high volume of IoT data. Among IoT applications, multi-party data sharing among IoT devices, wireless access nodes, and cloudlets is becoming increasingly critical, not only because the data collected by each single IoT device would otherwise often stay unmined, but also because of security concerns. As IoT applications' dependence on the cloud environment grows, the rich resources at cloudlets often become attack targets, and the IoT data stored or processed using cloud resources are jeopardized. For the internet of important things, we investigate how to share data efficiently and securely among multiple parties. In particular, for a group of cooperative IoT devices, by leveraging the cloud resources available at the wireless access points, a secure cache site with a fast data uploading rate is chosen for each user. To minimize the overall data downloading time, a multi-party, multi-path data delivery scheme is also designed so that each user can efficiently retrieve the data belonging to other parties.

  • Book Chapter
  • 10.1201/9780429352898-6
A Comprehensive Overview of Blockchain-Driven IoT applications
  • Feb 17, 2021
  • Rajalakshmi Krishnamurthi + 1 more

In this technology era, the Internet of Things (IoT) plays a predominant role in several aspects of real-life problems. Thrust areas of IoT applications include the smart city, smart transport, smart healthcare, smart environment, smart agriculture, etc. An IoT system involves heterogeneous sensors and devices at the edge layer of the protocol stack. These IoT devices are characterized in terms of their hardware specifications, the different communication protocols they use, the services they offer, and the trusted users of the devices. Hence, IoT requires efficient security and privacy mechanisms for heterogeneous IoT devices as well as for the voluminous amount of data these devices generate. Conventionally, security for homogeneous devices is provided by centralized mechanisms through authentication, encryption/decryption of data, digital signatures, and cryptographic algorithms. However, IoT is characterized by heterogeneous, autonomous, and resource-constrained devices, for which conventional centralized security solutions fail tremendously. Hence, the decentralized approach of blockchain is well suited to this scenario. This chapter aims to provide a comprehensive review of various security, privacy, and access issues, solution approaches, and challenges in IoT. The role of blockchain in five different IoT application areas is discussed: intelligent transport systems, smart healthcare systems, supply chain management, the IoT ecosystem, and the smart city. The chapter discusses how blockchain provides security, smart contracts, and access control, and how proof-of-work and proof-of-stake are carried out in a decentralized manner. It presents use cases of blockchain beyond cryptocurrency and Bitcoin, how data are preserved through blockchain techniques, and the different mining techniques used in blockchain for these IoT applications.

  • Conference Article
  • Citations: 22
  • 10.1109/ccnc.2017.7983224
An efficient computation offloading architecture for the Internet of Things (IoT) devices
  • Jan 1, 2017
  • Raj Mani Shukla + 1 more

The proliferation of connected Internet of Things (IoT) devices and applications such as augmented reality has resulted in a paradigm shift in the computation requirements and power management of these devices. Furthermore, processing the enormous amounts of data generated by ubiquitous IoT devices and meeting the real-time deadline requirements of novel IoT applications exacerbate the challenges in IoT design. To address these challenges, in this paper, we propose a computation offloading architecture to process the huge amount of data generated by IoT devices while simultaneously meeting the real-time deadlines of IoT applications. In our proposed architecture, a resource-constrained IoT device requests a relatively resourceful computing device (e.g., a personal computer) in the same local network for computation offloading. Additionally, in our proposed computation offloading architecture, both client and server devices tune their tunable parameters, such as operating frequency and number of active cores, to meet the application's real-time deadline requirements. We compare our proposed computation offloading architecture with contemporary computation offloading models that use cloud computing. Experimental results verify that our proposed architecture provides a performance improvement of 21.4% on average compared to cloud-based computation offloading schemes.

  • Book Chapter
  • Citations: 3
  • 10.4018/979-8-3693-7322-4.ch009
Amalgamation of Optimization Algorithms With IoT Applications
  • Jun 30, 2024
  • Vandana Dubey + 4 more

The integration of optimization algorithms with Internet of Things (IoT) applications presents numerous benefits and diverse applications. Optimization algorithms help enhance the efficiency, scalability, and cost-effectiveness of IoT systems. This powerful combination offers advantages such as improved resource allocation, reduced energy consumption, enhanced decision-making, and better resource utilization. It finds applications in smart cities, agriculture, healthcare, manufacturing, and more, optimizing traffic management, precision agriculture, healthcare resource allocation, and supply chain management, among others. In short, the union of optimization algorithms with IoT unlocks a wide array of opportunities for optimizing processes, conserving resources, and improving the quality of services in various domains. Optimization algorithms are used to find the best solution to a given problem, and when applied to IoT they can help in various ways, including improving resource allocation, energy efficiency, and data analysis. Here the authors discuss some ways in which optimization algorithms can be combined with IoT applications, such as resource allocation, energy efficiency, data routing and processing, and quality of service (QoS) improvement. The choice of a specific optimization algorithm depends on the nature of the problem and the application. Algorithms such as genetic algorithms, particle swarm optimization, simulated annealing, and machine learning techniques (e.g., deep reinforcement learning) can be applied to various IoT optimization problems. Overall, combining optimization algorithms with IoT applications can lead to more efficient, cost-effective, and reliable IoT systems across a wide range of domains. It is essential to carefully assess the specific requirements of an IoT application and select the appropriate optimization techniques to achieve its goals.
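As a concrete instance of one metaheuristic the chapter names, here is a generic simulated annealing loop that can be pointed at a toy IoT allocation cost function. It is a textbook sketch under the usual geometric-cooling assumption, not tied to any system described in the chapter:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.95,
                        steps=500, seed=0):
    """Generic simulated annealing: accept any improving move, and
    accept a worsening move with probability exp(-delta / temperature),
    so early high-temperature steps can escape local minima."""
    rng = random.Random(seed)
    x, t = x0, t0
    best, best_cost = x0, cost(x0)
    for _ in range(steps):
        cand = neighbor(x, rng)
        delta = cost(cand) - cost(x)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if cost(x) < best_cost:
                best, best_cost = x, cost(x)
        t *= cooling        # geometric cooling schedule
    return best
```

For an IoT setting, `cost` could score an assignment of tasks to gateways by energy use and deadline misses, and `neighbor` could move one task to another gateway; the loop itself is unchanged.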

  • Research Article
  • Citations: 142
  • 10.1109/tcomm.2018.2870888
Resource Allocation for Ultra-Reliable and Enhanced Mobile Broadband IoT Applications in Fog Network
  • Jan 1, 2019
  • IEEE Transactions on Communications
  • Sarder Fakhrul Abedin + 5 more

In recent years, in order to provide better quality of service (QoS) to Internet of Things (IoT) devices, the cloud computing paradigm has shifted toward the edge. However, the resource capacity (e.g., bandwidth) in fog network technology is limited, and it is essential to efficiently bind IoT applications with stringent QoS requirements to the available network infrastructure. In this paper, we formulate a joint user association and resource allocation problem in the downlink of the fog network, considering the ever-growing QoS requirements imposed by ultra-reliable low-latency communications and enhanced mobile broadband services. First, we determine the priority of the different QoS requirements of heterogeneous IoT applications at the fog network by enforcing an analytical framework using the analytic hierarchy process (AHP). Using the AHP, we then formulate a two-sided matching game to establish stable associations between the fog network infrastructure (i.e., fog devices) and IoT devices. Subsequently, we consider the externalities in the matching game that occur due to job delay and solve the network resource allocation problem by applying a "best-fit" resource allocation strategy during matching. The simulation results illustrate the stability of the user association and the efficiency of the resource allocation, with higher utility gain.
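The "best-fit" strategy the abstract refers to can be sketched generically: place each request on the feasible node with the least remaining capacity, preserving larger residual capacity for later, more demanding requests. This is a minimal illustration of the general heuristic, not the paper's exact formulation, which couples best-fit with the matching game:

```python
def best_fit_allocate(requests, capacities):
    """Best-fit heuristic: each (request_id, demand) pair goes to the
    node whose remaining capacity is the smallest one that still fits.

    Returns {request_id: node or None}; None marks a request that no
    node can serve (e.g., to be rejected or sent upstream to the cloud).
    """
    remaining = dict(capacities)
    placement = {}
    for req_id, demand in requests:
        feasible = [(cap, node) for node, cap in remaining.items()
                    if cap >= demand]
        if not feasible:
            placement[req_id] = None
            continue
        _, node = min(feasible)        # tightest fit wins
        placement[req_id] = node
        remaining[node] -= demand
    return placement
```

For example, with nodes of capacity 10 and 5, a demand of 4 lands on the capacity-5 node, leaving the capacity-10 node free for a later demand of 7 that would not otherwise fit.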

  • Research Article
  • Citations: 17
  • 10.1109/mcomstd.001.1900051
On Extending ETSI MEC to Support LoRa for Efficient IoT Application Deployment at the Edge
  • Jun 1, 2020
  • IEEE Communications Standards Magazine
  • Adlen Ksentini + 1 more

The Internet of Things (IoT) has undergone a rapid transformation over the last decade, thanks to the appearance of low-power wide-area network technologies such as LoRa/LoRaWAN, SigFox, and narrowband IoT, which reduce the deployment cost of sensors and other IoT devices. Many emerging services such as the smart city, Industry 4.0, and autonomous driving rely on IoT devices and applications to collect and analyze data and to control end devices (i.e., actuators). Among these services, several IoT applications, such as data analytics, need to be deployed at the edge to either reduce the latency of accessing data or treat the large amount of generated data locally. However, in the context of LoRa/LoRaWAN, most current IoT service deployments run the applications in a central cloud to ease integration with existing software-as-a-service (SaaS) platforms, without exploiting the benefits of edge computing. In this article, we propose a new framework that leverages the ETSI multi-access edge computing (MEC) model to deploy LoRa-based IoT applications at the edge. In particular, the proposed model takes advantage of ETSI MEC features such as dynamic deployment of an IoT application at the edge and application life-cycle management. In addition, the proposed framework allows running an IoT application as a 5G network slice at the edge.

  • Research Article
  • Citations: 50
  • 10.1109/tnsm.2021.3123959
DeepEdge: A New QoE-Based Resource Allocation Framework Using Deep Reinforcement Learning for Future Heterogeneous Edge-IoT Applications
  • Dec 1, 2021
  • IEEE Transactions on Network and Service Management
  • Ismail Alqerm + 1 more

Edge computing is emerging to empower the future of Internet of Things (IoT) applications. However, due to the heterogeneity of applications, it is a significant challenge for the edge cloud to effectively allocate multidimensional limited resources (CPU, memory, storage, bandwidth, etc.) under the constraints of applications' Quality of Service (QoS) requirements. In this paper, we address the resource allocation problem in Edge-IoT systems by developing a novel framework named DeepEdge that allocates resources to heterogeneous IoT applications with the goal of maximizing users' Quality of Experience (QoE). To achieve this goal, we develop a novel QoE model that aligns the heterogeneous requirements of IoT applications with the available edge resources. The alignment is achieved by selecting the QoS requirement range that can be satisfied by the available resources. In addition, we propose a novel two-stage deep reinforcement learning (DRL) scheme that effectively allocates edge resources to serve the IoT applications and maximize users' QoE. Unlike typical DRL, our scheme exploits deep neural networks (DNNs) to improve action exploration by mapping the Edge-IoT state to a joint resource allocation action consisting of a resource allocation and a QoS class. The joint action not only maximizes users' QoE and satisfies heterogeneous applications' requirements but also aligns the QoS requirements with the available resources. In addition, we develop a Q-value approximation approach to tackle the large state space of Edge-IoT. Evaluation shows that DeepEdge brings considerable improvements in terms of QoE, latency, and application task success ratio in comparison to existing resource allocation schemes.

  • Book Chapter
  • Citations: 6
  • 10.1007/978-981-13-6351-1_23
An Insight into Time Synchronization Algorithms in IoT
  • Jan 1, 2019
  • Neha Dalwadi + 1 more

Nowadays, the Internet of Things (IoT) is a new standard in the area of smart technologies. IoT comprises various kinds of objects, machines, and humans that can interact with each other using various communication technologies such as Wi-Fi, RFID (Radio-Frequency Identification), Bluetooth, NFC (Near-Field Communication), etc. IoT devices need interoperability and communication with each other. In IoT, synchronization is a method of adjusting the internal clock of a device to the clocks of other devices in the network. Time synchronization is required among IoT devices and applications for better communication and resource availability, ordering of events, proper allocation of available resources, and mutual exclusion. In IoT, the sequence of exchanged signals must be maintained for the proper execution of various applications. The objective of this paper is to discuss and analyze time synchronization algorithms that may be applied to implement effective synchronization among nodes in IoT applications. A range of synchronization algorithms based on different approaches is available. The paper presents a comparative study of time synchronization algorithms for IoT.
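As a concrete example of the kind of algorithm such a survey compares, a Cristian-style estimate assumes the link delay is symmetric, so the server's timestamp corresponds to the midpoint of the client's round trip. This is a textbook sketch, not one of the paper's specific algorithms:

```python
def cristian_offset(t_request, t_server, t_response):
    """Estimate the offset of the local clock from the server clock.

    t_request / t_response are local timestamps taken when the request
    was sent and the reply received; t_server is the server's timestamp.
    Assuming symmetric delay, the server read its clock at the local
    midpoint of the round trip.
    """
    rtt = t_response - t_request
    return t_server - (t_request + rtt / 2)

def corrected_time(local_now, offset):
    """Apply the estimated offset to the local clock reading."""
    return local_now + offset
```

Half the round-trip time bounds the error of this estimate, which is why protocols in the NTP family repeat the exchange and prefer samples with the smallest RTT.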

  • Research Article
  • Citations: 18
  • 10.3390/s20092563
SimTalk: Simulation of IoT Applications
  • Apr 30, 2020
  • Sensors (Basel, Switzerland)
  • Yun-Wei Lin + 2 more

The correct implementation and behavior of Internet of Things (IoT) applications are seldom investigated in the literature. This paper shows how the simulation mechanism can be integrated well into an IoT application development platform for correct implementation and behavior investigation. We use an IoT application development platform called IoTtalk as an example to describe how the simulation mechanism called SimTalk can be built into this IoT platform. We first elaborate on how to implement the simulator for an input IoT device (a sensor). Then we describe how an output IoT device (an actuator) can be simulated by an animated simulator. We use a smart farm application to show how the simulated sensors are used for correct implementation. We use applications including interactive art (skeleton art and water dance) and the pendulum physics experiment as examples to illustrate how IoT application behavior investigation can be achieved in SimTalk. As the main outcome of this paper, the SimTalk simulation codes can be directly reused for real IoT applications. Furthermore, SimTalk is integrated well with an IoT application verification tool in order to formally verify the IoT application configuration. Such features have not been found in any IoT simulators in the world.

  • Research Article
  • Citations: 35
  • 10.1109/access.2020.3012458
Scalable Emulated Framework for IoT Devices in Smart Logistics Based Cyber-Physical Systems: Bonded Coverage and Connectivity Analysis
  • Jan 1, 2020
  • IEEE Access
  • Arbab Waseem Abbas + 1 more

In this research, a scalable framework for a Smart Logistics-based Cyber-Physical System (SLCPS) is emulated for stable coverage and connectivity of Internet of Things (IoT) devices. This work is a modern manifestation of three laws of computing: Moore's and Koomey's laws recommend performance gains and energy efficiency, whereas Metcalfe's law implies network scalability. The combination of these laws suggests the research proposition that the development of scalable, performance-efficient IoT networks is inevitable. Although IoT has improved specific logistics modules considerably, incorporating IoT into the complete food supply chain, and the unstable coverage and connectivity caused by random placement of IoT devices, remain major challenges in logistics. The proposed SLCPS framework is designed, first, to develop an apt IoT protocol stack for logistics. Second, for bonded connectivity and coverage, mathematical models are proposed instead of random placement, and the coverage map is based on a binary coverage model. Third, for scalability, the food supply chain for the smart logistics process is designed in terms of containers, storehouses, and warehouses comprising varying numbers of IoT devices. The architecture of the SLCPS framework has three modules, i.e., an internal IoT network, a border router, and an external network, emulated in the Cooja simulator. The ContikiMAC protocol is used for efficient traffic flow and power consumption. Single-hop, multi-hop, and random IoT device placement scenarios are used for result comparison and validation. The performance evaluation results, i.e., throughput, network convergence time, packet delivery ratio, average latency, power consumption, and timeline investigation, validate the utilization of the proposed framework in terms of enhanced network performance.
The significance of the proposed SLCPS framework lies in cost minimization, reduced communication and computation overhead, resilience to IoT device failures, and interference-free network connectivity and coverage. Coverage and connectivity are measures of quality of service in an IoT network. Therefore, this research provides bonded coverage and connectivity in smart logistics using mathematical models. In addition, a baseline framework is provided for extended research in CPS and IoT applications.

  • Research Article
  • Citations: 58
  • 10.1016/j.adhoc.2017.05.001
Energy efficient context aware traffic scheduling for IoT applications
  • May 5, 2017
  • Ad Hoc Networks
  • Bilal Afzal + 3 more


  • Conference Article
  • Citations: 6
  • 10.1109/vtc2021-spring51267.2021.9448772
The Implementation of a SIP-Based Service Platform for 5G IoT Applications
  • Apr 1, 2021
  • I-Fen Yang + 3 more

Internet of Things (IoT) technologies have been applied to realize various applications/services, ranging from massive, broadband, and critical IoT to industrial automation IoT applications. Recently, several IoT service platforms have been proposed to facilitate IoT application deployment, such as OpenMTC and IoTtalk. These platforms typically use the lightweight MQTT or CoAP application protocol to communicate with the devices they serve, both protocols being optimized for massive IoT applications within constrained networks. Unfortunately, these protocols are not applicable to more advanced IoT applications that demand high data rates and low latency. To support the full range of IoT application scenarios, the Session Initiation Protocol (SIP) can be a better candidate, which, besides instant messaging, can also handle long-session and publish-subscribe semantics. In the literature, different SIP semantics have been utilized to implement different IoT applications/systems. However, to the best of our knowledge, there exists no generic SIP-based IoT service platform that integrates the relevant capabilities for developers to flexibly develop and deploy heterogeneous IoT applications demanding different qualities of service. This paper implements a SIP-based IoT service platform, iSIPtalk, which enables rapid development of more advanced IoT applications. To justify the applicability of iSIPtalk, we deploy a real testbed and implement a vehicular service on iSIPtalk. Finally, via delay measurements using the real testbed and the vehicular service, we demonstrate the performance of iSIPtalk.
