Research on the Design of Power Supply Gateway and Wireless Power Transmission Based on Edge Computing
There is a rising need for sensors in IoT networks to sense and monitor the environment as more IoT devices and services become available. This trend brings challenges such as the proliferation of data and the scarcity of energy. This research presents a strategy for enhancing the service-provision capabilities of WSN-aided IoT applications by combining mobile edge computing with wireless power and signal transmission. To reduce overall system energy consumption while meeting data-rate and power requirements, a new optimization problem jointly involving power allocation, CPU frequency, offloading weight factor, and energy harvesting is formulated. Because the problem is non-convex, a novel iterative optimization scheme is developed that divides the original problem into multiple subproblems and optimizes each in turn. Numerical simulation results show that the proposed method consumes considerably less energy than the two benchmark methods.
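The abstract outlines a joint power/CPU-frequency/offloading/energy-harvesting problem solved by cycling through subproblems. A minimal sketch of such a block-coordinate loop follows; every model, constant, and bound in it is an illustrative assumption rather than the paper's actual formulation (rate and latency constraints, for instance, are omitted).

```python
# Illustrative block-coordinate (alternating) optimization sketch. The energy
# model is a placeholder: transmit energy for the offloaded fraction w of the
# bits plus CPU energy for the local fraction, minus harvested energy h.
import math

def system_energy(p, f, w, h, bits=1e6, kappa=1e-27, cycles_per_bit=1000,
                  bandwidth=1e6, gain=1e-3, noise=1e-9):
    rate = bandwidth * math.log2(1 + p * gain / noise)        # Shannon-style rate
    e_tx = p * (w * bits / rate)                              # offloaded part
    e_cpu = kappa * f ** 2 * (1 - w) * bits * cycles_per_bit  # local part
    return max(e_tx + e_cpu - h, 0.0)                         # net of harvesting

def coordinate_step(obj, lo, hi, n=200):
    """1-D grid search standing in for each subproblem's solver."""
    grid = [lo + (hi - lo) * i / n for i in range(n + 1)]
    return min(grid, key=obj)

p, f, w, h = 0.1, 5e8, 0.5, 0.0  # power, CPU frequency, offload fraction, harvest
for _ in range(20):              # optimize one block at a time, others fixed
    p = coordinate_step(lambda v: system_energy(v, f, w, h), 1e-3, 1.0)
    f = coordinate_step(lambda v: system_energy(p, v, w, h), 1e7, 1e9)
    w = coordinate_step(lambda v: system_energy(p, f, v, h), 0.0, 1.0)
print(system_energy(p, f, w, h))
```

Each pass holds two blocks fixed and re-optimizes the third, which is the usual way a non-convex joint problem is reduced to tractable subproblems.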
- Research Article
- 2
- 10.1016/j.comnet.2024.110846
- Oct 9, 2024
- Computer Networks
UAV-mounted IRS assisted wireless powered mobile edge computing systems: Joint beamforming design, resource allocation and position optimization
- Conference Article
- 9
- 10.1109/wf-iot.2018.8355212
- Feb 1, 2018
Internet of Things (IoT) devices are powered by independent power supplies such as batteries and energy harvesters, which provide limited energy. Batteries require charging and replacement due to their limited lifetime. Energy harvesters have a semi-permanent lifetime, but their supply is environmentally constrained and irregular, which can cause power failures. Reducing the energy consumption of IoT devices alleviates the problems caused by independent power supplies. In general, the memory of an IoT device is used for storing programs and data, executing tasks, and so on, and is accessed continually while the device operates. Thus, saving energy on memory accesses reduces the average energy consumption of the IoT device and helps mitigate the limitations of independent power supplies. In this paper, we analyze the energy-consumption pattern according to the memory mapping of tasks using a FRAM-based embedded device. We also analyze the energy consumption of the low-power mode according to the memory mapping. Considering the overhead of task-dependent data migration, we confirm that an average energy saving of 50% is possible when the proper memory mapping is selected.
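As a toy illustration of the trade-off the paper measures, the sketch below compares keeping a task's data in one memory against migrating it to a cheaper-per-access one; all energy numbers are invented placeholders, not measurements from the paper's FRAM-based device.

```python
# Back-of-the-envelope sketch of the memory-mapping trade-off. Per-access
# energies, migration cost, and access counts are assumed values only.
E_FRAM_ACCESS = 3.0    # nJ per access (assumed)
E_SRAM_ACCESS = 1.0    # nJ per access (assumed)
E_MIGRATION = 5000.0   # nJ to move a task's data between memories (assumed)

def task_energy(accesses, mapping, migrated):
    per_access = E_SRAM_ACCESS if mapping == "sram" else E_FRAM_ACCESS
    return accesses * per_access + (E_MIGRATION if migrated else 0.0)

# A frequently accessed task amortizes the migration cost; a cold one does not.
for accesses in (100_000, 500):
    stay = task_energy(accesses, "fram", migrated=False)
    move = task_energy(accesses, "sram", migrated=True)
    print(accesses, "accesses:", "migrate" if move < stay else "stay in FRAM")
```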
- Research Article
- 171
- 10.1109/mnet.011.2100097
- Jul 1, 2021
- IEEE Network
Satellite networks can provide Internet of Things (IoT) devices in remote areas with seamless coverage and downlink multicast transmissions. However, the large transmission latency, serious path loss, and the energy and resource constraints of IoT terminals challenge the stringent throughput and latency requirements of the 6G era. To address these problems, technologies including space-air-ground integrated networks (SAGINs), machine learning, edge computing, and energy harvesting are highly anticipated in 6G IoT. In this article, we consider unmanned aerial vehicles (UAVs) and satellites that offer wireless-powered IoT devices edge-computing and cloud-computing services, respectively. To accelerate the communications, Terahertz frequency bands are utilized between UAVs and IoT devices. Since the tasks generated by terrestrial IoT devices can be executed locally, offloaded to UAV-based edge servers, or offloaded to remote cloud servers through satellites, we focus on the computation-offloading problem and apply deep learning techniques to optimize the task success rate under the prevailing energy dynamics and channel conditions. A deep-learning-based offloading-policy optimization strategy is presented, in which a long short-term memory (LSTM) model addresses the dynamics of energy-harvesting performance. Through theoretical explanation and performance analysis, we demonstrate the importance of emerging technologies, including SAGIN, energy harvesting, and artificial intelligence, for 6G IoT.
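The article's LSTM component for energy-harvesting dynamics could look roughly like the sketch below, where the network predicts the next slot's harvested energy and a toy threshold rule turns the prediction into an offloading decision; the shapes, sizes, and decision rule are all assumptions.

```python
# Sketch of an LSTM modeling energy-harvesting dynamics, plus a toy threshold
# rule for the offloading decision. Training is omitted for brevity.
import torch
import torch.nn as nn

class HarvestLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)   # predicts next-slot harvested energy

    def forward(self, x):                  # x: (batch, time, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])       # use the last time step's state

model = HarvestLSTM()
history = torch.rand(1, 24, 1)             # past 24 slots of harvested energy
pred = model(history).item()               # untrained here; illustrative only

E_TX = 0.4                                 # assumed energy cost of offloading
print("offload" if pred >= E_TX else "compute locally")
```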
- Research Article
- 7
- 10.1016/j.future.2024.107527
- Sep 12, 2024
- Future Generation Computer Systems
UAV-IRS-assisted energy harvesting for edge computing based on deep reinforcement learning
- Conference Article
- 11
- 10.1109/wowmom49955.2020.00035
- Aug 1, 2020
Edge computing is a promising paradigm to expand the capability of Internet of Things (IoT) devices through computation offloading. To establish a distributed ledger providing a secure and trusted environment for resource allocation between edge servers and IoT devices, the emerging blockchain technology has recently attracted much attention. In practice, however, edge resource allocation for IoT devices often involves multi-layer structures, which poses a challenge due to information incompleteness among different layers. Moreover, designing a suitable and efficient blockchain framework for hierarchical resource-allocation markets is a critical issue. In this paper, we apply blockchain to propose a secure and efficient hierarchical resource-allocation framework for edge computing. First, we study the edge-computing resource-allocation problem in the hierarchical market of IoT devices, in which IoT devices beyond the coverage of access points can participate in the resource allocation through middlemen. To solve the problem, a smart-contract-based hierarchical auction mechanism is developed. Edge-computing resources allocated in the top market can be continually reallocated to the sub-markets based on this mechanism, leading to an efficient solution that maximizes the social welfare of all participants. Moreover, the mechanism is implemented as a smart contract on the blockchain, which enforces the rules of the hierarchical auction in a non-deniable and automated manner. Finally, extensive simulations demonstrate the correctness and performance of the proposed mechanism.
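A stripped-down sketch of a two-level auction in the spirit of the described mechanism is shown below: resources won in the top market are re-auctioned in sub-markets through middlemen. The greedy highest-bid-first allocation and the made-up bids are simplifications, not the paper's smart-contract mechanism.

```python
# Two-level (hierarchical) auction sketch: edge capacity is auctioned to
# middlemen, who reallocate their winnings to their own IoT devices.
def auction(capacity, bids):
    """Allocate units to the highest bids first; return winners and welfare."""
    winners, welfare = [], 0.0
    for bidder, units, price in sorted(bids, key=lambda b: -b[2]):
        take = min(units, capacity)
        if take > 0:
            winners.append((bidder, take, price))
            welfare += take * price
            capacity -= take
    return winners, welfare

# Top market: the edge server sells 10 units to middlemen.
top_winners, _ = auction(10, [("mid_A", 6, 3.0), ("mid_B", 8, 2.0)])

# Sub-markets: each middleman reallocates its allocation to its IoT devices.
sub_bids = {"mid_A": [("iot1", 4, 5.0), ("iot2", 4, 1.0)],
            "mid_B": [("iot3", 5, 4.0)]}
total_welfare = 0.0
for mid, units, _ in top_winners:
    winners, welfare = auction(units, sub_bids[mid])
    total_welfare += welfare
    print(mid, "->", winners)
print("social welfare:", total_welfare)
```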
- Research Article
- 5
- 10.1016/j.phycom.2022.101967
- Dec 12, 2022
- Physical Communication
Computation bits maximization in backscatter-aided wireless powered MEC using binary offloading
- Research Article
- 16
- 10.1177/15501477211035332
- Jul 1, 2021
- International Journal of Distributed Sensor Networks
Edge computing brings storage, computation, and communication services down from the cloud to the network edge, yielding low latency and high availability. Internet of Things (IoT) devices are resource-constrained and unable to process compute-intensive tasks. The convergence of edge computing and IoT through computation offloading offers a feasible solution in terms of performance. Moreover, computation offloading saves energy, reduces computation time, and extends the battery life of resource-constrained IoT devices. However, edge computing faces a scalability problem when large numbers of IoT devices send computation-offloading requests to the edge. This research article presents a three-tier energy-efficient framework to address this scalability issue. We introduce an energy-efficient recursive clustering technique at the IoT layer that prioritizes tasks by weight; the task with the highest weight is offloaded to the edge server for execution. A lightweight client–server architecture reduces the computation-offloading overhead. The proposed framework makes offloading decisions while considering energy and latency constraints, minimizing the energy consumption of IoT devices, decreasing computation time and overhead, and improving edge-server scalability. Numerical results show that the framework satisfies the quality-of-service requirements of both delay-sensitive and delay-tolerant applications by minimizing energy consumption and increasing device lifetime.
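The weight-based prioritization and offload decision the framework describes might look like the following sketch; the weight formula, energy constants, and decision rule are illustrative assumptions rather than the article's actual algorithm.

```python
# Weight-prioritized offloading sketch: tasks are ranked by an assumed
# weight (work per unit of deadline), then each is placed locally or on the
# edge according to assumed energy and latency models.
def weight(task):
    return task["cycles"] / task["deadline"]   # assumed weighting

def decide(tasks, e_local_per_cycle=1e-9, e_tx_per_bit=5e-8, f_local=1e8):
    choices = []
    for t in sorted(tasks, key=weight, reverse=True):   # highest weight first
        e_local = t["cycles"] * e_local_per_cycle       # local energy cost
        t_local = t["cycles"] / f_local                 # local latency
        e_off = t["bits"] * e_tx_per_bit                # transmit energy cost
        # Offload when it saves energy or the local deadline cannot be met.
        offload = e_off < e_local or t_local > t["deadline"]
        choices.append((t["id"], "edge" if offload else "local"))
    return choices

tasks = [{"id": 1, "cycles": 5e8, "bits": 2e5, "deadline": 1.0},
         {"id": 2, "cycles": 1e7, "bits": 1e6, "deadline": 5.0}]
print(decide(tasks))   # [(1, 'edge'), (2, 'local')]
```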
- Research Article
- 12
- 10.1109/tnse.2021.3086007
- Jul 1, 2021
- IEEE Transactions on Network Science and Engineering
For many Internet of Things (IoT) applications, the freshness of status information is of great importance, and age of information (AoI) is a newly proposed metric to quantify the freshness of system status. However, in many cases, the original raw data collected by IoT devices needs to be preprocessed in real time to extract the hidden effective information, which is usually computationally intensive and time-consuming. To this end, we promote an edge-computing-assisted approach and aim to reduce the AoI by flexibly offloading the raw IoT data to the edge server for information preprocessing. We consider that the IoT devices can opportunistically collect extra energy through energy harvesting for sustainable operation, and propose a novel timely system-status update model that consists of multiple IoT devices with energy harvesting and edge-assisted information preprocessing. The objective is to minimize the system-wide average AoI under a fixed energy-cost budget. To tackle the key challenges posed by the unpredictability of the stochastic energy-harvesting process and the long-term energy constraints, we propose a Lyapunov-based average AoI Minimization (LAoIM) algorithm to derive an approximately optimal solution, and further quantify its performance gap from the optimum. Extensive numerical evaluations demonstrate that LAoIM can take full advantage of local and edge computation resources and achieve superior performance over existing schemes.
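The abstract does not give LAoIM's formulas, but Lyapunov-based schemes of this kind typically take the drift-plus-penalty form sketched below, written here as an assumption about the algorithm's structure rather than its exact objective.

```latex
% Assumed drift-plus-penalty structure (not taken from the paper).
% Q(t): virtual queue tracking the energy-budget deficit; e(t): energy spent
% in slot t; \bar{e}: per-slot budget; A_i(t): AoI of device i; V > 0 trades
% budget compliance against AoI.
\begin{aligned}
  Q(t+1) &= \max\bigl\{Q(t) + e(t) - \bar{e},\, 0\bigr\},\\
  \text{per slot:}\quad &\min_{\text{offload/local decisions}} \;
    V \sum_{i=1}^{N} A_i(t+1) + Q(t)\, e(t).
\end{aligned}
```

Minimizing this per-slot expression lets the virtual queue Q(t) enforce the long-term energy budget without any knowledge of the future harvesting process.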
- Research Article
- 10.62441/nano-ntp.vi.3325
- Nov 5, 2024
- Nanotechnology Perceptions
In evolving technology, the combination of vehicular ad-hoc network (VANET) architecture with Internet of Things (IoT) devices in intelligent transportation systems (ITSs) has added a new dimension to reliable data communication. Securely adding IoT nodes while mitigating eavesdropping, denial-of-service (DoS), and malware attacks remains a concern in establishing reliable communication links. The VANET-IoT (V-IoT) system developed in this research supports scalable communication as the number of IoT devices grows, avoiding additional risks to device usage without compromising security. The main objective of the research is to ensure secure node authentication in V-IoT, facilitating the admission of new IoT devices to the VANET infrastructure. Conventional methods fail to maintain a trade-off between threat detection and authentication via edge networks to ensure secure and reliable communication. To mitigate these challenges, the present research develops a multi-layered V-IoT infrastructure that incorporates crucial components of reliability and security, including authentication, IoT devices, edge computing, and threat classification, using a combination of conditional decision trees (CDT) and hunting search (HS). The former acts as an authenticator and the latter as a threat classifier, identifying eavesdropping, DoS, and malware attempts in V-IoT systems. Further, edge computing is leveraged to integrate CDT-HS into the V-IoT system for IoT-device authentication and threat detection. The proposed method is evaluated for edge and V-IoT processing and scalability using metrics such as throughput, processing time, CPU and memory utilization, detection rate, latency, energy consumption, and communication overhead. The results show a significant improvement in ensuring reliable and secure communication in V-IoT systems.
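As a rough stand-in for the threat-classification stage, the sketch below trains an ordinary decision tree over invented traffic features; it is not the paper's conditional decision tree, and the hunting-search tuning is omitted.

```python
# Stand-in threat classifier: a plain decision tree over made-up features.
from sklearn.tree import DecisionTreeClassifier

# Assumed features: [packet_rate, avg_payload, failed_auths, dest_entropy]
X = [[900, 60, 25, 0.2],   # DoS-like flood
     [40, 300, 0, 0.9],    # eavesdropping-like probing
     [60, 500, 2, 0.5],    # malware-like payloads
     [50, 120, 0, 0.3]]    # benign traffic
y = ["dos", "eavesdrop", "malware", "benign"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[850, 70, 30, 0.25]]))   # expected: ['dos']
```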
- Research Article
- 18
- 10.1109/jsyst.2021.3049629
- Jan 27, 2021
- IEEE Systems Journal
Edge computing provides cloud services at the edge of the network for Internet of Things (IoT) devices. It aims to deliver low network latency and to alleviate the data-processing load on the cloud. This "cloud-edge-device" paradigm brings convenience as well as challenges for the location privacy of the IoT. In the edge computing environment, fixed edge equipment supplies computing services to adjacent IoT devices. Edge computing therefore suffers from location leakage, as connection and authentication records imply the location of IoT devices. This article focuses on location awareness in the edge computing environment. We adopt the "deniability" of authentication to prevent location leakage when IoT devices connect to edge nodes. In our solution, an efficient deniable authentication based on a two-user ring signature is constructed. The robustness of the authentication makes the fixed edge equipment accept legal end devices. Moreover, its deniability means that no third party can be convinced that the authentication occurred, since the communication transcript is no longer evidence of the connection. The protocol thus addresses the inherent location risk in edge computing. Compared with existing efficient deniable authentication schemes, our protocol saves 10.728% and 14.696% in computational cost, respectively.
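To illustrate the two-user ring-signature primitive the protocol builds on, here is a toy AOS-style construction over a tiny Schnorr group: a verifier can check that one of the two ring keys produced the signature but cannot tell which. The parameters are deliberately small and insecure, and this is an assumed textbook variant, not the paper's protocol.

```python
# Toy two-user AOS-style ring signature. Either key could have produced a
# verifying transcript, so the transcript is not evidence of who connected.
import hashlib
import secrets

P, Q, G = 2039, 1019, 4   # p = 2q + 1 (both prime); G generates the order-q subgroup

def H(*parts):
    data = "|".join(map(str, parts)).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % Q

def keygen():
    x = secrets.randbelow(Q - 1) + 1
    return x, pow(G, x, P)

def sign(msg, ring, s, x_s):
    """ring = [y0, y1]; the signer at index s knows x_s with y_s = G^x_s."""
    j = 1 - s
    u = secrets.randbelow(Q - 1) + 1
    c = [0, 0]
    c[j] = H(msg, pow(G, u, P))                    # challenge for the peer
    r_j = secrets.randbelow(Q - 1) + 1             # simulated response
    c[s] = H(msg, pow(G, r_j, P) * pow(ring[j], c[j], P) % P)
    r_s = (u - x_s * c[s]) % Q                     # close the ring
    r = [0, 0]
    r[s], r[j] = r_s, r_j
    return c[0], r

def verify(msg, ring, sig):
    c0, r = sig
    c = c0
    for i in (0, 1):                               # walk around the ring
        c = H(msg, pow(G, r[i], P) * pow(ring[i], c, P) % P)
    return c == c0

x0, y0 = keygen()
x1, y1 = keygen()
sig = sign("edge-auth", [y0, y1], s=1, x_s=x1)
print(verify("edge-auth", [y0, y1], sig))          # True
```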
- Research Article
- 38
- 10.1016/j.engappai.2020.103585
- Feb 28, 2020
- Engineering Applications of Artificial Intelligence
Artificial intelligence techniques empowered edge-cloud architecture for brain CT image analysis
- Research Article
- 4
- 10.1016/j.vehcom.2023.100656
- Aug 9, 2023
- Vehicular Communications
Computation bits enhancement for IRS-assisted multi-UAV wireless powered mobile edge computing systems
- Research Article
- 1
- 10.47495/okufbed.1037534
- Dec 12, 2022
- Osmaniye Korkut Ata Üniversitesi Fen Bilimleri Enstitüsü Dergisi
With the rapid increase in the number of connected Internet of Things (IoT) devices, huge amounts of data are generated and sent to cloud computing nodes to be stored and analysed. Cloud computing is an effective paradigm for storage and data analysis, since IoT devices are restricted machines in terms of energy, computational power, and storage. Despite its advantages, cloud computing causes network congestion and latency because data centers are generally located far away; security and privacy issues are further drawbacks. Edge computing is a promising approach that mitigates the flaws of cloud computing by bringing computational power closer to data sources. Edge nodes have more computational power than IoT devices but less than the cloud. Although edge computing reduces the deficiencies of cloud computing, it does not eliminate them completely, because computation-intensive tasks must still be sent from the edge to cloud resources. Since an autoencoder is an unsupervised neural network that learns to efficiently encode/compress input data and to decode it back close to the original input, it is an ideal candidate for reducing data traffic and latency between edge and cloud. Instead of sending all data to the cloud, only the bottleneck hidden-layer encoding of the input data is sent from the edge; the compressed data is then decoded on the cloud to reconstruct the original input for analysis and learning. In this paper, we survey studies that use autoencoders (AEs) in edge computing and examine their performance implications for network traffic and delay. The results of proposals that place an autoencoder between the edge and cloud layers are evaluated in terms of data reduction, network traffic, and accuracy.
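A minimal sketch of the edge-to-cloud split the survey discusses: the edge runs the encoder and transmits only the bottleneck code, and the cloud decodes. The layer sizes, data, and training loop below are illustrative assumptions.

```python
# Autoencoder split across edge and cloud: the edge encodes a 784-dimensional
# input into a 16-dimensional bottleneck and transmits only that code.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 16))
decoder = nn.Sequential(nn.Linear(16, 128), nn.ReLU(), nn.Linear(128, 784))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()),
                       lr=1e-3)

x = torch.rand(64, 784)                    # stand-in for sensor or image data
for _ in range(200):                       # joint offline training
    opt.zero_grad()
    loss = nn.functional.mse_loss(decoder(encoder(x)), x)
    loss.backward()
    opt.step()

with torch.no_grad():
    code = encoder(x)                      # edge sends 16 floats, not 784
    x_hat = decoder(code)                  # cloud reconstructs for analysis
    mse = nn.functional.mse_loss(x_hat, x).item()
print(f"compression {784 // 16}x, reconstruction mse {mse:.4f}")
```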
- Research Article
- 13
- 10.1109/jiot.2023.3241577
- Jun 15, 2023
- IEEE Internet of Things Journal
In this study, we first present a framework that jointly optimizes energy harvesting and information decoding for Internet of Things (IoT) devices capable of simultaneous wireless information and power reception in a smart city. In particular, a generalized power-splitting receiver for IoT devices is designed in which each antenna has an independent power splitter, unlike existing works that employ only one power splitter regardless of the number of antennas. Such a receiver design provides a great degree of freedom to improve network performance. Based on the presented framework, we formulate, for each IoT device, an optimization problem whose objective is to maximize the harvested energy while satisfying the device's data-rate requirement. To solve this problem, we propose a double deep deterministic policy gradient based online learning algorithm that enables each IoT device to jointly determine its receive-beamforming and power-splitting-ratio vectors in real time. Further, each IoT device can run the proposed algorithm in a distributed manner using only its local channel state information, so cooperation and information exchange among base stations and IoT devices are unnecessary. Extensive simulation results confirm the validity of the proposed algorithm.
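The per-antenna power-splitting idea can be illustrated numerically as below, where each antenna's ratio rho[i] divides received power between harvesting and decoding; the channel model, the linear harvesting model, and the MRC combiner are simplified assumptions, not the paper's system model.

```python
# Numerical sketch of a per-antenna power-splitting receiver: antenna i sends
# a fraction rho[i] of its received power to energy harvesting and the rest
# to the information-decoding branch.
import numpy as np

rng = np.random.default_rng(0)
N = 4                                    # receive antennas
h = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
p_tx, noise, eta = 1.0, 1e-3, 0.6        # tx power, noise power, EH efficiency

def evaluate(rho, w):
    """rho: per-antenna split to harvesting; w: unit-norm receive beamformer."""
    p_rx = p_tx * np.abs(h) ** 2                     # per-antenna rx power
    harvested = eta * np.sum(rho * p_rx)             # linear harvesting model
    sig = p_tx * np.abs(np.vdot(w, np.sqrt(1 - rho) * h)) ** 2
    rate = np.log2(1 + sig / noise)                  # decoding-branch rate
    return harvested, rate

rho = np.full(N, 0.5)                    # an independent splitter per antenna
w = h / np.linalg.norm(h)                # MRC beamformer (assumed)
print(evaluate(rho, w))
```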
- Conference Article
- 41
- 10.1109/vlsid.2018.110
- Jan 1, 2018
This paper presents a proof of concept for a self-powered Internet of Things (IoT) device that is maintenance-free and completely self-sustaining through energy harvesting. Such IoT devices can be deployed at large scale and placed anywhere, as long as they are in range of a gateway and there is sufficient light for the solar panel, such as from indoor lighting. A complete IoT device is designed, prototyped, and tested. The device can potentially last for more than 5 months (at a transmission interval of 30 seconds) on its coin-cell battery (120 mAh capacity) without any energy harvesting, which is sufficiently long for the dark seasons of the year. The sensor node contains ultra-low-power sensors for temperature, humidity, and light level, with the possibility of adding several more sensors.
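A quick sanity check of the claimed battery life: 120 mAh spread over roughly five months corresponds to an average draw of about 33 µA. The duty-cycle numbers below are assumptions used only to show that such a budget is plausible for a node transmitting every 30 seconds; only the 120 mAh and 5-month figures come from the abstract.

```python
# Sanity-check arithmetic for the battery-life claim. The transmit burst
# (10 ms at 10 mA) and sleep current (2 uA) are assumed, not measured values.
capacity_mah = 120.0
hours = 5 * 30 * 24                          # ~5 months ~= 3600 hours
print(capacity_mah / hours * 1000, "uA average budget")   # ~33.3 uA

interval_s, burst_s = 30.0, 0.010
i_avg_ma = (burst_s * 10.0 + (interval_s - burst_s) * 0.002) / interval_s
print(i_avg_ma * 1000, "uA estimated draw")               # ~5.3 uA
print(capacity_mah / i_avg_ma / 24, "days at this draw")
```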