Edge computing and sustainable, low-power AI systems
The increasing use of artificial intelligence (AI) in smart environments, industrial automation, healthcare monitoring, and Internet of Things (IoT) networks has driven up the computational requirements and power consumption of cloud computing infrastructure. Cloud-based AI processing faces challenges such as high latency, high bandwidth consumption, privacy concerns, and the environmental emissions of large numbers of data centers. This work introduces a sustainable, low-power AI model based on a smart edge architecture. It deploys energy-efficient machine learning algorithms at the edge for intelligent decision-making. The model achieves strong performance through lightweight model deployment, dynamic resource allocation, and hardware-aware optimizations such as model quantization and pruning. A modular architecture is proposed that increases efficiency across data acquisition, edge processing, AI inference, energy management, and cloud synchronization. Performance metrics include latency, energy consumption, compute efficiency, and inference quality; the latter is particularly important for providing efficient AI capabilities on resource-constrained devices. Experimental results show that local processing reduces network and energy use compared to cloud processing. The proposed framework enables scalable, efficient AI deployment while minimizing environmental impact and maintaining performance to support sustainable computing.
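The hardware-aware optimizations this abstract names, quantization and pruning, can be sketched in a few lines. This is an illustrative sketch only, not the paper's implementation; the symmetric int8 scheme and magnitude-pruning rule are assumptions:

```python
# Sketch: symmetric post-training int8 quantization plus magnitude pruning
# of a weight matrix -- illustrative, not the paper's actual method.
import numpy as np

def quantize_int8(w):
    """Map float weights to int8 with a single per-tensor scale."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def prune_by_magnitude(w, sparsity=0.5):
    """Zero out the smallest-magnitude weights up to the target sparsity."""
    threshold = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) < threshold, 0.0, w)

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = q.astype(np.float32) * scale           # dequantized approximation
w_sparse = prune_by_magnitude(w, sparsity=0.5)
print("max abs quantization error:", np.abs(w - w_hat).max())
print("achieved sparsity:", (w_sparse == 0).mean())
```

The quantization error is bounded by half the scale step, which is why int8 weights often preserve accuracy while cutting memory and energy per inference.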
- Single Book
- 10.62311/nesx/rb978-81-981466-8-7
- Nov 30, 2024
Abstract: This book presents a comprehensive exploration of the convergence between Artificial Intelligence (AI) and Edge Computing as a transformative framework for optimizing Internet of Things (IoT) systems. It addresses the core challenge of deploying intelligent decision-making capabilities in latency-sensitive, bandwidth-constrained, and privacy-critical environments where traditional cloud-centric architectures prove insufficient. Through the integration of lightweight AI models, distributed inference, and federated learning, the book develops a conceptual and technical foundation for enabling scalable, real-time analytics at the edge. The methodology combines theoretical modeling with empirical case studies and performance benchmarking across diverse IoT applications, including smart cities, industrial automation, autonomous systems, and healthcare monitoring. Key AI techniques—such as reinforcement learning, spatiotemporal modeling, and knowledge distillation—are evaluated in the context of resource-constrained edge hardware. The book further explores secure and ethical deployment through privacy-preserving learning, encrypted model updates, and explainable AI frameworks aligned with international regulatory standards. Findings highlight significant improvements in system responsiveness, energy efficiency, and data sovereignty when AI is embedded at the edge. The book concludes with an assessment of emerging innovations—neuromorphic computing, quantum edge AI, and 6G-enabled edge-satellite intelligence—and proposes a cross-sectoral roadmap for future research, policy, and system design. This volume offers foundational insights for scholars, engineers, and policymakers navigating the evolution of intelligent, decentralized IoT infrastructures. 
Keywords Artificial Intelligence, Edge Computing, Internet of Things, IoT Optimization, Federated Learning, Real-Time Analytics, Edge AI, Reinforcement Learning, Spatiotemporal Modeling, Lightweight Neural Networks, Distributed Inference, Privacy-Preserving AI, Explainable AI, Smart Systems, Low-Latency Processing, Secure Edge Intelligence, 6G Networks, Neuromorphic Computing, Quantum AI, Autonomous IoT Systems
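Federated learning, one of the book's core techniques, can be illustrated with a minimal FedAvg-style round: clients train locally and only model parameters are aggregated, so raw data never leaves a device. The toy linear model, learning rate, and client setup below are assumptions for illustration:

```python
# Sketch of federated averaging (FedAvg-style); the linear least-squares
# model and hyperparameters are illustrative assumptions.
import numpy as np

def local_update(w, X, y, lr=0.1, steps=20):
    """A few local gradient steps on a linear least-squares objective."""
    w = w.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fed_avg(w_global, client_data):
    """Average client models, weighted by client dataset size."""
    local_models = [local_update(w_global, X, y) for X, y in client_data]
    n = np.array([len(y) for _, y in client_data], dtype=float)
    n /= n.sum()
    return sum(nk * wk for nk, wk in zip(n, local_models))

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.standard_normal((50, 2))
    y = X @ true_w + 0.01 * rng.standard_normal(50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(10):
    w = fed_avg(w, clients)
print(w)  # approaches [2, -1] without any client sharing raw data
```

Only the averaged parameter vector crosses the network each round, which is the data-sovereignty property the abstract highlights.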
- Research Article
11
- 10.1016/j.comnet.2022.109262
- Oct 1, 2022
- Computer Networks
Bandwidth-efficient multi-task AI inference with dynamic task importance for the Internet of Things in edge computing
- Research Article
1
- 10.36347/sjet.2025.v13i03.003
- Mar 25, 2025
- Scholars Journal of Engineering and Technology
Although deep learning has advanced the field of artificial intelligence (AI), traditional digital computing architectures remain limited by the von Neumann bottleneck. The continuous fetching and loading of information results in high latency and excessive energy consumption, making AI optimization difficult. To address these issues, this paper proposes a hybrid analog-digital neural network processor that incorporates analog in-memory computing (AIMC) alongside digital computation for efficient AI model training and inference. Resistive random-access memory (RRAM) and electrochemical random-access memory (ECRAM) are harnessed for training, since both serve as electrically programmable non-volatile memory, enabling data to be stored and processed without constant transfers, thus increasing speed and reducing power use. For AI inference, phase-change memory (PCM) performs the computations using analog synaptic cells, which provides increased energy and processing efficiency. By integrating the parallel processing capabilities of analog memory with the precise reading and writing of the digital processor, the architecture achieves greater computational efficiency, lower energy consumption, and higher processing speed, improving AI inference latency. Speed benchmarks indicate that the new architecture outperforms standard digital processors, making AI workloads substantially more scalable and efficient. This research outlines the prospects of hybrid analog-digital processors for powering next-generation AI systems.
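The key idea of analog in-memory computing is that a crossbar of programmable conductances performs a matrix-vector product in place, at the cost of device noise. A minimal simulation sketch (the noise model and sizes are assumptions, not the paper's device characterization):

```python
# Sketch: a resistive crossbar computes output currents = G @ v in one
# analog step; conductance read noise is modeled as additive Gaussian.
import numpy as np

def crossbar_matvec(G, v, noise_std=0.01, rng=None):
    """Analog matrix-vector multiply with per-read device noise."""
    if rng is None:
        rng = np.random.default_rng(0)
    G_noisy = G + rng.normal(0.0, noise_std, size=G.shape)
    return G_noisy @ v

rng = np.random.default_rng(42)
G = rng.uniform(0, 1, size=(8, 8))   # programmed conductances (weights)
v = rng.uniform(0, 1, size=8)        # applied voltages (input activations)

exact = G @ v
analog = crossbar_matvec(G, v, noise_std=0.01, rng=rng)
rel_err = np.linalg.norm(analog - exact) / np.linalg.norm(exact)
print("relative error:", rel_err)
```

Because the multiply-accumulate happens where the weights are stored, no weight traffic crosses the memory bus, which is exactly the von Neumann cost the abstract targets.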
- Research Article
2
- 10.3390/app16010225
- Dec 25, 2025
- Applied Sciences
This review examines the impact of data analytics powered by the Internet of Things (IoT), edge computing, and artificial intelligence (AI) on improving energy efficiency in smart environments, with a focus on smart factories, smart cities, and smart territories. Advanced AI, machine learning (ML), and deep learning (DL) techniques enable real-time energy optimization and intelligent decision-making in complex, data-intensive systems. Integrating edge computing reduces latency and improves responsiveness in IoT and Industrial Internet of Things (IIoT) networks, enabling local energy management and reducing grid load. Federated learning further enhances data privacy and efficiency by enabling decentralized model training across distributed smart nodes without exposing sensitive information or personal data. Emerging 5G and 6G technologies provide the necessary bandwidth and speed for seamless data exchange and control across energy-intensive, connected infrastructures. Blockchain increases transparency, security, and trust in energy transactions and decentralized energy trading in smart grids. Together, these technologies support dynamic demand response mechanisms, predictive maintenance, and self-regulating systems, leading to significant improvements in energy sustainability. Case studies of smart cities and industrial ecosystems within Industry 4.0/5.0/6.0 demonstrate measurable reductions in energy consumption and carbon emissions through these synergistic approaches. Despite significant progress, challenges remain in interoperability, scalability, and regulatory frameworks. This review demonstrates that AI-based edge computing, supported by robust connectivity and secure IoT and IIoT architectures, has a transformative potential for creating energy-efficient and sustainable smart environments.
- Research Article
- 10.71097/ijsat.v16.i1.6696
- Jan 8, 2025
- International Journal on Science and Technology
The rapid growth of artificial intelligence (AI) has transformed the landscape of computation, particularly in sectors requiring real-time processing and intelligent decision-making at the edge of networks. Edge computing has emerged as a compelling alternative to traditional cloud-based systems by enabling low-latency, localized data processing. However, the inherent resource limitations of edge devices, including constrained memory, computational power, and energy availability, pose substantial challenges for executing AI inference workloads. To overcome these barriers, researchers have increasingly turned to customized digital hardware solutions such as application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and neural processing units (NPUs). These hardware accelerators are specifically tailored to meet the unique demands of AI inference tasks, offering significant improvements in energy efficiency, throughput, and latency. Customized digital hardware is designed to optimize specific computational patterns found in AI workloads, such as matrix multiplications and activation functions in neural networks. By streamlining operations and eliminating unnecessary general-purpose processing overhead, these platforms can deliver orders of magnitude improvements in performance per watt compared to traditional CPUs or GPUs. ASICs, for instance, provide unparalleled energy efficiency and throughput when optimized for fixed-function inference tasks. FPGAs offer reconfigurability, enabling designers to tailor the data flow and logic structure for diverse AI models, which is particularly beneficial in applications requiring flexibility and model updates. NPUs, purpose-built for deep learning, integrate dedicated tensor processing elements that accelerate the execution of convolutional and fully connected layers in neural networks. 
This paper presents a comprehensive investigation into the deployment of customized digital hardware for accelerating AI inference on edge devices. It evaluates the performance trade-offs among ASICs, FPGAs, and NPUs through benchmarking experiments involving representative edge AI workloads. The methodology includes selection of real-world AI models, such as MobileNet and Tiny-YOLO, and deployment across commercially available edge hardware platforms. Key performance metrics, including inference latency, energy consumption, throughput, and model accuracy, are analyzed to assess the effectiveness of each hardware category. The results demonstrate that customized hardware not only improves inference speed and energy efficiency but also significantly enhances the feasibility of deploying sophisticated AI models on low-power, real-time edge devices. While ASICs lead in performance and power efficiency, FPGAs offer crucial adaptability for evolving workloads, and NPUs strike a balance between specialization and integration in modern system-on-chip architectures. The discussion also addresses practical considerations such as design complexity, cost, and integration challenges. Through this comparative study, the paper aims to guide hardware designers, AI practitioners, and system architects in selecting and optimizing digital hardware for edge AI inference. Ultimately, the research highlights that the co-design of AI algorithms and hardware architectures is essential for meeting the growing demand for intelligent, decentralized systems.
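The derived metrics used in comparisons like this one follow directly from raw latency and power measurements. A sketch with placeholder numbers (the figures below are illustrative assumptions, not the paper's benchmark results):

```python
# Sketch: throughput, energy per inference, and performance per watt
# from raw latency/power measurements. Numbers are placeholders.

def derived_metrics(latency_ms, power_w):
    """Return (inferences/s, energy per inference in mJ, inferences/J)."""
    throughput = 1000.0 / latency_ms       # inferences per second
    energy_mj = power_w * latency_ms       # E = P * t, in millijoules
    perf_per_watt = throughput / power_w   # inferences per joule
    return throughput, energy_mj, perf_per_watt

# Illustrative measurements for the three accelerator classes compared.
for name, lat_ms, pwr_w in [("ASIC", 2.0, 1.5), ("FPGA", 5.0, 4.0), ("NPU", 3.0, 2.5)]:
    tput, e, ppw = derived_metrics(lat_ms, pwr_w)
    print(f"{name}: {tput:.0f} inf/s, {e:.1f} mJ/inf, {ppw:.1f} inf/J")
```

Performance per watt, rather than raw throughput, is the figure of merit that separates edge accelerators from server GPUs in such studies.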
- Book Chapter
7
- 10.1007/978-3-031-15160-6_3
- Aug 9, 2022
In the next few years, smart environments are expected to originate billions of raw Internet of Things (IoT) data items that need to be stored and processed in order to implement a variety of control and monitoring services. While complex and long-term processing typically relies on remote cloud facilities, low-latency and interactive cognitive services may highly benefit from caching and computation resources, as well as artificial intelligence (AI) components, deployed at the network edge, close to where data are produced. Therefore, edge caching will play a pivotal role in the efficient and effective deployment of smart and cognitive environments, including houses and buildings. In this chapter, we scan the literature related to edge caching for IoT smart environments and identify the most promising decision policies together with the key benefits and open challenges. Conventional caching techniques are first scanned, before delving into more disruptive in-network caching solutions built upon the named data networking (NDN) paradigm. Focus will then be on the possible interplay of NDN-based edge caching policies with software-defined networking (SDN), as well as on the opportunities to leverage edge caching powered by AI techniques as a prominent sixth-generation (6G) enabler.
Keywords: Edge caching, Internet of things, Cognitive environments, Information-centric networking, Named data networking, Software-defined networking
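One of the conventional caching policies such surveys cover is least-recently-used eviction at an edge node. A minimal sketch (the content names are hypothetical; NDN-style in-network caching is considerably richer than this):

```python
# Sketch: an LRU cache for named IoT content objects at an edge node.
# Illustrative only; content names below are hypothetical.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, name):
        if name not in self.store:
            return None                      # miss -> fetch from upstream
        self.store.move_to_end(name)         # mark as most recently used
        return self.store[name]

    def put(self, name, data):
        if name in self.store:
            self.store.move_to_end(name)
        self.store[name] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used

cache = LRUCache(capacity=2)
cache.put("/sensor/temp", 21.5)
cache.put("/sensor/humid", 40.0)
cache.get("/sensor/temp")                    # touch -> now most recent
cache.put("/sensor/co2", 600)                # evicts /sensor/humid
print(cache.get("/sensor/humid"))            # None (evicted)
print(cache.get("/sensor/temp"))             # 21.5 (still cached)
```

Each cache hit at the edge saves a round trip to the cloud, which is the latency and bandwidth benefit the chapter quantifies.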
- Research Article
330
- 10.1109/access.2020.3047960
- Dec 30, 2020
- IEEE Access
Smart health care is an important aspect of connected living. Health care is one of the basic pillars of human need, and smart health care is projected to produce several billion dollars in revenue in the near future. There are several components of smart health care, including the Internet of Things (IoT), the Internet of Medical Things (IoMT), medical sensors, artificial intelligence (AI), edge computing, cloud computing, and next-generation wireless communication technology. Many papers in the literature deal with smart health care or health care in general. Here, we present a comprehensive survey of IoT- and IoMT-based edge-intelligent smart health care, mainly focusing on journal articles published between 2014 and 2020. We survey this literature by answering several research areas on IoT and IoMT, AI, edge and cloud computing, security, and medical signals fusion. We also address current research challenges and offer some future research directions.
- Research Article
477
- 10.1145/3555802
- Jan 16, 2023
- ACM Computing Surveys
Recent years have witnessed the widespread popularity of the Internet of Things (IoT). By providing sufficient data for model training and inference, IoT has greatly promoted the development of artificial intelligence (AI). Against this background, the traditional cloud computing model may nevertheless encounter many problems in independently tackling the massive data generated by IoT and meeting corresponding practical needs. In response, a new computing model called edge computing (EC) has drawn extensive attention from both industry and academia. As research on EC has deepened, however, scholars have found that traditional (non-AI) methods have limitations in enhancing the performance of EC. Seeing the successful application of AI in various fields, EC researchers have started to set their sights on AI, especially from the perspective of machine learning, a branch of AI that has gained increased popularity in the past decades. In this article, we first explain the formal definition of EC and the reasons why EC has become a favorable computing model. Then, we discuss the problems of interest in EC. We summarize the traditional solutions and highlight their limitations. By explaining the research results of using AI to optimize EC and applying AI to other fields under the EC architecture, this article can serve as a guide to exploring new research ideas in these two aspects while enjoying the mutually beneficial relationship between AI and EC.
- Research Article
- 10.65521/ijacte.v12i1.105
- Apr 15, 2025
- International Journal on Advanced Computer Theory and Engineering
Edge computing has emerged as a transformative paradigm, bringing data processing, storage, and analytics closer to end devices and users. This shift addresses the limitations of traditional cloud computing, such as high latency, bandwidth constraints, and data privacy concerns. This paper presents a comprehensive overview of recent advancements in edge computing, focusing on its architectural frameworks, technological enablers, and integration with other emerging technologies, such as 5G, artificial intelligence (AI), and the Internet of Things (IoT). Key challenges, including resource management, security, and scalability, are examined, along with innovative solutions proposed in contemporary research. Furthermore, the paper explores the growing opportunities in sectors such as healthcare, smart cities, autonomous vehicles, and industrial automation. By synthesizing current research trends, this study aims to provide valuable insights for researchers, practitioners, and policymakers in harnessing the full potential of edge computing for the next generation of intelligent applications.
- Research Article
5
- 10.55041/ijsrem37970
- Oct 16, 2024
- INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT
The Internet of Things (IoT) is a rapidly growing technological paradigm that enables the interconnection of physical devices, sensors, and systems through the internet. IoT allows devices to collect, exchange, and act on data autonomously, creating seamless communication between the digital and physical worlds. It has applications across diverse domains, including smart homes, healthcare, transportation, agriculture, and industrial automation. IoT ecosystems involve key components like sensors, actuators, cloud platforms, communication protocols, and analytics tools. The technology's benefits include improved efficiency, real-time monitoring, automation, and enhanced decision-making through data analytics. However, IoT also presents challenges related to data security, privacy, interoperability, and scalability. As the number of connected devices continues to grow, advancements in 5G, artificial intelligence, and edge computing are expected to drive the evolution of IoT, making it a cornerstone of future smart environments. This abstract provides a concise overview of the IoT landscape, highlighting both its transformative potential and associated challenges.
Keywords: IoT (Internet of Things), Smart Devices, Sensors and Actuators, Connectivity, Real-time Monitoring, Automation, Cloud Computing, Edge Computing, 5G Networks, Interoperability
- Research Article
1
- 10.38177/ajbsr.2025.7201
- Jan 1, 2025
- Asian Journal of Basic Science & Research
Edge computing and artificial intelligence (AI) are transforming real-time data analytics in smart devices by enabling low-latency, efficient, and decentralized processing. Traditional cloud-based approaches introduce high latency, bandwidth constraints, and security risks, making them less viable for real-time applications. This research explores Edge AI architectures, optimization techniques, and their integration with IoT to enhance real-time decision-making in smart devices. We analyze various AI models, including lightweight neural networks, federated learning, and quantization techniques, to optimize computational performance while maintaining energy efficiency and accuracy. Additionally, we evaluate security concerns and propose a blockchain-integrated trust management system for safeguarding edge-based AI deployments. Experimental results demonstrate that Edge AI significantly reduces latency by up to 45%, improves bandwidth utilization by 30%, and enhances real-time inference accuracy for applications such as healthcare monitoring, industrial automation, and smart city infrastructure. This study provides a comprehensive evaluation of Edge AI’s role in real-time analytics and offers future directions for scalable and secure edge intelligence.
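The blockchain-integrated trust idea in this abstract can be illustrated with a hash-chained log of model updates, so any later tampering is detectable. This is a sketch of the general technique only; the block fields and toy updates are assumptions, not the paper's system:

```python
# Sketch: a hash-chained (blockchain-style) log of model updates; altering
# any recorded entry breaks verification. Fields are illustrative.
import hashlib
import json

def add_block(chain, update):
    """Append an update, chaining it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"update": update, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every hash; any mismatch means the log was tampered with."""
    prev = "0" * 64
    for block in chain:
        body = {"update": block["update"], "prev": block["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"round": 1, "delta_norm": 0.12})
add_block(chain, {"round": 2, "delta_norm": 0.08})
print(verify(chain))                  # True: log is intact
chain[0]["update"]["delta_norm"] = 9  # tamper with a recorded update
print(verify(chain))                  # False: tampering detected
```

Chaining each block to its predecessor's hash is what lets distrusting edge nodes audit a shared update history without a central authority.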
- Book Chapter
1
- 10.5772/intechopen.113173
- Jan 31, 2024
The Internet of Things (IoT) has the potential to revolutionize energy management by enabling the collection and analysis of real-time data from various energy sources. This research paper investigates the impact of the Internet of Things (IoT) on energy management. The paper provides an overview of IoT and its potential applications in energy management, including improved efficiency, reduced costs, and better resource utilization. The benefits of using IoT for energy management and the major challenges that may arise in implementing IoT-enabled energy management are discussed. Potential solutions to these challenges, such as artificial intelligence and cloud computing, are presented, along with case studies of IoT-enabled energy management in different industries. The paper also analyzes the impact of IoT on energy efficiency in telecommunications and cloud infrastructure. Finally, the future outlook for IoT and energy management is discussed, including potential developments in edge computing, advanced analytics, and 5G networks. Overall, this paper highlights the potential of IoT to revolutionize energy management and provides insights into the challenges and opportunities of implementing IoT-enabled energy management solutions.
- Research Article
378
- 10.1016/j.iotcps.2023.02.004
- Jan 1, 2023
- Internet of Things and Cyber-Physical Systems
Artificial Intelligence (AI) at the edge is the utilization of AI in real-world devices. Edge AI refers to the practice of performing AI computations near users at the network's edge, instead of at a centralised location such as a cloud service provider's data centre. With the latest innovations in AI efficiency, the proliferation of Internet of Things (IoT) devices, and the rise of edge computing, the potential of edge AI has now been unlocked. This study provides a thorough analysis of AI approaches and capabilities as they pertain to edge computing, or Edge AI. Further, a detailed survey of edge computing and its paradigms, including the transition to Edge AI, is presented to explore the background of each variant proposed for implementing edge computing. Furthermore, we discuss the Edge AI approach of deploying AI algorithms and models on edge devices, which are typically resource-constrained devices located at the edge of the network. We also present the technology used in various modern IoT applications, including autonomous vehicles, smart homes, industrial automation, healthcare, and surveillance. Moreover, a discussion of leveraging machine learning algorithms optimized for resource-constrained environments is presented. Finally, important open challenges and potential research directions in the field of edge computing and Edge AI are identified and investigated. We hope that this article will serve as a blueprint that unites important stakeholders and accelerates development in the field of Edge AI.
- Research Article
- 10.71143/j7dtvn26
- Jan 17, 2026
- International Journal of Research and Review in Applied Science, Humanities, and Technology
The rapid proliferation of Internet of Things (IoT) devices has led to the generation of massive volumes of heterogeneous data across smart environments such as smart cities, healthcare, agriculture, transportation, and industrial automation. Traditional rule-based and static decision-making mechanisms are increasingly inadequate to handle the scale, complexity, and dynamic nature of IoT ecosystems. Artificial Intelligence (AI), particularly machine learning and deep learning techniques, has emerged as a transformative enabler for intelligent, autonomous, and adaptive decision-making in IoT-based smart systems. This paper proposes a comprehensive AI-driven framework for intelligent decision making in IoT-based smart systems, integrating data acquisition, preprocessing, intelligent analytics, and automated action layers. The framework leverages supervised, unsupervised, and reinforcement learning models to extract actionable insights from real-time and historical IoT data. A modular architecture is presented, supporting scalability, interoperability, and real-time responsiveness. Experimental evaluation across representative smart system use cases demonstrates improved decision accuracy, reduced latency, and enhanced system efficiency. The study further discusses challenges related to data privacy, model interpretability, and resource constraints, and outlines future research directions toward explainable AI and edge intelligence.
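The layered pipeline this abstract describes, acquisition, preprocessing, analytics, and automated action, can be sketched end to end. The layer functions, toy readings, and threshold rule are illustrative assumptions, not the paper's framework:

```python
# Sketch of the abstract's layered flow: acquisition -> preprocessing ->
# analytics -> action. All values and rules below are illustrative.

def acquire():
    """Acquisition layer: raw sensor readings (toy data)."""
    return [21.0, 21.5, 35.2, 22.0]

def preprocess(readings, lo=0.0, hi=30.0):
    """Preprocessing layer: drop out-of-range (faulty) values."""
    return [r for r in readings if lo <= r <= hi]

def analyze(readings):
    """Analytics layer: a simple aggregate insight (mean)."""
    return sum(readings) / len(readings)

def act(value, threshold=25.0):
    """Action layer: map the insight to an automated decision."""
    return "cooling_on" if value > threshold else "idle"

decision = act(analyze(preprocess(acquire())))
print(decision)  # -> idle (the 35.2 outlier is filtered before analytics)
```

Keeping each layer a separate function is a minimal version of the modularity the framework claims, since any stage (e.g. swapping the mean for a learned model) can be replaced independently.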
- Research Article
- 10.55041/ijsrem49345
- Jun 2, 2025
- INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT
Smart sensors are at the forefront of a major transformation in instrumentation engineering. These advanced devices go beyond simple data collection: they are capable of processing information, drawing insights, and communicating results without human intervention. When combined with the connectivity of the Internet of Things (IoT) and the analytical power of Artificial Intelligence (AI), smart sensors enable real-time monitoring, early fault detection, and adaptive control in a wide range of applications. From remote patient monitoring in healthcare to precision farming in agriculture and energy optimization in smart cities, the integration of these technologies is redefining the way systems are designed and operated. This paper explores the foundational principles, system architectures, and key use cases of smart sensor networks. It also addresses practical challenges such as energy efficiency, data security, and system interoperability. By examining both current capabilities and future possibilities, this study provides a clear understanding of how smart sensors, empowered by IoT and AI, are reshaping the future of intelligent instrumentation.
Keywords: Smart Sensors, Instrumentation, Internet of Things (IoT), Artificial Intelligence (AI), Edge Computing, Wireless Sensor Networks, Real-time Monitoring