People Counting Application with Crowded Scenarios: A Case Study with TV Boxes as Edge Devices
Counting people in various urban spaces using artificial intelligence enables a wide range of smart city applications, enhancing governance and improving citizens' quality of life. However, the rapid expansion of edge computing for these applications raises concerns about the growing volume of electronic waste. To address this challenge, our previous work demonstrated the feasibility of repurposing confiscated illegal TV boxes as Internet of Things (IoT) edge devices for machine learning applications, specifically for people counting using images captured by cameras. Despite promising results, experiments in crowded scenarios revealed a high Mean Absolute Error (MAE). In this work, we propose a patching technique applied to YOLOv8 models to mitigate this limitation. By employing this technique, we successfully reduced the MAE from 8.77 to 3.77 using the nano version of YOLOv8, converted to TensorFlow Lite, on a custom dataset collected at the entrance of a university restaurant. This work presents an effective solution for resource-constrained devices and promotes a sustainable approach to repurposing hardware that would otherwise contribute to electronic waste.
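The patching idea can be illustrated with a small sketch: the frame is tiled into overlapping patches so that small, distant people occupy more pixels relative to each crop fed to the detector. The exact patch size, overlap, and merging rule used in the paper are not stated here, so the values below and the `detect` callback (standing in for a YOLOv8 TensorFlow Lite inference call on a cropped patch) are illustrative assumptions.

```python
from typing import Callable, List, Tuple

Box = Tuple[int, int, int, int]  # (x0, y0, x1, y1) in frame coordinates


def _starts(size: int, patch: int, stride: int) -> List[int]:
    # Start offsets that tile one axis; the last patch is clamped to the edge.
    if size <= patch:
        return [0]
    starts = list(range(0, size - patch, stride))
    starts.append(size - patch)
    return sorted(set(starts))


def make_patches(width: int, height: int, patch: int = 640, overlap: int = 64) -> List[Box]:
    """Cover a width x height frame with overlapping patch x patch crops."""
    stride = patch - overlap
    return [
        (x, y, min(x + patch, width), min(y + patch, height))
        for y in _starts(height, patch, stride)
        for x in _starts(width, patch, stride)
    ]


def count_people(frame_size: Tuple[int, int],
                 detect: Callable[[Box], List[Tuple[float, float]]],
                 patch: int = 640, overlap: int = 64) -> int:
    """Run the detector on every patch and merge the detections.

    `detect` returns person centres in frame coordinates; a centre that falls
    in an overlap region and is reported by two patches is deduplicated by
    rounding to the nearest pixel (a simple stand-in for NMS-based merging).
    """
    seen = set()
    for box in make_patches(*frame_size, patch, overlap):
        for cx, cy in detect(box):
            seen.add((round(cx), round(cy)))
    return len(seen)
```

With a 1920x1080 frame and the defaults above this yields eight patches; a real pipeline would crop each box, run YOLOv8 on the crop, and map the resulting detections back to frame coordinates before merging.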
- Conference Article
- 10.1145/3479243.3494705
- Nov 22, 2021
Internet of Things (IoT) has emerged as the enabling technology for smart applications in different domains, such as transportation, healthcare, industry, smart homes and buildings, and education (e.g., [1-5]). IoT applications rely on the deployment of resource-constrained devices that collect data from the environment in which they are immersed and control events of interest through actuators. One of the daunting challenges in many IoT applications is the need for real-time processing of the large amount of data produced. Such processing is often impractical to perform on the IoT devices themselves, due to their resource-constrained nature and the incurred energy cost. In this regard, IoT data is often offloaded for processing on distant, powerful cloud servers, which return the results of the heavy computations to the IoT devices. This approach is well suited for computation-intensive tasks in IoT applications. However, offloading tasks to cloud servers incurs additional delays for the IoT application, in addition to the network overhead. Therefore, edge computing has been proposed to provide computation, communication, and storage resources closer to IoT devices. The general idea is to place resources in the proximity of the IoT devices that will demand them. Thus, the latency of the IoT application is reduced, since computation-intensive tasks are processed on edge devices rather than on distant cloud servers. One of the critical challenges in edge-aided IoT applications is that edge devices have limited resource capabilities compared to cloud servers. In this regard, edge devices' resources must be managed and allocated efficiently, with the aim of providing resources to IoT applications with guaranteed quality of service (QoS). This tutorial will motivate and explore the challenges, design principles, and goals of edge computing for IoT applications.
It presents the building blocks for the design of optimization models for IoT task offloading to edge nodes. In doing so, it discusses the communication challenges between IoT and edge devices and highlights the different mathematical formulations commonly used in the literature to model IoT-to-edge communication. Furthermore, this tutorial discusses optimization-based and machine learning (ML)-based solutions for tackling the task offloading decision problem. In addition, it presents recent advancements in resource management solutions aimed at efficient resource allocation at edge devices. Finally, the tutorial concludes with a discussion of research opportunities and challenges in the edge-assisted Internet of Things.
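A common building block in such offloading formulations is a latency model that compares local execution against transmit-plus-edge execution. As a minimal sketch (the symbols and example values below are generic assumptions, not taken from the tutorial): a task needing C CPU cycles with D bits of input runs locally in C/f_local seconds, or is offloaded in D/B + C/f_edge seconds over a link of bandwidth B.

```python
from dataclasses import dataclass


@dataclass
class Task:
    cycles: float     # CPU cycles the task requires
    data_bits: float  # input data to upload if the task is offloaded


def local_latency(task: Task, f_local_hz: float) -> float:
    # Execute on the IoT device itself.
    return task.cycles / f_local_hz


def offload_latency(task: Task, f_edge_hz: float, bw_bps: float) -> float:
    # Upload the input over the link, then execute on the edge node.
    return task.data_bits / bw_bps + task.cycles / f_edge_hz


def decide(task: Task, f_local_hz: float, f_edge_hz: float, bw_bps: float) -> str:
    """Offload iff transmit + edge compute beats local compute."""
    return ("edge" if offload_latency(task, f_edge_hz, bw_bps)
            < local_latency(task, f_local_hz) else "local")
```

The model captures the trade-off discussed in the abstract: a fast link favors offloading, while a slow link can make the upload delay dominate and keep the task local.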
- Research Article
- 10.1007/s40012-019-00256-5
- Jul 3, 2019
- CSI Transactions on ICT
With growing interest in the Internet of Things (IoT), businesses are undergoing a revolution in the way they monitor and control their operations. In the recent past, many applications were developed using IoT system architectures in various business verticals such as industry, healthcare, farming, and transportation. However, with the development of low-complexity artificial intelligence frameworks capable of operating on edge devices, IoT architectures have taken a major leap, leading to the Internet of Intelligent Things (IoIT). In this paper, we discuss our focused research and applications of IoIT across the following verticals: (1) healthcare, (2) smart buildings, and (3) farming, along with recent state-of-the-art methodologies and future challenges. In healthcare, our main focus is the development of an AI-enabled computer-aided diagnosis framework, which acquires scanning information from a wireless ultrasound transducer and automatically identifies any abnormalities present. We also developed a low-complexity brain-controlled IoT environment framework that automatically classifies the motor imagery task performed by the user using 22-channel electroencephalography. Using IoIT, we developed a novel non-invasive technology capable of monitoring various electrical parameters without any need to cut wires. This non-invasive power monitor can generate real-time alerts in the case of system malfunctions and will be a key enabler for smart buildings. We also developed mathematical, simulation, and experimental models for analyzing the performance of channel access mechanisms in dense-traffic IoT networks.
- Research Article
- 10.1177/14727978251346026
- Jun 10, 2025
- Journal of Computational Methods in Sciences and Engineering
The Internet of Things (IoT) is an essential component of the digital age, particularly in ensuring reliable and efficient operations as well as the timely and precise recognition of anomalies within IoT systems. However, anomaly detection in time-series data, especially data collected by edge devices, poses several challenges, including concerns over data privacy and communication overhead. To address these issues, this research proposes a novel deep learning (DL)-based anomaly detection model capable of being trained in real time on edge devices within an IoT environment. The framework ensures user privacy by enabling distributed edge devices to cooperatively train an anomaly recognition system. This study introduces a new model, the Intelligent Shark Smell Tuned Deep Isolation Forest (ISS-DIF), which effectively detects anomalies and identifies outliers in industrial IoT sensor data with high accuracy. The hybrid model isolates anomalies rather than profiling normal data, making it particularly suitable for identifying rare anomalies within large datasets. Industrial data are collected from real-world manufacturing environments using IoT edge devices. Following data collection, median filtering is applied to reduce noise, and min-max scaling is employed for data normalization. The ISS component allows for fine-tuning the hyperparameters of the Deep Isolation Forest (DIF) to optimize detection performance. The DIF model is tailored to enhance anomaly detection capabilities in sensor data from industrial IoT applications. For validation, 80% of the dataset was used for training, while the remaining 20% served as the test set. The results demonstrate that the proposed ISS-DIF framework achieved a training accuracy of 99.30%, with precision of 97.89%, recall of 98.76%, and an F1-score of 97.21%.
This approach integrates real-time anomaly detection with the processing capabilities of edge devices, thereby improving IoT data analytics and providing a scalable, efficient solution that preserves privacy in irregularity recognition within IoT environments.
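The two preprocessing steps named in the abstract, median filtering for noise reduction and min-max scaling for normalization, are standard and easy to sketch. The window size below is an illustrative choice, not a value from the paper.

```python
import statistics
from typing import List


def median_filter(xs: List[float], k: int = 3) -> List[float]:
    """Replace each sample with the median of a k-wide window (k odd).

    Windows are shrunk at the series edges rather than padded, so isolated
    spikes are suppressed while step changes are preserved.
    """
    r = k // 2
    return [statistics.median(xs[max(0, i - r): i + r + 1]) for i in range(len(xs))]


def min_max_scale(xs: List[float]) -> List[float]:
    """Linearly map the series into [0, 1]; a constant series maps to 0."""
    lo, hi = min(xs), max(xs)
    if hi == lo:
        return [0.0] * len(xs)
    return [(x - lo) / (hi - lo) for x in xs]
```

Applied in that order, a sensor trace has transient spikes removed first and is then brought to a common scale before being fed to the detection model.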
- Research Article
- 10.1038/s41598-025-97379-4
- Jul 2, 2025
- Scientific Reports
In recent years, a large number of illegal TV box devices have been confiscated in Brazil. According to a news report released in March 2024, an estimated 2.5 million TV boxes were stored in the warehouses of the Federal Revenue Service. Typically, these devices are destroyed, which not only incurs significant costs for the government but also generates substantial e-waste. Meanwhile, the advancement of smart city applications based on the Internet of Things (IoT) and machine learning has driven research in edge computing using hardware-constrained devices. This paper explores the feasibility of repurposing TV boxes for edge computing in applications involving people counting in images collected by cameras. We developed a testbed consisting of 20 TV boxes to conduct a thorough evaluation of their resilience and carbon footprint compared to commonly used edge computing equipment. Our findings demonstrate that these repurposed devices can outperform commercially available devices in terms of carbon footprint when using the Brazilian energy matrix, a conclusion drawn after performing over 16 million inferences during a stress test. Notably, the most modern TV box with the lightest model version was the best option in terms of average inferences per day, reliability, and carbon footprint. This study underscores the innovative potential and environmental benefits of repurposing TV boxes for smart city applications, especially when utilizing lightweight machine learning models.
- Research Article
- 10.1109/jiot.2018.2888636
- Jun 1, 2019
- IEEE Internet of Things Journal
Nowadays, in many Internet of Things (IoT) application scenarios, low latency is achieved at the cost of computational complexity that is beyond the capabilities of IoT devices. Offloading computation-intensive tasks to more powerful edge devices is expected to enable a new generation of computation-intensive and delay-sensitive services. In the three-tier user/IoT-edge-cloud architecture, private and secure mutual authentication is necessary between the user, the IoT device, and the edge device. However, in emerging computing paradigms such as mobile transparent computing, edge computing, and fog computing, several threats, such as edge device compromise, privacy leaking, and denial of service (DoS), might break the security of the system. Here, we propose a lightweight anonymous mutual authentication scheme for n-times computing offloading (CO) in IoT. In our scheme, using a smartcard as a token and an edge device as a security proxy, a user is able to subscribe to or renew an n-times CO service and consume it securely in daily use. Moreover, IoT and edge devices authenticate each other anonymously without leaking the user's sensitive information, which preserves privacy even when an edge device is compromised. Finally, our scheme is based on a lightweight one-way hash function and a MAC function, so an adversary is not able to mount a DoS attack. To evaluate the solution, a security analysis and a performance analysis are presented. Compared with similar schemes, our approach achieves all the designed security features and a 1.66× and 2.87× computing speedup on IoT and edge devices, respectively.
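The paper's n-times offloading scheme itself is not reproduced here; as a generic illustration of the hash/MAC primitives it builds on, a single challenge-response round with a pre-shared key might look like the following (the function names and message layout are assumptions for the sketch):

```python
import hashlib
import hmac
import os


def make_challenge() -> bytes:
    # Fresh random nonce per authentication round (prevents replay).
    return os.urandom(16)


def respond(shared_key: bytes, device_id: bytes, challenge: bytes) -> bytes:
    # The device proves knowledge of the key: only the keyed MAC over
    # (id, nonce) crosses the network, never the key itself.
    return hmac.new(shared_key, device_id + challenge, hashlib.sha256).digest()


def verify(shared_key: bytes, device_id: bytes, challenge: bytes, response: bytes) -> bool:
    # Constant-time comparison avoids timing side channels.
    expected = hmac.new(shared_key, device_id + challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because verification costs only one hash computation, a flood of bogus responses cannot force expensive work on the verifier, which is the property the abstract invokes against DoS attacks.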
- Research Article
- 10.62019/abbdm.v4i4.279
- Dec 31, 2024
- The Asian Bulletin of Big Data Management
The Internet of Medical Things (IoMT), an application of the Internet of Things (IoT) in the medical domain, allows data to be transmitted across communication networks. In particular, IoMT can help improve quality of life: the health, safety, and care of billions of people are being improved. Rather than requiring patients to visit the hospital for assistance, their health-related parameters can be tracked remotely, continuously, and in real time, which significantly improves the effectiveness, convenience, and cost performance of healthcare. Specifically, IoMT can enhance the quality of life of citizens and senior citizens by tracking and controlling vital signals of the body, such as heart rate, temperature, and blood pressure. IoMT has emerged as a primary forum for such exchange. A total of 187 articles, published between 2010 and 2022, are gathered and arranged based on the variety of applications, year of publication, type of application, and other unique viewpoints. This study provides a broad overview of state-of-the-art methods by reviewing the security and privacy issues, requirements, risks, and future research objectives in the field of IoMT.
- Book Chapter
- 10.1007/978-3-030-13929-2_11
- Jan 1, 2019
The active infiltration of information technology into the healthcare sector has led to a fundamental change in people's quality of life. Networked medical and healthcare devices and their applications are already creating an Internet of Medical Things aimed at better health monitoring and preventive care. But new concepts and the application of new technologies bring certain risks, including failures of devices and infrastructure, which may lead to the worst outcomes. In this regard, the security and safety problems of using this technology are increasing rapidly. This paper touches upon the issue of healthcare Internet of Things (IoT) infrastructure failures and attacks on components and the complete system. The purpose of the paper is to develop and research availability models of a healthcare IoT system with regard to failures of, and attacks on, components. A detailed analysis of the architecture of healthcare IoT infrastructure is given. The main causes of healthcare IoT system failures are considered. This paper presents an approach to developing a set of Markov models for a healthcare IoT infrastructure that allows safety and security issues to be considered. Much attention is given to developing and researching the Markov model of a healthcare IoT system considering failures of components. The analysis of the obtained simulation results shows which rates have the greatest influence on the availability function of the healthcare IoT system. In addition, a case study with a game-theoretical approach to selecting countermeasure tools is presented.
- Book Chapter
- 10.1007/978-981-16-3728-5_25
- Sep 14, 2021
In recent years, abundant amounts of data have been accumulated from a huge network of Internet of Things (IoT) devices spread around the globe. The collected data is only useful if it leads to an action. To make data actionable, it needs to be enriched with context and creativity. Traditional methods of evaluating structured data and creating action cannot efficiently process the massive amounts of real-time data that stream from IoT devices. Studies have shown that with most IoT gadgets offering cloud storage along with analytics, the data is either traded or left dumped with no use. For instance, consider the trillions of log files containing metadata and timestamps of a smart bulb, which seem useless if used by nobody. But it is always important to correlate the data with similar data patterns in a different application, which helps in forecasting insight into possible outcomes. Hence, there is huge scope for improvement in this realm, which motivated us to perform experiments and prove the concept with rigid conclusions. This is where AI-based analysis and response become crucial for extracting optimal value from that data. Existing research offers sensible prescriptive analysis with hindsight when one talks about edge or node devices in the IoT scenario, but it certainly lacks a rigid structure for offering insight and foresight. In-depth insight at the edge level can be achieved with the existing artificial intelligence model-building services offered by many IT giants, such as AWS Greengrass. Thus, there is an immense need to process edge device data with enough intelligence and use existing analytics tools to greatly enhance the performance of the cloud and improve the overall IoT application at hand by making the cloud requirements less CPU-intensive and more economical.
In this paper, a model for predictive and prescriptive analysis is demonstrated with the help of a practical experiment, delving into edge computing to produce actionable insight and foresight and thereby improve production capabilities, gain efficiencies, and reduce operating costs.
Keywords: Edge computing, Internet of things (IoT), AWS Greengrass, Cloud computing, Prescriptive analysis, Artificial intelligence, Raspberry Pi, MPU6050
- Research Article
- 10.1088/1755-1315/1396/1/012027
- Sep 1, 2024
- IOP Conference Series: Earth and Environmental Science
In modern times, the concept of quality of life (QoL) has been a focal point in numerous studies, offering solutions to challenges faced by residents in new cities worldwide, including Egypt. To ensure citizens enjoy a high quality of life, cities are increasingly leveraging innovative technologies to address various aspects such as the environment, physical health, mobility, social interaction, psychological well-being, and the economy. Among these technologies, the Internet of Things (IoT) plays a significant role in enhancing people's lives. By utilizing IoT as an information technology tool, cities can tackle their unique challenges. Consequently, the IoT method and the information obtained through Big Data (BD) analysis will enhance cities' sustainability, safety, and liveability for residents. This research aims to explore the integration of IoT and BD for enhancing the quality of life in Egyptian cities. A qualitative methodology is employed to achieve this goal. Initially, a comprehensive literature review is conducted to uncover the relationship between improving quality of life in Egyptian cities and the use of IoT and BD methods. Additionally, a case study of Busan, a city that has successfully implemented several IoT technologies to enhance the well-being of its inhabitants, is presented and analyzed. The findings from both the literature review and the case study highlight the positive correlation between the adoption of IoT and BD technologies and the overall quality of life in cities, spanning dimensions such as transportation, economy, social aspects, and the environment.
- Research Article
- 10.3390/en13184813
- Sep 15, 2020
- Energies
In recent years, people have witnessed numerous Internet of Things (IoT)-based attacks with the exponential increase in the number of IoT devices. Alongside this, the means to secure IoT-based applications are maturing more slowly than our budding dependence on them. Moreover, the vulnerabilities in an IoT system are exploited in chains to penetrate deep into the network and yield more adverse aftereffects. To mitigate these issues, this paper gives unique insights for handling the growing vulnerabilities in common IoT devices and proposes a threat architecture for IoT, addressing threats in the context of a three-layer IoT reference architecture. Furthermore, the vulnerabilities exploited at the several IoT attack surfaces and the challenges they exert are explored. Thereafter, the challenges in quantifying IoT vulnerabilities with the existing framework are also analyzed. The study also covers a case study on the Intelligent Transportation System, covering road transport and traffic control specifically in terms of threats and vulnerabilities. Another case study on secure energy management in the Smart Grid is also presented. This case study covers the applications of the Internet of Vulnerable Things (IoVT) in smart energy grid solutions, as there will be tremendous use of IoT in future Smart Grids to save energy and improve overall distribution. The analysis shows that integrating the proposed architecture into existing applications alerts developers to the threats embedded in the system.
- Research Article
- 10.1016/j.future.2021.07.010
- Jul 14, 2021
- Future Generation Computer Systems
Self-aware distributed deep learning framework for heterogeneous IoT edge devices
- Research Article
- 10.3390/app15062968
- Mar 10, 2025
- Applied Sciences
Primary batteries are extensively employed as power sources in Internet of Things (IoT) devices for remote metering. However, primary batteries maintain a relatively consistent discharge voltage curve over a long period before experiencing a full discharge, making it challenging to predict the battery’s life. In this study, we introduce a battery life prediction method to ensure the robust operation of IoT devices in remote metering applications. The robust battery life prediction process is divided into two stages. The first stage involves predicting the state of charge (SOC) to enable real-time remote monitoring of the battery status of metering devices. In the second stage, IoT devices implement a hardware-based alerting mechanism to provide warnings prior to complete discharge, leveraging a custom-designed Multi-Stage Discharge battery architecture. In the first stage, we developed the CNN-Series Decomposition Transformer (C-SDFormer) model, which is capable of accurately predicting the SOC of primary batteries. This model was specifically designed to support the real-time monitoring of battery status in large-scale IoT deployments, enabling proactive maintenance and enhancing system reliability. To validate the performance of the C-SDFormer model, data were collected from smart remote meters installed in households. The model was trained using the collected data and evaluated through a series of experiments. The performance of the C-SDFormer model was compared with existing methods for SOC prediction. The results indicate that the C-SDFormer model outperformed the traditional methods. Specifically, the SOC prediction achieved a mean absolute error (MAE) of less than 4.1%, a root mean square error (RMSE) of less than 5.2%, a symmetric mean absolute percentage error (SMAPE) of less than 7.0%, and a coefficient of determination (R2) exceeding 0.96. 
These results demonstrate the effectiveness of the C-SDFormer model in accurately predicting the SOC of primary batteries. For the second stage, a Multi-Stage Discharge (MSD) primary battery was developed to ensure a hardware-based low battery alert before the battery is fully discharged. This battery was designed to ensure the reliable operation of IoT devices, especially those whose batteries are not proactively managed through real-time monitoring in the first stage. By providing a low battery alert, the MSD battery reduces the risk of unexpected device shutdowns. This feature enhances the overall reliability of IoT devices, ensuring their continuous operation in remote metering applications.
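The error metrics quoted for the C-SDFormer model have standard definitions; a small reference implementation in plain Python follows (with SMAPE defined as 0 when both values are 0, which is one of several conventions).

```python
import math
from typing import Sequence


def mae(y_true: Sequence[float], y_pred: Sequence[float]) -> float:
    """Mean absolute error: average magnitude of the prediction errors."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)


def rmse(y_true: Sequence[float], y_pred: Sequence[float]) -> float:
    """Root mean square error: penalizes large errors more than MAE."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))


def smape(y_true: Sequence[float], y_pred: Sequence[float]) -> float:
    """Symmetric mean absolute percentage error, in percent (0 to 200)."""
    terms = []
    for t, p in zip(y_true, y_pred):
        denom = (abs(t) + abs(p)) / 2
        terms.append(0.0 if denom == 0 else abs(t - p) / denom)
    return 100 * sum(terms) / len(terms)
```

For SOC series expressed in percent, an MAE below 4.1 means the predicted charge level is, on average, within about four percentage points of the true value.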
- Research Article
- 10.1016/j.resconrec.2012.08.003
- Sep 17, 2012
- Resources, Conservation and Recycling
Electronic and electrical waste management in Sri Lanka: Suggestions for national policy enhancements
- Research Article
- 10.18535/raj.v4i04.227
- Apr 28, 2021
- Research and Analysis Journal
The Internet of Things (IoT) has emerged as one of the most transformative technologies of the 21st century, revolutionizing how industries operate and how devices interact within interconnected ecosystems. IoT enables billions of smart devices to collect, process, and share data, fostering unprecedented innovation across sectors like healthcare, manufacturing, smart cities, and transportation. However, the rapid expansion of IoT ecosystems has given rise to significant challenges in managing the vast volume, velocity, and variety of data generated by these devices. Traditional approaches to data management and processing often fall short, particularly in environments requiring real-time responsiveness, seamless scalability, and reliable decision-making. Integrating Artificial Intelligence (AI) with advanced data engineering techniques offers a powerful solution to these challenges. AI brings capabilities such as machine learning, predictive analytics, and intelligent decision-making, which, when combined with robust data engineering practices, enable efficient streaming data management. This integration supports real-time data processing, anomaly detection, predictive maintenance, and dynamic resource optimization, which are essential for creating intelligent IoT systems. By leveraging tools like real-time data pipelines, edge computing, and distributed architectures, AI-driven data engineering frameworks address critical issues, including data latency, resource constraints, and system scalability. This article delves into the intricate relationship between AI and data engineering within IoT ecosystems, focusing on streaming data management for smart devices. It explores the technical and theoretical underpinnings of integrating these fields, providing a comprehensive framework for optimizing IoT data streams.
Key methodologies include employing machine learning algorithms to analyze real-time data, using edge computing to preprocess data closer to its source, and implementing scalable data pipelines for continuous processing. The findings of this study underscore the transformative potential of combining AI and data engineering in IoT ecosystems. Through experimental simulations and case studies, the research demonstrates how this integration enhances data flow efficiency, reduces latency, and improves the overall performance of IoT systems. For instance, in healthcare, AI-powered IoT devices enable real-time patient monitoring and predictive analytics, leading to improved medical outcomes. Similarly, in smart cities, integrated systems streamline traffic management, reduce energy consumption, and enhance public safety. This integration represents a paradigm shift in IoT ecosystems, laying the groundwork for intelligent, adaptive systems capable of meeting the demands of rapidly evolving industries. The study not only highlights the technological advancements enabled by this synergy but also identifies challenges such as integration complexity, resource limitations on edge devices, and the need for enhanced data privacy measures. Ultimately, this article serves as a blueprint for researchers, practitioners, and industry stakeholders aiming to unlock the full potential of IoT by bridging the gap between AI and data engineering.
- Conference Article
- 10.1145/3041021.3054770
- Jan 1, 2017
Internet of Everything (IoE) devices have different operating principles, which weakens network scalability and data interoperability. Virtualization is an economical way of solving this problem: the data collected by different vendors' sensors can share the same computing program encapsulated by a Virtual Machine (VM), thus masking physical-layer differences. To eliminate the extreme cost and long delay of transferring VMs to the remote cloud, the Edge Device (ED) preliminarily processes its running VMs. Currently, the recycling of IoE devices has become a major dilemma for individuals, since it is not simply a matter of concern about environmental damage or a solution to an environmental problem. Therefore, a sustainable strategy for recycling EDs is an important way to safeguard network sustainability. To improve recycling efficiency, most of the EDs should be upgraded simultaneously in one batch by migrating their running VMs to others to preserve service continuity. We investigate the least upgrade batch for recycling EDs in IoE networks. A two-step algorithm called MSBP (Minimized upgrade batch VM Scheduling and Bandwidth Planning) is designed to minimize the number of upgrade batches. Because migrating VMs consumes bandwidth along their trajectories, MSBP has two strategies for allocating bandwidth and trajectories: Shortest Trajectory First (STF) and Least Bandwidth Utilization First (LBUF). The simulation results show that: 1) MSBP achieves the optimal recycling efficiency (the least number of upgrade batches) for EDs; 2) LBUF more effectively mitigates the phenomenon where VM migration trajectories compete for common link bandwidth, thus lowering the negative impact of path contention on recycling efficiency; and 3) the battery power of an ED functioning as the sensor head for data transfer is not exhausted, thus prolonging the network lifetime.
In summary, our solution improves network, social, economic, and ecological sustainability.