- Research Article
- 10.1186/s43067-025-00245-6
- Aug 4, 2025
- Journal of Electrical Systems and Information Technology
- Avijit Chowdhury
Abstract The advent of blockchain technology has brought notable progress in security, particularly within e-commerce. The existing Web 2.0 framework, which employs inadequate security measures, exhibits vulnerabilities when compared to the robust security features of blockchain technology. Blockchain maintains encrypted, distributed transaction records across multiple computers, thereby improving the reliability of the digital ledger. In a nation such as Bangladesh, where transaction data is susceptible to cyber threats and online fraud is prevalent within the e-commerce sector, such a decentralized system has the potential to alter the landscape significantly, but realizing that potential requires a more comprehensive security protocol. This research advocates the adoption of smart contracts to enhance supply chain transparency and offers digital identification solutions aimed at preventing fraud, including non-delivery and counterfeit goods. The implementation uses Next.js for front-end development and Solidity with Hardhat.js for backend integration, targeting the Solana Blockchain and deployed on an Amazon EC2 instance. The research commenced with an examination of the current e-commerce ecosystem, physical identification infrastructure, and consumer attitudes, ultimately presenting a strategic implementation plan for adopting blockchain technology to enhance trust and assurance within the e-commerce landscape of Bangladesh. It further delineates particular obstacles to adoption: technological limitations, regulatory challenges, socio-economic factors, and the expanding digital payments landscape, particularly mobile financial services.
This research enhances the current understanding of blockchain as a transformative force in emerging e-commerce markets and provides valuable insights into technology policies relevant to the developing economy of Bangladesh for policymakers, businesses, and technologists.
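As a concrete aside, the tamper-evidence property such systems rely on can be sketched as a minimal hash-chained ledger. This is illustrative only: the transactions, field names, and two-block chain are invented, and a real blockchain adds consensus, signatures, and distribution across nodes.

```python
# Minimal hash-chained ledger: each block stores the hash of its predecessor,
# so changing any earlier record invalidates every later link.
import hashlib
import json

def block_hash(block):
    # Canonical JSON encoding so the hash is deterministic.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transaction):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transaction": transaction})

def verify(chain):
    # Recompute each predecessor hash and compare with the stored link.
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

ledger = []
append_block(ledger, {"from": "buyer", "to": "seller", "amount": 100})
append_block(ledger, {"from": "seller", "to": "courier", "amount": 10})
print(verify(ledger))                       # True
ledger[0]["transaction"]["amount"] = 999    # tamper with an early record
print(verify(ledger))                       # False
```

The design choice here is the essence of a ledger's integrity guarantee: verification is cheap (one hash per block), while undetectable tampering would require rewriting every subsequent block.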
- Research Article
- 10.1186/s43067-025-00250-9
- Aug 4, 2025
- Journal of Electrical Systems and Information Technology
- Daniel Voskergian + 1 more
Abstract This paper presents a model-centric prototyping framework for developing cloud-based mobile applications to optimize energy consumption and promote sustainable behavior. Integrating behavioral models such as the Heuristic Model, Geller’s Model and Fogg’s Behavior Model, the framework transitions users from passive consumers to active participants in energy conservation. Leveraging Mobile Cloud Computing (MCC), the system overcomes mobile device limitations by offloading resource-intensive tasks to the cloud, enhancing energy efficiency and user experience. The proposed system integrates smart devices, cloud-based services, and a user-centered application. Functional and non-functional requirements were derived from behavioral models and stakeholder perspectives. Usability testing validated the system’s design, achieving a System Usability Scale score of 84.6%, which corresponds to an “A” rating—indicating excellent usability and strong potential for long-term adoption. The system demonstrates its potential to align consumer energy savings with grid stability and efficiency. This research showcases the impact of combining behavioral science with advanced mobile cloud technologies to foster sustainability.
- Research Article
- 10.1186/s43067-025-00246-5
- Aug 1, 2025
- Journal of Electrical Systems and Information Technology
- Saroj Kumar Panda + 1 more
Abstract In the discipline of power system engineering, predicting power system demand is essential, because accurate forecasting models provide the foundation for the majority of system planning and operation tasks. The primary purpose of entire power infrastructures is to supply and support energy consumption; as a result, building reliable and effective predictive models is essential to delivering precise load predictions. This research uses short-term load forecasting (STLF), with a deep neural network (DNN), a machine learning technique, as the method of analysis. To improve overall forecasting and address the challenges posed by some categorical predictors, new predictive variables are added. The DNN comparison is carried out based on the choice of input sample and root mean square error (RMSE). Statistical tests are run to confirm the findings and determine whether these models are statistically equivalent. The findings show that the DNN models are statistically equivalent and appropriate for STLF. Further, to reduce the RMSE value, this study used the gradient descent method as an optimization technique with the DNN; the best RMSE values for STLF are 0.0322, 0.0970, 0, 0.0087, 0.0141, and 0.0204, respectively, compared to without the use of an optimization technique.
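A minimal, illustrative sketch of the technique named here — a small neural network trained by plain gradient descent, with RMSE as the error measure — might look as follows. The network size, learning rate, and synthetic daily-load curve are all assumptions, not the paper's model or data.

```python
# One-hidden-layer network fit to a synthetic normalized daily load curve
# via per-sample gradient descent; RMSE is tracked before and after training.
import math
import random

random.seed(0)

# Synthetic "hourly load" target: a smooth daily cycle, scaled to (0, 1).
xs = [h / 23.0 for h in range(24)]
ys = [0.5 + 0.4 * math.sin(2 * math.pi * x) for x in xs]

H = 8      # hidden units (assumed size)
lr = 0.1   # gradient-descent learning rate (assumed)
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    out = sum(w2[j] * hidden[j] for j in range(H)) + b2
    return hidden, out

def rmse():
    return math.sqrt(sum((forward(x)[1] - y) ** 2 for x, y in zip(xs, ys)) / len(xs))

initial = rmse()
for epoch in range(2000):
    for x, y in zip(xs, ys):
        hidden, out = forward(x)
        err = out - y                                      # d(0.5*err^2)/d(out)
        for j in range(H):
            grad_h = err * w2[j] * (1 - hidden[j] ** 2)    # backprop through tanh
            w2[j] -= lr * err * hidden[j]
            w1[j] -= lr * grad_h * x
            b1[j] -= lr * grad_h
        b2 -= lr * err
final = rmse()
print(f"RMSE before: {initial:.4f}, after: {final:.4f}")
```

The point of the sketch is only the mechanics the abstract describes: gradient descent drives the weight updates, and RMSE quantifies the improvement.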
- Research Article
- 10.1186/s43067-025-00248-3
- Jul 31, 2025
- Journal of Electrical Systems and Information Technology
- Abbas Ghori
Abstract Loss functions are among the key components of machine learning, steering a model toward optimal performance by quantifying the discrepancy between predicted and actual outcomes. They act as guiding principles for any optimization algorithm, thereby influencing both the convergence characteristics and the generalization of the model. This paper discusses classical loss functions, including mean squared error (MSE), cross-entropy, and Huber loss, as well as a range of problem-specific variants that appear in scenarios involving imbalanced data, adversarial learning, and reinforcement learning. The advantages and limitations of these functions with respect to robustness, convergence speed, and computational efficiency are discussed. Applications are illustrated in crucial areas such as vision, NLP, and anomaly detection to reflect real-world relevance. The paper also covers upcoming trends in adaptive and meta-learned loss functions, emphasizing their prospects for improving the learning efficacy and interpretability of trained models. By integrating theoretical insight with practical implications, this review will help researchers and practitioners choose the appropriate loss function for their work, thereby serving the goal of developing more autonomous and efficient applications.
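The three classical losses named above can be written down directly from their standard definitions. The following minimal reference implementations, with invented example values, are a sketch and are not taken from the paper.

```python
# Standard definitions of MSE, binary cross-entropy, and Huber loss.
import math

def mse(y_true, y_pred):
    """Mean squared error: average of squared residuals."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy; eps guards against log(0)."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)

def huber(y_true, y_pred, delta=1.0):
    """Huber loss: quadratic near zero, linear in the tails (robust to outliers)."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        r = abs(t - p)
        total += 0.5 * r ** 2 if r <= delta else delta * (r - 0.5 * delta)
    return total / len(y_true)

print(mse([1.0, 2.0], [1.0, 4.0]))      # (0 + 4) / 2 = 2.0
print(huber([0.0], [3.0], delta=1.0))   # linear region: 1 * (3 - 0.5) = 2.5
```

Comparing `mse` and `huber` on the same large residual (4.0 squared vs. a linear penalty) makes the robustness trade-off the review discusses concrete: Huber caps the influence of outliers that MSE would amplify quadratically.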
- Research Article
- 10.1186/s43067-025-00239-4
- Jul 31, 2025
- Journal of Electrical Systems and Information Technology
- Ian B Benitez + 1 more
Abstract With climate change driving the global push toward sustainable energy, the reliability of power systems increasingly depends on accurate forecasting methods. This study examined the role of machine learning (ML) in forecasting solar PV power output (SPVPO) and wind turbine power output (WTPO), identified the challenges posed by the intermittent nature of these renewable energy sources, surveyed current techniques, challenges, and future directions in ML-based forecasting, and proposed a standardized framework. Using the Mann–Whitney and Kruskal–Wallis tests, the results highlight the significant impact of key meteorological and operational variables on forecasting accuracy, as measured by MAPE and R-squared. Key features for SPVPO forecasting include solar irradiance, ambient temperature, and prior SPVPO, while wind speed, turbine speed, and prior wind power output are crucial for WTPO forecasting. Moreover, ensemble models, support vector machines, Gaussian processes, hybrid artificial neural networks, and decomposition-based hybrid models exhibit promising forecasting accuracy and reliability. Challenges such as data availability, complexity-interpretability trade-offs, and integration difficulties with energy management systems present opportunities for innovative solutions, including advanced data processing and calibration techniques, Big Data and IoT advancements, new ML techniques, and probabilistic approaches with desirable accuracy and robustness in forecasting SPVPO and WTPO. Additionally, expanding research to ensure model generalizability across diverse climate conditions and forecasting horizons is crucial for enhancing the reliability and efficiency of ML-based renewable energy forecasting.
- Research Article
- 10.1186/s43067-025-00252-7
- Jul 31, 2025
- Journal of Electrical Systems and Information Technology
- Wei Ba + 3 more
Abstract The economic load dispatch problem of microgrid strives to optimize the allocation of total power demand among generating units under specific constraints. Many optimization techniques have been used to solve this problem in power systems; however, achieving the optimal solution is considered difficult due to the involvement of a nonlinear objective function and a large search domain. In order to achieve economic load dispatch more quickly and accurately, a novel economic load dispatch method for microgrids based on a hybrid slime mould and genetic algorithm (GSMA) is proposed in this paper. Objective function models and their constraints based on wind, photovoltaic, energy storage, and fuel power generation are presented. In the early iterations of the method, crossover and mutation of the genetic algorithm are used to increase the diversity of the population. When the number of iterations reaches a threshold, the slime mould algorithm is used to improve adaptability to complex objective functions. A velocity matrix is introduced to adjust the direction and speed of individual movement, enhancing the searching ability of GSMA. For performance evaluation, GSMA is compared with the slime mould algorithm (SMA), grey wolf optimizer (GWO), sparrow search algorithm (SSA), Harris Hawks optimization (HHO), whale optimization algorithm (WOA), and particle swarm optimization (PSO) using standard optimization functions. The experimental results show that GSMA converges to the optimal solution faster than the other algorithms. The algorithms are used for economic load dispatch on the simulation test system; GSMA incurs the minimum dispatch cost and achieves the best dispatch results compared to the other algorithms. This further demonstrates the effectiveness of the new method in solving the economic load dispatch problem of microgrids.
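A highly simplified sketch of the two-phase idea — genetic-algorithm crossover and mutation in early iterations, then pulling individuals toward the best solution once an iteration threshold is reached — could look like the following. The sphere function, population sizes, and the exploitation rule are stand-ins, not the authors' GSMA or their dispatch model.

```python
# Two-phase hybrid: GA-style diversity first, then best-directed exploitation.
import random

random.seed(1)
DIM, POP, ITERS, SWITCH = 5, 30, 200, 100

def cost(x):                       # stand-in objective (sphere function)
    return sum(v * v for v in x)

pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]

for it in range(ITERS):
    pop.sort(key=cost)
    best = pop[0][:]
    if it < SWITCH:
        # GA phase: keep the better half, refill via crossover + mutation.
        parents = pop[: POP // 2]
        children = []
        while len(parents) + len(children) < POP:
            a, b = random.sample(parents, 2)
            child = [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]
            if random.random() < 0.3:                  # mutation for diversity
                k = random.randrange(DIM)
                child[k] += random.gauss(0, 0.5)
            children.append(child)
        pop = parents + children
    else:
        # Exploitation phase: shrink each individual toward the current best.
        step = 1.0 - it / ITERS
        pop = [[xi + step * random.random() * (bi - xi)
                for xi, bi in zip(x, best)] for x in pop]

pop.sort(key=cost)
print("best cost:", cost(pop[0]))
```

The switch at `SWITCH` mirrors the abstract's threshold: diversity-generating operators early, then an attraction rule (loosely mimicking slime-mould exploitation) once the search should concentrate.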
- Research Article
- 10.1186/s43067-025-00237-6
- Jul 30, 2025
- Journal of Electrical Systems and Information Technology
- Parsa Parsafar
Abstract In the contemporary urban landscape, ensuring the structural health and resilience of buildings and infrastructure is paramount for sustainable development and the well-being of citizens. This paper proposes a novel approach, termed Urban Sentinel, aimed at revolutionizing urban infrastructure management through the integration of Internet of Things (IoT) sensor networks and regression AI systems. This integration is still in its early stages of practical application, marking Urban Sentinel as a significant step forward in urban infrastructure management. Urban Sentinel encompasses a comprehensive system architecture designed to monitor and predict the health of buildings and infrastructure in cities or any other integrated district. Central to this architecture is the deployment of a proposed sensor set, strategically installed within buildings to capture critical data related to structural integrity, environmental conditions, and operational performance. These sensors transmit data using LoRaWAN wireless technology to a centralized management system, where a regression AI model harnesses the power of machine learning algorithms to analyze the data and predict the health status of the buildings. This system offers several advantages over traditional monitoring methods. By leveraging IoT technology, Urban Sentinel enables real-time data collection, allowing for the timely detection of anomalies and potential risks. The integration of regression AI systems enhances the predictive capabilities of the management system, enabling proactive maintenance and optimization of urban infrastructure. Additionally, this paper thoroughly addresses potential challenges and offers corresponding solutions to mitigate them effectively. By embracing innovative technologies and holistic approaches to infrastructure management, Urban Sentinel paves the way for smarter and more resilient cities of the future.
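To illustrate the "regression AI" component in its simplest possible form, a one-variable least-squares fit predicting a hypothetical health score from a single sensor reading might look as follows. The readings, the health scale, and the linear relationship are all invented.

```python
# Ordinary least squares for one predictor: slope and intercept from
# the closed-form covariance/variance formulas.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

vibration = [0.1, 0.2, 0.3, 0.4, 0.5]    # hypothetical sensor readings
health = [0.95, 0.90, 0.85, 0.80, 0.75]  # hypothetical health scores
m, b = fit_line(vibration, health)
print(m, b)  # data is exactly linear: slope -0.5, intercept 1.0
```

A deployed system would of course regress over many sensor channels and far noisier data; the point is only that "predicting health status" reduces to fitting a model of score against sensed features.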
- Research Article
- 10.1186/s43067-025-00242-9
- Jul 29, 2025
- Journal of Electrical Systems and Information Technology
- Toyeeb Adekunle Abd’azeez + 1 more
Abstract Rising energy consumption, driven by industrialisation and urbanisation, contributes significantly to climate change and household economic burdens. In response, this study developed a machine learning model to predict household energy consumption in residential settings. The dataset employed comprises timestamps, temperature, humidity, and weather data. Prior to model training, extensive exploratory data analysis, preprocessing, and feature engineering were conducted to maintain data quality and enhance model performance. After comparing several regression models, including the RandomForestRegressor, ExtraTreesRegressor, SupportVectorRegressor, and XGBRegressor algorithms, the ExtraTreesRegressor emerged as the optimal model, with an R2 score of 0.7441 and a MAPE of 16.27%. The lower MAPE and higher R2 score indicate the superiority of the ExtraTreesRegressor over the other algorithms. While energy consumption is characterised by high variance, our optimised model effectively interprets interactions between input features and predicts the energy consumed with a low RMSE of 11.75. This optimised model was integrated into a web application with an interactive user interface, whose application programming interface (API) enables users to make informed decisions about energy consumption, leading to potential energy savings and reduced environmental impact. The feature importance examined for the model's predictions revealed the pivotal role of the hour feature: the time of day reflects occupancy behaviours that affect energy consumption.
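The evaluation metrics reported here (R2, MAPE, RMSE) follow their standard definitions; a minimal sketch with made-up example values:

```python
# Regression metrics from first principles: R^2, MAPE (%), and RMSE.
import math

def r2_score(y_true, y_pred):
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

def mape(y_true, y_pred):
    # Mean absolute percentage error; assumes no zero targets.
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

actual = [100.0, 120.0, 90.0, 110.0]   # hypothetical kWh readings
pred = [98.0, 125.0, 85.0, 112.0]      # hypothetical model outputs
print(round(r2_score(actual, pred), 3),
      round(mape(actual, pred), 2),
      round(rmse(actual, pred), 2))
```

The pairing the abstract relies on is visible here: R2 measures variance explained relative to a mean-only baseline, while MAPE and RMSE express error in relative and absolute units respectively.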
- Research Article
- 10.1186/s43067-025-00243-8
- Jul 28, 2025
- Journal of Electrical Systems and Information Technology
- Amany M Sarhan + 6 more
Abstract Skin cancer, particularly melanoma, poses a critical global health challenge due to its high mortality rate. Early and precise detection is vital for effective treatment and better prognosis. Recent advancements in deep learning have shown significant promise in medical image analysis, including skin cancer classification. This study investigates the automated classification of skin lesions using the HAM10000 dataset, which features high-resolution images across seven distinct classes. We focus on utilizing deep learning, specifically convolutional neural networks (CNNs), to enhance the accuracy of skin lesion classification. Our research examines several CNN architectures, including XceptionNet, DenseNet201, DenseNet169, DenseNet121, MobileNetV2, and GoogleNet, alongside a customized CNN model tailored for skin cancer classification. We incorporate techniques such as data augmentation and transfer learning to further refine model performance. Hyperparameter optimization is achieved using the Ant Colony Optimization algorithm. The proposed models are evaluated on the HAM10000 dataset with standard metrics: accuracy, precision, recall, and F1-score. Our results highlight the effectiveness of deep learning in distinguishing between various skin cancer types, attaining values of 96.5%, 97.0%, and 97.0% across accuracy, precision, recall, and F1-score, showing improvements over existing state-of-the-art methods in classification accuracy. These findings offer significant implications for dermatology and healthcare by facilitating automated skin cancer classification, potentially aiding dermatologists in early diagnosis and improving patient outcomes. Additionally, this framework provides a foundation for future research in applying deep learning to medical image analysis and healthcare diagnostics.
- Research Article
- 10.1186/s43067-025-00241-w
- Jul 23, 2025
- Journal of Electrical Systems and Information Technology
- Paule Kevin Nembou Kouonchie + 2 more
Abstract With the level of motorization on the rise, road accidents are increasing, predominantly in developing countries. Many countries have developed strategies to ensure road safety, but the problem persists. In the case of Kenya, the country recorded 3369 deaths due to road accidents during the first nine months of the year 2024, with pedestrians and motorcyclists being the most affected groups. The government has integrated intelligent transport systems (ITS) to mitigate traffic congestion and, to some extent, prevent accidents, especially in Nairobi. Other countries have proposed vehicle-to-infrastructure (V2I) technology, a subset of ITS, as a better solution to reduce road accidents. The implementation of V2I necessitates having roadside units (RSUs) on the road network, which is very expensive in terms of deployment, operation, and maintenance costs. RSUs communicate with vehicles equipped with an onboard unit, and the exchange between them must be established with optimal performance by considering connectivity, packet delivery ratio, average downlink end-to-end delay, and the energy consumption of RSUs. This research aims to develop an optimal RSU deployment scheme for urban areas based on artificial intelligence. The objective is to optimally deploy RSUs operating in energy-saving mode using a hybrid genetic algorithm–particle swarm optimization (GA–PSO) technique. The Kilimani–Hurlingham road network, a section of the Nairobi road network, is used to test this model. The simulation results demonstrate the effectiveness of the hybridization of GA and PSO for the optimal deployment of RSUs, with significant communication results concerning connectivity, packet delivery ratio, and average downlink end-to-end delay. Two scenarios of packet exchange with 200 bytes (small packets) and 1024 bytes (large packets) were used for the simulation, and in both, the GA–PSO could obtain the best nodes at which to allocate the RSUs.
For example, the packet delivery ratio reached up to 62.36% for large packets and 79.08% for small packets. Also, regarding packets' end-to-end delay, the hybrid GA–PSO could place RSUs such that the maximum end-to-end delay was 3.51 ms for large packets, far less than the maximum acceptable delay of 20 ms in the V2I network. The method was validated through an analytical comparison between the results obtained when using GA and PSO individually and those obtained when using the hybrid GA–PSO; the comparisons showed the superiority of the hybridization over using either technique standalone.
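A toy sketch of hybridizing GA operators with PSO — a standard velocity/position update plus occasional GA-style mutation for diversity — might look like this. The sphere objective and all parameters are stand-ins for the RSU-placement problem, not the authors' model.

```python
# PSO with GA-style mutation: velocity update toward personal/global bests,
# plus a small mutation probability to keep the swarm diverse.
import random

random.seed(2)
DIM, SWARM, ITERS = 4, 20, 150
W, C1, C2, MUT = 0.7, 1.5, 1.5, 0.1   # inertia, cognitive, social, mutation rate

def cost(x):                           # stand-in objective (sphere function)
    return sum(v * v for v in x)

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=cost)[:]

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if random.random() < MUT:      # GA-style mutation on one coordinate
            pos[i][random.randrange(DIM)] += random.gauss(0, 0.3)
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=cost)[:]

print("best cost:", cost(gbest))
```

Because personal and global bests only ever improve, the mutation step can add exploration without degrading the incumbent solution, which is the usual motivation for this style of GA–PSO hybrid.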