Heuristic-guided BSO for efficient task scheduling in IoT-driven fog–cloud environment

Abstract

The rapid expansion of IoT devices demands efficient task scheduling in fog-cloud infrastructures. This study presents a Heuristic-Guided Butterfly Swarm Optimisation (BSO) algorithm, integrating the Minimum Completion Time (MCT) heuristic into BSO's initialization phase to enhance convergence and scheduling quality. A utility function balances processing time and execution cost while ensuring scalability across heterogeneous workloads. Simulations with 40-500 tasks demonstrate superiority over the TCaS, MPSO, BLA, and RR algorithms, achieving up to 33.5% faster execution in fog-only settings and 64.89% higher scheduling efficiency in fog-cloud environments with minimal cost overhead. These results confirm that the proposed approach is a scalable, cost-efficient solution for IoT-driven task scheduling.
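The MCT-seeded initialization the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the task-length and node-speed model, and the strategy of seeding one swarm member with the MCT schedule while the rest are random, are assumptions based on how MCT initialization is commonly combined with swarm methods.

```python
import random

def mct_schedule(task_lengths, node_speeds):
    """Minimum Completion Time heuristic: assign each task to the
    node that finishes it earliest, given current node loads."""
    ready = [0.0] * len(node_speeds)  # time at which each node becomes free
    assignment = []
    for length in task_lengths:
        # completion time of this task on each candidate node
        finish = [ready[n] + length / node_speeds[n] for n in range(len(node_speeds))]
        best = min(range(len(node_speeds)), key=finish.__getitem__)
        ready[best] = finish[best]
        assignment.append(best)
    return assignment

def init_population(task_lengths, node_speeds, pop_size, seed=0):
    """Seed one butterfly with the MCT schedule; the rest are random."""
    rng = random.Random(seed)
    population = [mct_schedule(task_lengths, node_speeds)]
    for _ in range(pop_size - 1):
        population.append([rng.randrange(len(node_speeds)) for _ in task_lengths])
    return population
```

Seeding a single heuristic individual is a common compromise: the swarm starts near a good region of the search space without losing the diversity that random members provide.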

Similar Papers
  • Research Article
  • 10.1007/s10586-025-05586-5
A hybrid DECSO algorithm for efficient multi objective task scheduling in cloud computing environments
  • Sep 19, 2025
  • Cluster Computing
  • Ali Gunduz + 2 more

One of the biggest problems in the rapidly developing field of cloud computing is efficient task scheduling. Task scheduling in cloud computing is recognized as an NP-complete problem, presenting significant challenges due to large task sizes and the complexity of efficiently managing diverse computational resources. Its aim is to assign tasks to virtual machines so as to minimize completion time and maximize resource utilization. To address these challenges, this study introduces a novel hybrid optimization algorithm named Differential Evolution Cat Swarm Optimization (DECSO). Unlike traditional hybrid approaches, DECSO dynamically balances exploration and exploitation, ensuring a more adaptive and efficient task scheduling strategy. DECSO combines the global exploration and adaptive capabilities of Differential Evolution (DE) with the local search efficiency of Cat Swarm Optimization (CSO). The proposed algorithm is compared with Particle Swarm Optimization (PSO) and CSO on the CloudSim simulation platform, using makespan, resource utilization, and migration time as metrics critical for efficient cloud task scheduling. The experimental results demonstrate that DECSO achieves up to a 22.6% reduction in makespan compared to CSO and 9.6% compared to PSO, an 11.9% improvement in resource utilization compared to CSO and 14.7% compared to PSO, and a 20.6% reduction in migration time compared to CSO and 11.2% compared to PSO.

  • Research Article
  • Cited by 3
  • 10.1038/s41598-025-99837-5
Modified grey wolf optimization for energy-efficient internet of things task scheduling in fog computing
  • Apr 27, 2025
  • Scientific Reports
  • Deafallah Alsadie + 1 more

Fog-cloud computing has emerged as a transformative paradigm for managing the growing demands of Internet of Things (IoT) applications, where efficient task scheduling is crucial for optimizing system performance. However, existing task scheduling methods often struggle to balance makespan minimization and energy efficiency in dynamic and resource-constrained fog-cloud environments. Addressing this gap, this paper introduces a novel Task Scheduling algorithm based on a modified Grey Wolf Optimization approach (TS-GWO), tailored specifically for IoT requests in fog-cloud systems. The proposed TS-GWO incorporates innovative operators to enhance exploration and exploitation capabilities, enabling the identification of optimal scheduling solutions. Extensive evaluations using both synthetic and real-world datasets, such as NASA Ames iPSC and HPC2N workloads, demonstrate the superior performance of TS-GWO over established metaheuristic methods. Notably, TS-GWO achieves improvements in makespan by up to 46.15% and reductions in energy consumption by up to 28.57%. These results highlight the potential of TS-GWO to effectively address task scheduling challenges in fog-cloud environments, paving the way for its application in broader optimization tasks.

  • Research Article
  • Cited by 1
  • 10.33889/ijmems.2025.10.2.025
Network Usage and Time Efficient Performance Enhancement Model for IoT Applications in Fog Cloud Computing
  • Apr 1, 2025
  • International Journal of Mathematical, Engineering and Management Sciences
  • Upma Arora + 1 more

Fog computing bridges IoT applications and cloud computing, providing low-latency services through local computation and storage. Despite its advantages, challenges such as efficient scheduling and placement of IoT applications on fog-cloud nodes hinder its widespread adoption. This manuscript presents a Performance Enhancement Algorithm for scheduling IoT applications in a fog-cloud environment. The algorithm comprises four key procedures that schedule application modules across the available infrastructure devices. Merge sort fused with the heterogeneous shortest module first strategy is the key to improving performance. The effectiveness of the proposed algorithm was evaluated using the iFogSim simulator, and the results demonstrate significant improvements, with total network usage improving by 66.31% over HSMF, 88.40% over edge-wards, and 98.66% over the cloud-only method. It also improves execution time significantly in most network configurations. Our research contributes a reliable means of placing applications in the fog-cloud infrastructure, as the fusion algorithm is stable and scalable for CPU-bound tasks in fog and cloud computing environments.

  • Book Chapter
  • Cited by 18
  • 10.1007/978-981-10-8657-1_22
Multi Objective Task Scheduling Algorithm for Cloud Computing Using Whale Optimization Technique
  • Jan 1, 2018
  • G Narendrababu Reddy + 1 more

The new and emerging IT paradigm, cloud computing, provides different options for customers to compute their tasks based on their choice and preference. Cloud systems provide services to customers as a utility. Customers are interested in the availability of service at low cost and minimization of task completion time. The performance of cloud systems depends on efficient scheduling of tasks. When a cloud server receives multiple user requests, the service provider must schedule the tasks to the appropriate resources to realize customer satisfaction. In this paper we propose a multi-objective Whale Optimization Algorithm (WOA) to schedule tasks in a cloud environment. WOA schedules tasks based on a fitness parameter that depends on three major constraints: resource utilization, quality of service, and energy. The proposed WOA schedules tasks based on these three parameters such that the task execution time and the cost of execution on virtual machines are minimal. The efficiency of the scheduling algorithm depends on minimizing the fitness parameter. The experimental results show that the proposed WOA scheduling algorithm provides superior results compared with existing algorithms.
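A scalar fitness of the kind this abstract describes, where several objectives are folded into one value to be minimized, might look like the following sketch. The linear combination, the weights, and the time/cost objectives are illustrative assumptions, not details taken from the paper.

```python
def fitness(schedule, task_lengths, node_speeds, node_cost,
            w_time=0.5, w_cost=0.5):
    """Illustrative weighted fitness: lower is better.
    Combines makespan (completion time of the busiest node) with
    total execution cost; weights are assumed, not from the paper."""
    loads = [0.0] * len(node_speeds)
    cost = 0.0
    for task, node in zip(task_lengths, schedule):
        runtime = task / node_speeds[node]
        loads[node] += runtime          # accumulate work per node
        cost += runtime * node_cost[node]  # pay per unit of runtime
    return w_time * max(loads) + w_cost * cost
```

Setting `w_time=1.0, w_cost=0.0` recovers a pure makespan objective, which is how such weighted formulations are usually sanity-checked.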

  • Research Article
  • Cited by 15
  • 10.1007/s10586-024-04712-z
An efficient deep reinforcement learning based task scheduler in cloud-fog environment
  • Nov 5, 2024
  • Cluster Computing
  • Prashanth Choppara + 1 more

Efficient task scheduling in cloud and fog computing environments remains a significant challenge due to the diverse nature and critical processing requirements of tasks originating from heterogeneous devices. Traditional scheduling methods often struggle with high latency and inadequate processing times, especially in applications demanding strict computational efficiency. To address these challenges, this paper proposes an advanced fog-cloud integration approach utilizing a deep reinforcement learning-based task scheduler, DRLMOTS (Deep Reinforcement Learning based Multi Objective Task Scheduler in Cloud Fog Environment). This novel scheduler intelligently evaluates task characteristics, such as length and processing capacity, to dynamically allocate computation to either fog nodes or cloud resources. The methodology leverages a Deep Q-Learning Network model and includes extensive simulations using both randomized workloads and real-world Google Jobs Workloads. Comparative analysis demonstrates that DRLMOTS significantly outperforms existing baseline algorithms such as CNN, LSTM, and GGCN, achieving a substantial reduction in makespan by up to 26.80%, 18.84%, and 13.83% and decreasing energy consumption by up to 39.60%, 30.29%, and 27.11%. Additionally, the proposed scheduler enhances fault tolerance, showcasing improvements of up to 221.89%, 17.05%, and 11.05% over conventional methods. These results validate the efficiency and robustness of DRLMOTS in optimizing task scheduling in fog-cloud environments.

  • Research Article
  • Cited by 112
  • 10.1016/j.jksuci.2020.11.002
Heuristic initialization of PSO task scheduling algorithm in cloud computing
  • Nov 13, 2020
  • Journal of King Saud University - Computer and Information Sciences
  • Seema A Alsaidy + 2 more

  • Research Article
  • Cited by 26
  • 10.1016/j.inffus.2023.102050
Adversarial Deep Learning based Dempster–Shafer data fusion model for intelligent transportation system
  • Oct 4, 2023
  • Information Fusion
  • Senthil Murugan Nagarajan + 5 more

  • Research Article
  • Cited by 1
  • 10.1002/cpe.70163
Enhanced Task Scheduling With Metaheuristics for Delay and Energy Optimization in Cloud‐Fog Computing
  • Jun 13, 2025
  • Concurrency and Computation: Practice and Experience
  • Pinky + 1 more

The growth of the Internet of Things (IoT) and its application across various industries has produced large volumes of data for processing. Tasks that require prompt responses, particularly delay-sensitive ones, are directed to the nearest fog nodes. Offloading critical tasks to the cloud reduces user-side energy consumption but increases latency due to longer transmission distances. Fog nodes, being closer to the source, minimize delay but may require more local energy. Another major issue in cloud-fog computing is allocating tasks to suitable resources according to task needs. To tackle these challenges, this study introduces a hybrid meta-heuristic approach by combining the Butterfly Swarm Optimization (BSO) algorithm with the heuristic Minimum Completion Time (MCT) initialization method. The key innovation of this work lies in the integration of MCT-based heuristic initialization with the BSO algorithm, enabling faster convergence and more efficient task scheduling by balancing energy and delay in heterogeneous cloud-fog environments. Both delay and energy consumption are reduced through the MCT-BSO algorithm, in which the fog broker effectively manages the task distribution. Simulation results show that the MCT-BSO method achieves delay reductions of approximately 20.7% to 36.3% and improvements in energy consumption ranging from 15.4% to 38.1%, significantly outperforming comparative algorithms such as Grey Wolf Optimization, Nondominated Sorting Genetic Algorithm II, and Modified Particle Swarm Optimization, particularly under high workload conditions.

  • Research Article
  • Cited by 46
  • 10.1016/j.jocs.2022.101828
Hybrid heuristic algorithm for cost-efficient QoS aware task scheduling in fog–cloud environment
  • Aug 19, 2022
  • Journal of Computational Science
  • Syed Mujtiba Hussain + 1 more

  • Research Article
  • Cited by 5
  • 10.1155/2023/4350615
Multiobjective Prioritized Workflow Scheduling in Cloud Computing Using Cuckoo Search Algorithm
  • Jul 7, 2023
  • Applied Bionics and Biomechanics
  • Babuli Sahu + 3 more

Effective workflow scheduling in cloud computing is still a challenging problem, as incoming workflows arrive from heterogeneous resources with variable task processing capacities and dependencies. Ineffective scheduling of workflows to virtual resources in a cloud environment leads to service level agreement violations and high energy consumption, which impacts the cloud provider's quality of service. Many existing authors have developed workflow scheduling algorithms addressing operational costs and makespan, but there is still room to improve the scheduling process in the cloud paradigm, as it is an NP-hard problem. Therefore, in this research, a task-prioritized multiobjective workflow scheduling algorithm was developed using the cuckoo search algorithm to precisely map incoming workflows onto corresponding virtual resources. Extensive simulations were carried out on WorkflowSim using randomly generated workflows from the simulator. To evaluate the efficacy of the proposed approach, we compared it with existing approaches: Max-Min, first come first serve, minimum completion time, Min-Min, resource allocation security with efficient task scheduling in cloud computing-hybrid machine learning, and Round Robin. Our proposed approach outperforms these methods, minimizing energy consumption by 15% and reducing service level agreement violations by 22%.

  • Conference Article
  • Cited by 21
  • 10.1109/icpads.2006.40
Efficient compile-time task scheduling for heterogeneous distributed computing systems
  • Jan 1, 2006
  • M.I Daoud + 1 more

Efficient task scheduling is essential for obtaining high performance in heterogeneous distributed computing systems (or HeDCSs). Because of its key importance, several scheduling algorithms have been proposed in the literature, which are mainly for homogeneous processors. Few scheduling algorithms are developed for HeDCSs. In this paper, we present a novel task scheduling algorithm, called the longest dynamic critical path (LDCP) algorithm, for HeDCSs. The LDCP algorithm is a list-based scheduling algorithm that uses a new attribute to effectively compute the priorities of tasks in HeDCSs. At each scheduling step, the LDCP algorithm selects the task with the highest priority and assigns the selected task to the processor that minimizes its finish execution time using an insertion-based scheduling policy. The LDCP algorithm successfully generates task schedules that outperform, to the best of our knowledge, two of the best scheduling algorithms for HeDCSs.
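The insertion-based policy mentioned above, placing a task into the earliest idle gap on each candidate processor and keeping the processor that minimizes its finish time, can be sketched as follows. This is a simplified illustration only: it omits LDCP's priority computation and DAG dependency constraints, and the data layout is an assumption.

```python
def earliest_slot(intervals, ready, duration):
    """Earliest start >= ready that fits `duration` on a processor
    whose busy periods are the sorted (start, end) list `intervals`."""
    start = ready
    for s, e in intervals:
        if start + duration <= s:  # fits in the gap before this busy period
            break
        start = max(start, e)      # otherwise try after it
    return start

def insert_task(schedules, proc, start, duration):
    """Record the task on `proc`, keeping intervals sorted by start time."""
    schedules[proc].append((start, start + duration))
    schedules[proc].sort()

def schedule_task(schedules, ready, durations):
    """Assign one task to the processor giving the earliest finish time.
    `durations[p]` is the task's execution time on processor p
    (processors are heterogeneous, so times differ per processor)."""
    best_proc, best_start = None, None
    for p, busy in enumerate(schedules):
        s = earliest_slot(busy, ready, durations[p])
        if best_proc is None or s + durations[p] < best_start + durations[best_proc]:
            best_proc, best_start = p, s
    insert_task(schedules, best_proc, best_start, durations[best_proc])
    return best_proc, best_start
```

The gap search is what distinguishes insertion-based policies from append-only list scheduling: a short task can slide into an idle window left by earlier assignments instead of waiting at the end of a processor's queue.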

  • Research Article
  • Cited by 268
  • 10.1007/s10586-020-03075-5
A novel hybrid antlion optimization algorithm for multi-objective task scheduling problems in cloud computing environments
  • Mar 12, 2020
  • Cluster Computing
  • Laith Abualigah + 1 more

Efficient task scheduling is considered as one of the main critical challenges in cloud computing. Task scheduling is an NP-complete problem, so finding the best solution is challenging, particularly for large task sizes. In the cloud computing environment, several tasks may need to be efficiently scheduled on various virtual machines by minimizing makespan and simultaneously maximizing resource utilization. We present a novel hybrid antlion optimization algorithm with elite-based differential evolution for solving multi-objective task scheduling problems in cloud computing environments. In the proposed method, which we refer to as MALO, the multi-objective nature of the problem derives from the need to simultaneously minimize makespan while maximizing resource utilization. The antlion optimization algorithm was enhanced by utilizing elite-based differential evolution as a local search technique to improve its exploitation ability and to avoid getting trapped in local optima. Two experimental series were conducted on synthetic and real trace datasets using the CloudSim tool kit. The results revealed that MALO outperformed other well-known optimization algorithms. MALO converged faster than the other approaches for larger search spaces, making it suitable for large scheduling problems. Finally, the results were analyzed using statistical t-tests, which showed that MALO obtained a significant improvement in the results.

  • Research Article
  • Cited by 32
  • 10.3390/su15065104
An Intelligent Task Scheduling Model for Hybrid Internet of Things and Cloud Environment for Big Data Applications
  • Mar 14, 2023
  • Sustainability
  • Souvik Pal + 5 more

One of the most significant issues in Internet of Things (IoT) cloud computing is scheduling tasks. Recent developments in IoT-based technologies have led to a meteoric rise in the demand for cloud storage. In order to load the IoT services onto cloud resources efficiently even while satisfying the requirements of the applications, sophisticated planning methodologies are required. This is important because several processes must be well prepared on different virtual machines to maximize resource usage and minimize waiting times. Different IoT application tasks can be difficult to schedule in a cloud-based computing architecture due to the heterogeneous features of IoT. With the rise in IoT sensors and the need to access information quickly and reliably, fog cloud computing is proposed for the integration of fog and cloud networks to meet these demands. One of the most important necessities in a fog cloud setting is efficient task scheduling, as this can help to lessen the time it takes for data to be processed and improve QoS (quality of service). The overall processing time of IoT programs should be kept as short as possible by effectively planning and managing their workloads, taking into account limitations such as task scheduling. Finding the ideal approach is challenging, especially for big data systems, because task scheduling is a complex issue. This research provides a Deep Learning Algorithm for Big data Task Scheduling System (DLA-BDTSS) for the Internet of Things (IoT) and cloud computing applications. When it comes to reducing energy costs and end-to-end delay, an optimized scheduling model based on deep learning is used to analyze and process various tasks. The method employs a multi-objective strategy to shorten the makespan and maximize resource consumption. A regional exploration search technique improves the optimization algorithm’s capacity to exploit data and avoid becoming stuck in local optimization. 
DLA-BDTSS was compared to other well-known task allocation methods using trace data and the CloudSim toolkit. The investigation showed that DLA-BDTSS performed better than the other well-known algorithms: it converged faster than competing approaches, making it beneficial for big data task scheduling scenarios, and achieved an 8.43% improvement in outcomes, with an execution time of 34 s and a fitness value evaluation of 76.8%.

  • Conference Article
  • 10.1109/iconstem.2017.8261249
A novel method for scheduling workflows in cloud computing environment
  • Mar 1, 2017
  • G Narendrababu Reddy + 1 more

The new and emerging IT paradigm, cloud computing, provides different options for customers to compute their tasks based on their choice and preference. Cloud systems provide services to customers as a utility. Customers are interested in the availability of service at low cost and minimization of task completion time. The performance of cloud systems depends on efficient scheduling of tasks. Groups of interdependent tasks are referred to as workflows, and workflow task scheduling plays an important role in estimating cloud system performance. Minimizing the task execution time (makespan) tends to increase the associated cost. Here we propose a novel scheduling method that minimizes both the cost and the time required to schedule workflow tasks. The algorithm schedules the tasks of a workflow to complete execution in the shortest feasible time so as to minimize the price of the services provided to customers. The experimental results show that the proposed method minimizes makespan and price in scheduling workflows when compared with other existing algorithms.

  • Research Article
  • 10.14741/ijcet/v.15.3.4
An Efficient Bio-Inspired Optimization Framework for Scalable Task Scheduling in Cloud Computing Environments
  • May 15, 2025
  • International Journal of Current Engineering and Technology
  • Gopikrishna Maddali

The need for sharing and using resources is growing at a fast pace, which poses several issues for cloud computing (CC) as the number of users increases. For this reason, job scheduling with load balancing across resources is a crucial area for improving performance. High energy usage and underutilised resources are two of the major obstacles to effective task scheduling. To address this, we propose a bio-inspired optimization framework utilizing the Lyrebird Falcon Optimization (LFO) algorithm, which mimics lyrebird behavior through two key phases: escaping (exploration) and hiding (exploitation). This population-based metaheuristic dynamically updates task assignments to minimize makespan and energy usage while enhancing CPU and resource utilization. The algorithm was implemented in CloudSim and evaluated across various task loads (1000–5000 tasks). Experimental results demonstrate that LFO consistently achieves lower makespan (from 22.13 s to 18.78 s) and energy consumption (from 21.67 kW to 23.70 kW) compared to the traditional Fruit Fly Optimization Algorithm (FOA), highlighting its efficiency. The key advantages of this work include its ability to minimize energy consumption while optimizing resource utilization, scalability to large-scale cloud environments, and improved performance, making it a promising solution for sustainable and efficient task scheduling in cloud computing.
