Abstract
Optical networks are increasingly employed to meet the ever-growing data transfer demands of grid applications, giving rise to the concept of the "optical grid". Task scheduling is an important issue for an optical grid, because it allocates both grid and optical network resources to accelerate application execution and increase the resource utilization ratio. However, most task scheduling algorithms are based on theoretical models and may therefore exhibit accuracy deviations between the scheduled results and the actual finish times of the applications. Such deviations can lead to inefficient resource utilization and unsatisfactory Quality of Service (QoS). This paper aims to improve the accuracy of task scheduling algorithms in optical grid environments. We first present a theoretical task scheduling algorithm and demonstrate that its scheduling results deviate from the actual finish times in a real optical grid environment. We then identify several factors that are likely to influence scheduling accuracy and develop a realistic task scheduling algorithm. We evaluate both the theoretical and the realistic task scheduling algorithms on our optical grid testbed. The experimental results show that scheduling accuracy can be improved significantly by the realistic task scheduling algorithm.
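As a minimal, hypothetical illustration of the accuracy-deviation notion referred to above (not the paper's own metric or implementation), the Python sketch below compares a scheduler's estimated finish times with the finish times observed on a testbed and reports the mean relative deviation; all names and data values are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class TaskRecord:
    """Estimated vs. observed finish time of one scheduled task (hypothetical structure)."""
    name: str
    estimated_finish: float  # finish time predicted by the scheduling algorithm (seconds)
    actual_finish: float     # finish time measured on the optical grid testbed (seconds)


def relative_deviation(task: TaskRecord) -> float:
    """Relative deviation of the scheduled result from the actual finish time."""
    return abs(task.actual_finish - task.estimated_finish) / task.actual_finish


def mean_scheduling_deviation(tasks: List[TaskRecord]) -> float:
    """Average relative deviation over all tasks; lower values indicate higher scheduling accuracy."""
    return sum(relative_deviation(t) for t in tasks) / len(tasks)


if __name__ == "__main__":
    # Example: a theoretical scheduler that underestimates network and execution delays.
    records = [
        TaskRecord("task-A", estimated_finish=120.0, actual_finish=135.0),
        TaskRecord("task-B", estimated_finish=80.0, actual_finish=92.0),
    ]
    print(f"mean relative deviation: {mean_scheduling_deviation(records):.2%}")
```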