Abstract

Over the last few years, Web-based services, particularly E-commerce applications, have become quite popular, resulting in exponential growth in Web traffic. In many situations, this has led to unacceptable response times and unavailability of services, driving customers away. Many companies address this problem by deploying multiple Web servers behind a front-end load balancer. Load balancing has proven to be an effective and scalable way of managing ever-increasing Web traffic. However, little work has analyzed the performance characteristics of a system that uses a load balancer. This paper presents a queuing model for analyzing load balancing with two Web servers. We first analyze the centralized load balancing model, derive the average response time and the rejection rate, and compare three different routing policies at the load balancer. We then extend our analysis to distributed load balancing and find the optimal routing policy that minimizes the average response time.
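
The abstract does not spell out the three routing policies or the service-time assumptions of the model. As a rough illustration only, the Python sketch below simulates two exponential-service Web servers behind a dispatcher and estimates the mean response time under three plausible policies (random, round-robin, and route-to-least-backlog). The arrival rate, service rate, and policy names are assumptions chosen for illustration, not the paper's derived model.

    # A minimal sketch (not the paper's queuing model): two FIFO servers with
    # exponential service times behind a dispatcher. We estimate the mean
    # response time under a few candidate routing policies. The arrival rate
    # `lam`, service rate `mu`, and policy names are illustrative assumptions.
    import random

    def simulate(policy, lam=1.6, mu=1.0, n_jobs=200_000, seed=0):
        rng = random.Random(seed)
        t = 0.0                      # current arrival time
        free_at = [0.0, 0.0]         # time at which each server becomes idle
        rr = 0                       # round-robin pointer
        total_resp = 0.0
        for _ in range(n_jobs):
            t += rng.expovariate(lam)                # Poisson arrivals
            if policy == "random":
                s = rng.randrange(2)                 # pick a server uniformly
            elif policy == "round_robin":
                s, rr = rr, 1 - rr                   # alternate servers
            else:                                    # "least_work": send the job to
                s = 0 if free_at[0] <= free_at[1] else 1   # the least-backlogged server
            start = max(t, free_at[s])               # wait if the server is busy
            free_at[s] = start + rng.expovariate(mu) # exponential service time
            total_resp += free_at[s] - t             # waiting + service time
        return total_resp / n_jobs

    for policy in ("random", "round_robin", "least_work"):
        print(f"{policy:12s} mean response ~ {simulate(policy):.3f}")

Running the sketch shows the qualitative effect the paper studies: policies that use more information about server state (here, route-to-least-backlog) yield a lower mean response time than uninformed random or round-robin routing at the same load.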
