Abstract

In this paper, we study the mean delay and maximum buffer requirements of highly bursty data traffic in an ATM node at different levels of burstiness. The performance study is carried out with an event-driven simulation program that models both real-time and data traffic. We assume that data traffic is loss-sensitive, so a large buffer (fat bucket) is allocated to it to accommodate sudden long bursts of cells. Real-time traffic is delay-sensitive, so we impose input traffic shaping on it using a leaky-bucket-based input rate control method. Channel capacity is allocated according to the average arrival rate of each input source in order to maximize channel utilization. Simulation results show that both the maximum buffer requirement and the mean node delay for data traffic are directly proportional to the burstiness of the input traffic. Results for the mean node delay and cell loss probability of real-time traffic are also analyzed. The simulation program is written in C++ and has been verified using the zero-mean statistics concept, by comparing simulation results with known theoretical or observed results.
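For concreteness, the following is a minimal sketch of a leaky-bucket input rate controller of the kind the abstract describes, written in C++ since that is the language of the simulation program. The class name LeakyBucket, the parameters drain_rate and bucket_depth, and the policy of dropping non-conforming real-time cells (rather than queueing them, given their delay sensitivity) are illustrative assumptions, not details taken from the paper.

#include <cstdio>

// Sketch of a leaky-bucket input rate controller (assumed interface).
class LeakyBucket {
public:
    LeakyBucket(double drain_rate, double bucket_depth)
        : rate_(drain_rate), depth_(bucket_depth) {}

    // Returns true if a cell arriving at time t (in cell slots) conforms
    // to the admitted rate. Non-conforming real-time cells are assumed
    // to be dropped, since real-time traffic is delay-sensitive.
    bool conforms(double t) {
        // Drain the bucket for the time elapsed since the last arrival.
        level_ -= rate_ * (t - last_t_);
        if (level_ < 0.0) level_ = 0.0;
        last_t_ = t;
        if (level_ + 1.0 > depth_) return false; // bucket would overflow
        level_ += 1.0;                           // admit one cell's worth
        return true;
    }

private:
    double rate_;        // long-term admitted rate (cells per slot)
    double depth_;       // burst tolerance (cells)
    double level_ = 0.0; // current bucket occupancy
    double last_t_ = 0.0;
};

int main() {
    LeakyBucket lb(0.5, 4.0); // admit 0.5 cells/slot, tolerate 4-cell bursts
    int dropped = 0;
    for (int i = 0; i < 10; ++i)              // a back-to-back 10-cell burst
        if (!lb.conforms(static_cast<double>(i))) ++dropped;
    std::printf("dropped %d of 10 cells\n", dropped);
    return 0;
}

The bucket drains at the admitted rate and fills by one unit per arriving cell; a sustained burst beyond the burst tolerance overflows the bucket and the excess cells are rejected, which is how the shaper bounds the burstiness seen by the node.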
