Abstract

We model the goodput of a single TCP source on a wireless link experiencing sudden increases in Round Trip Time (RTT), that is, delay spikes. Such spikes trigger spurious timeouts that reduce TCP goodput. Renewal reward theory is used to derive a straightforward expression for TCP goodput that takes into account limited sending rates (limited window size), packet losses due to congestion, and the delay spike properties, such as the average spike duration and the distribution of the spike intervals. The basic model assumes independent and identically distributed (i.i.d.) spike intervals; correlated spike intervals are modelled by using a modulating background Markov chain. Validation by ns2 simulations shows excellent agreement for lossless scenarios and good accuracy for moderate loss scenarios (packet loss probabilities below 5%). Numerical studies have also been performed to assess the impact of different spike interval distributions on TCP performance. Copyright © 2007 John Wiley & Sons, Ltd.
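The renewal-reward idea above can be illustrated with a toy Monte Carlo sketch: each spike interval is one renewal cycle, the "reward" is the data delivered during that cycle, and goodput is estimated as total reward over total cycle time. This is a simplified illustration under assumed dynamics (exponential i.i.d. spike intervals, linear window growth to a cap, full window reset at each spurious timeout), not the paper's exact model; all parameter names are hypothetical.

```python
import random

def renewal_reward_goodput(n_cycles=100_000, mean_interval=50.0,
                           spike_duration=5.0, w_max=20, mss=1.0, rtt=1.0,
                           seed=0):
    """Toy renewal-reward estimate of TCP goodput under delay spikes.

    Assumptions (illustrative only): spike intervals are i.i.d.
    exponential; during an interval the window grows by one MSS per RTT
    up to w_max (congestion avoidance); the spike causes a spurious
    timeout, so nothing is delivered for its duration and the window
    resets to 1 at the start of the next cycle.
    """
    rng = random.Random(seed)
    total_reward = 0.0   # packets delivered over all cycles
    total_time = 0.0     # total elapsed time (interval + spike)
    for _ in range(n_cycles):
        interval = rng.expovariate(1.0 / mean_interval)
        rtts = int(interval / rtt)
        # one window of data per RTT: window is 1, 2, ..., capped at w_max
        delivered = sum(min(1 + k, w_max) for k in range(rtts)) * mss
        total_reward += delivered
        total_time += interval + spike_duration
    # Renewal reward theorem: long-run goodput = E[reward] / E[cycle length]
    return total_reward / total_time
```

Lengthening the spike duration (or shortening the mean spike interval) lowers the estimate, matching the qualitative effect of spurious timeouts described in the abstract.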

