Abstract

On-demand streaming from a remote server over the best-effort Internet poses several challenges because of network losses and variable delays. The primary technique used to improve the quality of distributed content services is replication; in the Internet, Web caching is the traditional mechanism used for this purpose. In this article we develop a new staged delivery model for a distributed architecture in which video is streamed from remote servers to edge caches, where it is buffered and then streamed to the client over a last-mile connection. The model uses a novel revolving indexed cache buffer management mechanism at the edge cache and employs selective retransmission of lost packets between the remote server and the edge cache for best-effort recovery of losses. The Web cache buffer management scheme includes dynamic adjustment of cache buffer parameters based on network conditions. The performance of the buffer management and retransmission policies at the edge cache is modeled and assessed using a probabilistic analysis of the streaming process as well as system simulations, and the influence of different endogenous control parameters on the quality of the stream received by the client is studied. Calibration curves relating the QoS metrics to different network conditions are obtained from the simulations; ISPs can use these curves to manage the edge cache and to set the endogenous control parameters for a specific QoS in real-time streaming operations. In addition, a methodology for benchmarking transmission characteristics using real-time traffic data is developed to enable effective decisions on edge cache buffer allocation and management strategies.
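To make the staged delivery idea concrete, the sketch below shows a minimal revolving (circular) indexed buffer at an edge cache: packets from the remote server are placed by sequence number into a fixed-size ring, gaps behind the playout point can be reported for selective retransmission, and packets are released in order to the last-mile link. This is an illustrative assumption of how such a buffer could be organized, not the paper's actual implementation; the class name, buffer size, and method names are hypothetical.

```python
class RevolvingCacheBuffer:
    """Illustrative revolving indexed buffer at an edge cache (sketch only)."""

    def __init__(self, capacity=256):
        self.capacity = capacity        # buffer size in packets (assumed parameter)
        self.slots = [None] * capacity  # ring storage indexed by seq % capacity
        self.playout = 0                # next sequence number to stream to the client

    def insert(self, seq, payload):
        """Store a packet from the remote server if it falls inside the window."""
        if self.playout <= seq < self.playout + self.capacity:
            self.slots[seq % self.capacity] = payload
            return True
        return False                    # too late or too far ahead; dropped

    def missing(self, lookahead):
        """Sequence numbers absent within the lookahead window:
        candidates for a selective retransmission request to the remote server."""
        window = min(lookahead, self.capacity)
        return [s for s in range(self.playout, self.playout + window)
                if self.slots[s % self.capacity] is None]

    def stream_next(self):
        """Release the packet at the playout point to the last-mile connection.
        Returns None if it was lost and never recovered (best-effort delivery)."""
        idx = self.playout % self.capacity
        payload = self.slots[idx]
        self.slots[idx] = None          # free the slot so the buffer can revolve
        self.playout += 1
        return payload


# Example: packet 2 is lost in transit, so the edge cache would request it again.
buf = RevolvingCacheBuffer(capacity=8)
for seq in (0, 1, 3, 4):
    buf.insert(seq, f"pkt{seq}")
print(buf.missing(lookahead=5))         # -> [2]
```

Indexing by sequence number modulo the capacity lets the buffer revolve without copying data, while the lookahead window bounds how far ahead retransmission requests may reach; the dynamic adjustment of parameters such as capacity and lookahead based on network conditions is what the article's calibration curves are intended to guide.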
