The quality of multimedia communicated over the Internet is highly sensitive to packet loss. In this letter, we develop a time-series prediction model for the end-to-end packet loss rate (PLR). An estimate of the PLR is needed in several transmission control mechanisms, such as TCP-friendly congestion control for UDP traffic; it is also needed to estimate the amount of redundancy for the forward error correction (FEC) mechanism. An accurate prediction would therefore be very valuable. We use a relatively novel approach called the sparse basis prediction model. It is an adaptive nonlinear prediction approach in which a very large dictionary of candidate inputs is extracted from the time series (for example, through moving averages, nonlinear transformations, etc.). Only a few of the best inputs from the dictionary are selected and combined linearly. An algorithm adaptively updates the input selection (as well as the weights) in a computationally efficient way each time a new sample arrives. Simulation experiments indicate significantly better prediction performance for the sparse basis approach than for other traditional nonlinear approaches.
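The dictionary-and-selection idea can be sketched as follows. This is a minimal illustration only, not the letter's adaptive algorithm: it uses a synthetic stand-in for a PLR series, a hand-picked dictionary (lags, squared lags, moving averages), one-shot correlation-based selection, and a batch least-squares fit instead of the letter's per-sample adaptive update. All names and parameter choices here are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a packet-loss-rate series: a slow oscillation
# around 5% loss plus noise, clipped to the valid [0, 1] range.
n = 500
plr = np.clip(0.05 + 0.02 * np.sin(np.arange(n) / 15.0)
              + 0.01 * rng.standard_normal(n), 0.0, 1.0)

def build_dictionary(x, max_lag=5, ma_windows=(3, 5, 10)):
    """Candidate inputs for predicting x[t]: lagged samples, a simple
    nonlinear transform (square) of each lag, and moving averages of
    past samples. These are illustrative choices of dictionary entries."""
    start = max(max_lag, max(ma_windows))
    cols, names = [], []
    for lag in range(1, max_lag + 1):
        cols.append(x[start - lag:len(x) - lag]); names.append(f"lag{lag}")
        cols.append(x[start - lag:len(x) - lag] ** 2); names.append(f"lag{lag}^2")
    for w in ma_windows:
        # ma[i] = mean of x[i-w+1 .. i]; use ma[t-1] as a predictor for x[t].
        ma = np.convolve(x, np.ones(w) / w, mode="full")[:len(x)]
        cols.append(ma[start - 1:len(x) - 1]); names.append(f"ma{w}")
    y = x[start:]
    return np.column_stack(cols), y, names

def select_and_fit(D, y, k=3):
    """Keep only the k dictionary columns most correlated with the
    target, then combine them linearly via least squares."""
    corr = np.abs([np.corrcoef(D[:, j], y)[0, 1] for j in range(D.shape[1])])
    idx = np.argsort(corr)[-k:]
    w, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
    return idx, w

D, y, names = build_dictionary(plr)
idx, w = select_and_fit(D, y, k=3)
pred = D[:, idx] @ w
mse = np.mean((pred - y) ** 2)
print("selected inputs:", [names[j] for j in idx], "mse:", mse)
```

In the letter's setting, the selection and the weights would be revisited each time a new sample arrives rather than fit once in batch; the sketch above only shows the sparse structure (a large dictionary, a small selected subset, a linear combination).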