Abstract

360-degree video has become popular due to the immersive experience it provides to the viewer. While watching, the viewer can control the field of view (FoV; in this paper, we use viewport interchangeably with FoV) within a range of 360° by 180°. As this trend continues, adaptive bitrate (ABR) streaming for 360 videos is becoming a prevalent concern. Most existing ABR algorithms for 360 videos (360 ABR algorithms) require real-time head traces and certain computational resources from the client, which largely constrains the potential audience. Moreover, while more 360 ABR algorithms rely on machine learning (ML) for viewport prediction, ML and ABR remain research topics that grow mostly independently. In this paper, we propose a two-fold ABR algorithm for 360 video streaming that utilizes 1) an off-the-shelf ABR algorithm for ordinary videos, and 2) an off-the-shelf viewport prediction model. Our algorithm requires neither real-time head traces nor additional computation from the viewing device. In addition, it adapts easily to the newest developments in viewport prediction and ABR. As a consequence, the proposed method fits nicely into the existing streaming framework, and any advancement in viewport prediction or ABR can enhance its performance. Through quantitative experiments, we demonstrate that the proposed method achieves twice the quality of experience (QoE) of the baseline.
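To make the two-fold composition concrete, the following is a minimal sketch of how a server could combine the two off-the-shelf components: an ordinary ABR algorithm picks a total bitrate for the next segment, and a viewport prediction model's per-tile probabilities weight how that budget is split across tiles. The `Tile` dataclass, the `allocate_tile_bitrates` helper, the tiling scheme, and the probability-weighted split are all illustrative assumptions, not the paper's actual interface.

```python
# A minimal sketch, assuming hypothetical interfaces: the per-tile viewing
# probabilities would come from an off-the-shelf viewport prediction model,
# and total_bitrate_kbps from an off-the-shelf ABR algorithm for ordinary
# videos. All names and the tiling scheme are illustrative.

from dataclasses import dataclass
from typing import List


@dataclass
class Tile:
    index: int          # position in the equirectangular tiling
    view_prob: float    # predicted probability the tile falls in the viewport


def allocate_tile_bitrates(
    tiles: List[Tile],
    total_bitrate_kbps: float,
    ladder_kbps: List[float],
) -> List[float]:
    """Distribute the bitrate chosen by an ordinary ABR algorithm across tiles,
    weighting each tile by its predicted viewing probability and snapping the
    result to the nearest rung of the encoding ladder."""
    total_prob = sum(t.view_prob for t in tiles) or 1.0
    bitrates = []
    for tile in tiles:
        share = total_bitrate_kbps * tile.view_prob / total_prob
        # Snap the per-tile share to the closest available encoding.
        bitrates.append(min(ladder_kbps, key=lambda r: abs(r - share)))
    return bitrates


# Example: the ABR algorithm picks 8000 kbps for the next segment, and the
# viewport predictor (run server-side, with no real-time head traces from the
# client) supplies per-tile probabilities.
tiles = [Tile(i, p) for i, p in enumerate([0.05, 0.30, 0.45, 0.15, 0.05])]
print(allocate_tile_bitrates(tiles, 8000.0, [500.0, 1000.0, 2000.0, 4000.0]))
```

Because both components are consumed through such generic interfaces, either one can be swapped for a newer model or ABR scheme without changing the allocation step.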
