In recent years, we have witnessed a boom in virtual reality (VR). 21 million wearable VR headsets are projected to ship in 2017, generating $4.9 billion in revenue [3]. Among the available options, mobile VR powered by smartphones is the most popular, contributing 98% of sales [1]. Despite being at an early stage, it appeals to the general public with its low cost (~$100) and convenience (no wiring). Mobile VR aims to offer users a ubiquitous, high-fidelity experience: access to VR "anytime, anywhere," whether they roam or remain static, with smooth, high-resolution panoramic views throughout. This demands high bandwidth and stringent end-to-end latency to keep the graphical display synchronized with user motion.

A promising approach to enabling ubiquitous mobile VR is the edge-based scheme over 4G LTE networks. As shown in Figure 1, the VR headset reports sensed user motion to edge servers through the LTE network; the edge servers accept this input and deliver the requested graphics, thereby offloading computation-intensive processing from the battery-powered user device. Ubiquitous access is provided by the LTE network, the only large-scale wireless infrastructure offering universal coverage and seamless mobility.

In this work [2], we examine several common perceptions and study medium-quality mobile VR (60 frames per second at 1080p resolution) over operational LTE networks. We show that, contrary to common understanding, bandwidth tends not to be the main bottleneck for medium-quality VR; instead, network latency poses the biggest obstacle. A large portion of this latency stems not from wireless data transfer but from the LTE signaling operations that facilitate wireless data delivery. These operations exhibit two categories of latency deficiency: (1) inter-protocol incoordination, in which problematic interplay between protocols incurs unnecessary delays; and (2) single-protocol overhead, in which each protocol's own signaling actions incur unavoidable delays.

Our analysis, together with an 8-month empirical study over four US mobile carriers, examines five common beliefs about LTE network latency under both static and mobile scenarios and shows that they do not hold; in fact, they act as roadblocks to mobile VR. Three of our findings center on three existing mechanisms for data-plane signaling, all well known in the literature, whose deficiencies have not been studied from the latency perspective, particularly for delay-sensitive mobile VR applications. We further describe a new source of long latency that has not been reported in the literature, and we quantify the impact of each finding under VR traffic.

We devise LTE-VR, a device-side solution for mobile VR that requires no changes to hardware or infrastructure. It adapts the signaling operations while remaining standards compliant: it reactively mitigates unnecessary latency between protocols and proactively masks the unavoidable latency within each protocol. LTE-VR exploits two ideas. First, it applies a cross-layer design to ensure fast loss detection and recovery (sketched below) and to minimize duplicates during handover. Second, it leverages rich side-channel information available only at the device to further reduce latency.
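To make the first idea concrete, the following is a minimal, hypothetical sketch of device-side early loss detection, not LTE-VR's actual implementation: a gap in the sequence numbers of arriving PDUs triggers an immediate recovery request instead of waiting for a reordering timer (typically tens of milliseconds) to expire. The class name, callback, and sample values are illustrative assumptions.

    # Hypothetical sketch: device-side early loss detection.
    # A gap in the sequence numbers (SNs) of arriving PDUs triggers an
    # immediate recovery request instead of waiting out the reordering
    # timer (tens of ms). All names and values are assumptions.

    class EarlyLossDetector:
        def __init__(self, request_retransmission):
            self.expected_sn = 0
            self.request_retransmission = request_retransmission

        def on_pdu(self, sn):
            if sn == self.expected_sn:
                self.expected_sn = sn + 1
            elif sn > self.expected_sn:
                # Gap observed: ask for the missing PDUs right away,
                # rather than letting the reordering timer run out first.
                self.request_retransmission(list(range(self.expected_sn, sn)))
                self.expected_sn = sn + 1
            # sn < expected_sn: a late duplicate (e.g., around a handover); drop it.

    detector = EarlyLossDetector(lambda sns: print("early NACK for", sns))
    for sn in [0, 1, 4, 2]:    # PDUs 2 and 3 are lost or delayed
        detector.on_pdu(sn)    # prints: early NACK for [2, 3]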
We have prototyped LTE-VR with USRP and OpenAirInterface. Our evaluation shows that LTE-VR reduces the frequency of frames missing the human delay tolerance by 3.7× on average. It meets the delay tolerance with 95% probability, approximating the lower bound, and achieves a latency reduction comparable to a 10× expansion of the wireless bandwidth. Furthermore, LTE-VR incurs only marginal signaling overhead (5% more messages) and extra resource usage (0.1% more bandwidth and 2.3% more radio grants).

We further note that our findings would carry over to the upcoming 5G. LTE-VR is applicable to 5G scenarios as well: it complements the proposed 5G radio while providing hints for 5G signaling design.
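As a closing illustration of the earlier claim that bandwidth tends not to be the main bottleneck for medium-quality VR, the rough arithmetic below compares the per-frame air time of a 1080p, 60 fps stream against a single signaling stall. Every number here (encoded bitrate, link rates, stall duration) is an illustrative assumption, not a measurement from our study.

    # Back-of-envelope sketch: why raw bandwidth matters less than signaling latency.
    # All numbers are illustrative assumptions, not measured results.

    FPS = 60                              # medium-quality VR frame rate
    ENCODED_BITRATE_MBPS = 12             # assumed encoded bitrate for 1080p at 60 fps
    FRAME_BITS = ENCODED_BITRATE_MBPS * 1e6 / FPS

    def air_time_ms(link_mbps):
        """Time to deliver one encoded frame over the radio link (overheads ignored)."""
        return FRAME_BITS / (link_mbps * 1e6) * 1e3

    budget_ms = 1e3 / FPS                 # ~16.7 ms per-frame budget
    baseline = air_time_ms(30)            # assumed 30 Mbps LTE downlink
    expanded = air_time_ms(300)           # hypothetical 10x bandwidth
    stall_ms = 50                         # assumed single signaling stall (e.g., handover)

    print(f"per-frame budget:       {budget_ms:.1f} ms")
    print(f"air time at 30 Mbps:    {baseline:.1f} ms")     # ~6.7 ms
    print(f"air time at 300 Mbps:   {expanded:.2f} ms")     # ~0.67 ms
    print(f"saved by 10x bandwidth: {baseline - expanded:.1f} ms vs a {stall_ms} ms signaling stall")

Under these assumptions, a 10× faster link saves only a few milliseconds per frame, while a single signaling stall can consume several frame budgets, which is consistent with the latency-centric focus of this work.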