Head pointing is widely used for hands-free input in head-mounted displays (HMDs). The primary role of head movement in an HMD is to control the viewport through an absolute mapping of head rotation to the 3D environment. Head pointing is conventionally supported by the same 1:1 mapping, with a cursor fixed in the centre of the view, but this requires exaggerated head movement and limits input granularity. In this work, we propose adopting dynamic gain to improve ergonomics and precision, and introduce the HeadShift technique. The design of HeadShift is grounded in natural eye-head coordination, managing control of the viewport and the cursor at different speeds. We evaluated HeadShift in a Fitts’ Law experiment and in three different VR applications, finding that the technique reduces error rate and effort. The findings are significant as they show that gain can be adopted effectively for head pointing while keeping the cursor within a comfortable eye-in-head viewing range.
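
As a minimal illustrative sketch only: the abstract does not specify HeadShift’s actual transfer function, so the thresholds, gain range, and function names below are assumptions. It shows the general idea of a speed-dependent cursor gain in which slow head rotation drives the cursor at reduced gain for precision, while fast rotation approaches the 1:1 viewport mapping so the cursor stays near the view centre and within a comfortable eye-in-head range.

```python
# Hypothetical speed-dependent gain for head pointing; all constants are
# assumptions for illustration, not values from the paper.
SLOW_SPEED = 5.0    # deg/s at or below which gain is minimal (assumed)
FAST_SPEED = 60.0   # deg/s at or above which gain saturates at 1.0 (assumed)
MIN_GAIN = 0.3      # cursor moves at 30% of head speed when moving slowly (assumed)

def cursor_gain(head_speed_deg_s: float) -> float:
    """Map head rotation speed to a cursor gain in [MIN_GAIN, 1.0]."""
    t = (head_speed_deg_s - SLOW_SPEED) / (FAST_SPEED - SLOW_SPEED)
    t = min(max(t, 0.0), 1.0)  # clamp interpolation factor to [0, 1]
    return MIN_GAIN + (1.0 - MIN_GAIN) * t

def update_cursor(cursor_angle_deg: float, head_delta_deg: float, dt_s: float) -> float:
    """Advance the cursor by the gained head rotation; the viewport itself
    would still follow the head 1:1 (absolute mapping)."""
    speed = abs(head_delta_deg) / dt_s if dt_s > 0 else 0.0
    return cursor_angle_deg + cursor_gain(speed) * head_delta_deg
```

In this sketch, reduced gain at low speeds offers finer cursor control for small targets, while the gain returning to 1.0 at high speeds prevents the cursor from drifting far from the view centre during large head rotations.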