We describe a facial feature tracker based on the combined range and amplitude data provided by a 3D Time-of-Flight camera. We use this tracker to implement a head mouse, an alternative input device for people who have limited use of their hands. The tracker is based on geometric features related to the intrinsic dimensionality of multidimensional signals. We show how the position of the nose in the image can be determined robustly using a very simple bounding-box classifier, trained on a set of labelled sample images. Despite its simplicity, the classifier generalises well to subjects it was not trained on. An important result is that combining range and amplitude data dramatically improves robustness over using either type of data alone. The tracker runs in real time at around 30 frames per second. We demonstrate its potential as an input device by using it to control Dasher, an alternative text input tool.
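As a minimal sketch of the kind of bounding-box classifier described above (assuming it amounts to per-feature minimum/maximum thresholds learned from labelled positive samples; the class and feature names here are illustrative, not the paper's actual implementation):

```python
import numpy as np

class BoundingBoxClassifier:
    """Axis-aligned bounding-box classifier: a sample is accepted
    when every feature lies inside the per-feature [min, max] range
    observed over the labelled positive training samples."""

    def fit(self, X):
        # X: (n_samples, n_features) array of positive examples
        X = np.asarray(X, dtype=float)
        self.lo = X.min(axis=0)
        self.hi = X.max(axis=0)
        return self

    def predict(self, X):
        # True where all features fall inside the learned box
        X = np.asarray(X, dtype=float)
        return np.all((X >= self.lo) & (X <= self.hi), axis=1)

# Illustrative usage with made-up 2-D feature vectors, e.g. one
# range-derived and one amplitude-derived feature per candidate pixel
clf = BoundingBoxClassifier().fit([[0.2, 0.5], [0.4, 0.7], [0.3, 0.6]])
print(clf.predict([[0.3, 0.6], [0.9, 0.1]]))  # inside box, outside box
```

Such a classifier has essentially no capacity to overfit beyond the box bounds themselves, which is consistent with the generalisation to unseen subjects noted above.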