Abstract

This paper presents an interactive three-dimensional (3D) touch and gesture user interface that combines a light-field display with a 3D sensor. In contrast to conventional gesture interfaces, in which the image space and the gesture space are usually separated, this system allows the user to interact directly with the 3D image. Achieving this requires 3D registration between the sensor and display coordinate systems. In this study, we propose a method that automatically matches the position of the reconstructed 3D image with the position of the user's hand based on scattered-light detection. The calibration process is automatic and enables self-calibration during practical operation, without laborious manual measurement. The method lets us use the gesture-detection capabilities of the Leap Motion controller to build an interface in which the content and gesture positions coincide. We show examples of 3D touch and gesture interactions on a light-field display in which the gesture position and the generated content match.
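The abstract does not detail the registration math, but the core step it describes, aligning the sensor and display coordinate systems, is commonly solved as a rigid-body fit. The sketch below is an illustrative assumption, not the paper's method: it estimates a rotation and translation from point correspondences (e.g., calibration points detected in sensor coordinates with known display coordinates) using the Kabsch/Procrustes algorithm.

```python
import numpy as np

def register_rigid(sensor_pts, display_pts):
    """Estimate R and t such that display ≈ R @ sensor + t
    (Kabsch/Procrustes rigid registration from correspondences)."""
    cs = sensor_pts.mean(axis=0)
    cd = display_pts.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (sensor_pts - cs).T @ (display_pts - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t

# Hypothetical calibration data: synthetic points in sensor
# coordinates and their images under a known display transform.
rng = np.random.default_rng(0)
sensor = rng.uniform(-50, 50, size=(10, 3))
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -20.0, 100.0])
display = sensor @ R_true.T + t_true

R, t = register_rigid(sensor, display)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

Once `R` and `t` are known, every hand position reported by the sensor can be mapped into display coordinates, so a fingertip can be tested directly against the reconstructed 3D content.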
