Abstract
Identifying and localizing the user's visual attention can enable various intelligent service computing paradigms in a mobile environment. However, existing solutions can only compute the gaze direction, not the distance to the intended target. In addition, most of them rely on eye trackers or similar infrastructure support. This paper explores the possibility of using portable mobile devices, e.g., smartphones, to detect a user's visual attention. Our system, i-VALS, only requires the user to perform one simple action to localize the intended object: gazing at the object while holding up the smartphone so that the object and the user's face are simultaneously captured by the rear and front cameras, respectively. We develop efficient algorithms to obtain the distance between the camera and the user, the user's gaze direction, and the object's direction from the camera. The object's location can then be computed by solving a trigonometric problem. i-VALS has been prototyped on commercial off-the-shelf (COTS) devices. Extensive experimental results show that i-VALS achieves high accuracy and low latency, effectively supporting a large variety of applications in smart environments.
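The final localization step described above, combining the camera-to-user distance, the gaze direction, and the object's direction from the camera, can be viewed as intersecting two rays in a plane: the rear camera's line of sight toward the object and the user's gaze ray originating at the eye. A minimal 2-D sketch of that geometry (this is an illustrative reconstruction, not the paper's algorithm; all coordinates, angles, and function names here are hypothetical):

```python
import math

def unit(x, y):
    """Normalize a 2-D vector to unit length."""
    n = math.hypot(x, y)
    return (x / n, y / n)

def intersect_rays(origin_a, dir_a, origin_b, dir_b):
    """Intersect two 2-D rays given as origin + t * direction.

    Solves origin_a + t*dir_a = origin_b + s*dir_b for t via Cramer's rule;
    returns the intersection point, or None if the rays are (near) parallel.
    """
    ax, ay = origin_a
    bx, by = origin_b
    dax, day = dir_a
    dbx, dby = dir_b
    det = dax * (-dby) - day * (-dbx)
    if abs(det) < 1e-9:
        return None  # lines of sight never cross: no unique solution
    rx, ry = bx - ax, by - ay
    t = (rx * (-dby) - ry * (-dbx)) / det
    return (ax + t * dax, ay + t * day)

# Hypothetical setup: rear camera at the origin; the front camera places the
# user's eye 0.4 m behind it. Both the object's bearing (from the rear camera
# image) and the gaze direction (from the face image) here happen to point at
# the same spot, (1, 2), so the intersection recovers the object's location.
camera = (0.0, 0.0)
eye = (0.0, -0.4)              # from the measured camera-to-user distance
object_bearing = unit(1.0, 2.0)   # object's direction seen by the rear camera
gaze_direction = unit(1.0, 2.4)   # gaze direction estimated from the face

location = intersect_rays(camera, object_bearing, eye, gaze_direction)
print(location)  # approximately (1.0, 2.0)
```

The full system works in 3-D, where the two rays generally do not meet exactly; a practical implementation would instead take the midpoint of the shortest segment between them, but the planar case above captures the triangulation idea.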