Abstract

The detection of face-screen distance on a smartphone (i.e., the distance between the user's face and the smartphone screen) is of paramount importance for many mobile applications, including dynamic adjustment of screen on/off state, screen resolution, screen luminance, and font size, for purposes such as power saving and protection of the user's eyesight. Existing detection techniques for face-screen distance depend on external or internal hardware, e.g., an accessory plug-in sensor (such as an infrared or ultrasonic sensor) to measure the face-screen distance, or a built-in proximity sensor that usually outputs only a coarse-grained, two-valued proximity index (intended for powering the screen on/off). In this paper, we present a fine-grained detection method, called Look Into My Eyes (LIME), that utilizes the front camera and inertial accelerometer of the smartphone to estimate the face-screen distance. Specifically, LIME captures a photo of the user's face only when the accelerometer detects certain motion patterns of the phone, and then estimates the face-screen distance from the distance between the user's eyes in the image. In addition, LIME preserves the user experience when multiple users are facing the phone screen. The experimental results show that LIME achieves a mean squared error smaller than 2.4 cm in all experimented scenarios, and it incurs only a small battery-life cost when integrated into an SMS application that dynamically adjusts font size based on the detected face-screen distance.
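To illustrate the core geometric idea, here is a minimal sketch of distance estimation from eye spacing using the standard pinhole camera model. The constants below (average inter-pupillary distance, front-camera focal length in pixels) are illustrative assumptions, not values taken from the paper, and a real implementation would calibrate them per device and per user.

```python
# Sketch of the distance-from-eye-spacing idea behind LIME, using the
# pinhole camera model: object_distance = real_size * focal_length / pixel_size.
# All constants are assumptions for illustration, not the paper's values.

AVG_IPD_CM = 6.3          # assumed average adult inter-pupillary distance (cm)
FOCAL_LENGTH_PX = 1400.0  # assumed front-camera focal length, in pixels

def face_screen_distance_cm(eye_distance_px: float) -> float:
    """Estimate face-screen distance from the pixel distance between the
    user's eyes detected in a front-camera frame."""
    if eye_distance_px <= 0:
        raise ValueError("eye distance in pixels must be positive")
    return AVG_IPD_CM * FOCAL_LENGTH_PX / eye_distance_px

# Example: eyes detected 350 px apart -> roughly 25 cm from the screen.
if __name__ == "__main__":
    print(f"{face_screen_distance_cm(350):.1f} cm")
```

Under this model, the estimated distance shrinks as the eyes appear farther apart in the image, which is why a single face photo suffices once the camera's focal length and the user's inter-pupillary distance are known or assumed.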
