Abstract

Character input on smartphones has long posed problems: erroneous input caused by the small size of each button, and a reduced information presentation area because the input screen occupies a large part of the display. To address these issues, our previous research designed a multi-choice input method with a low degree of freedom that could enter up to sixty-four characters by flicking four buttons in two directions (up and down) and repeating the operation twice. Input was possible not only by touch but also by gaze and gestures, and we proposed both a four-alternative selection and a method that divides these alternatives into two and completes input in two steps. In this research, the aim is to develop a multi-choice input interface with a low degree of freedom in which four actions are produced by moving each of the two corners of the device closer to or farther from the user's body, and eight actions are obtained by tapping the back of the device before performing these movements. The smartphone's three-axis acceleration sensor was used to identify each movement. Analysis of the four movements suggested that whether the device is approaching or moving away can be determined from the z-axis or y-axis of the acceleration sensor, and whether it is tilted to the right or left (as seen from the camera) can be determined from the phase relationship between the x-axis and y-axis. With this approach, three of the movements were classified with 100% accuracy and the remaining movement with 95% accuracy.
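The abstract does not give concrete thresholds or signal-processing details, but the stated decision rule (approach vs. retreat from the z-axis or y-axis, right vs. left tilt from the x–y phase relationship) can be illustrated with a minimal sketch. The function name, labels, threshold value, sign conventions, and the use of a correlation sign as a stand-in for "phase" are all assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# Hypothetical labels for the four corner movements described in the abstract.
APPROACH_RIGHT = "approach_right_corner"
APPROACH_LEFT = "approach_left_corner"
RETREAT_RIGHT = "retreat_right_corner"
RETREAT_LEFT = "retreat_left_corner"


def classify_movement(ax, ay, az, threshold=1.0):
    """Classify one movement window from three-axis accelerometer samples.

    ax, ay, az are 1-D arrays of acceleration samples (gravity removed)
    covering a single movement. The threshold and sign conventions are
    illustrative assumptions, not values reported in the paper.
    """
    ax, ay, az = map(np.asarray, (ax, ay, az))

    # No clear movement: reject windows whose peak acceleration is too small.
    if max(np.max(np.abs(az)), np.max(np.abs(ay))) < threshold:
        return None

    # Approach vs. retreat: the abstract reports that the z-axis (or y-axis)
    # distinguishes moving a corner toward or away from the body. Here the
    # sign of the dominant z-axis peak is used as a stand-in.
    peak_idx = int(np.argmax(np.abs(az)))
    approaching = az[peak_idx] > 0  # assumed sign convention

    # Right vs. left tilt: the abstract attributes this to the phase of the
    # x- and y-axis signals. A simple proxy is the sign of their correlation:
    # in phase -> one corner, out of phase -> the other.
    phase = float(np.dot(ax - ax.mean(), ay - ay.mean()))
    tilt_right = phase > 0  # assumed mapping

    if approaching:
        return APPROACH_RIGHT if tilt_right else APPROACH_LEFT
    return RETREAT_RIGHT if tilt_right else RETREAT_LEFT
```

Gating these four classes with a back-tap detector would yield the eight actions mentioned above; the paper's actual windowing and decision criteria may differ from this sketch.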
