Abstract

In this paper, we investigate the possibilities for augmenting interaction around the mobile device, with the aim of enabling input techniques that do not rely on typical touch-based gestures. The presented research focuses on utilizing the built-in magnetic field sensor, whose readings are intentionally affected by moving a strong permanent magnet around a smartphone. Different approaches for supporting magnet-based Around-Device Interaction are applied, including magnetic field fingerprinting, curve-fitting modeling, and machine learning. We implemented corresponding proof-of-concept applications that incorporate magnet-based interaction: text entry is achieved by discrete positioning of the magnet within a keyboard mockup, and free-move pointing is enabled by monitoring the magnet’s continuous movement in real time. These solutions successfully expand both the interaction language and the interaction space in front of the device without altering its hardware or involving sophisticated peripherals. A controlled experiment was initially conducted to evaluate the proposed text entry method. The obtained results were promising (a text entry speed of nine words per minute) and served as motivation for implementing new interaction modalities. The use of neural networks proved to be a better approach than curve fitting for supporting free-move pointing. We demonstrate how neural networks with a very small number of input parameters can provide highly usable pointing with an acceptable level of error (a mean absolute error of 3 mm for the pointer position on the smartphone display).
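
The abstract reports only the outcome (a mean absolute error of about 3 mm), not the implementation, so the following is a minimal sketch of the general idea: regressing on-screen pointer coordinates from a small number of magnetic field readings with a compact neural network. The library (scikit-learn), network size, and synthetic calibration data are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' implementation): mapping 3-axis magnetometer
# readings to an on-screen pointer position with a small neural network.
# Library choice, architecture, and the synthetic data below are assumptions;
# the paper only states that NNs with few inputs reached ~3 mm mean absolute error.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Synthetic stand-in for a calibration session: each sample pairs a raw
# magnetometer readout (Bx, By, Bz, in microtesla) with the known pointer
# position (x, y, in millimetres) on the display.
n_samples = 2000
B = rng.uniform(-80.0, 80.0, size=(n_samples, 3))           # fake field readings
xy = np.column_stack([                                        # fake ground truth
    30.0 + 0.4 * B[:, 0] - 0.1 * B[:, 2] + rng.normal(0, 1.0, n_samples),
    60.0 - 0.3 * B[:, 1] + 0.2 * B[:, 2] + rng.normal(0, 1.0, n_samples),
])

B_train, B_test, xy_train, xy_test = train_test_split(B, xy, random_state=0)

# A deliberately small network: 3 inputs -> one hidden layer -> 2 outputs (x, y).
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(B_train, xy_train)

pred = model.predict(B_test)
print("MAE per axis (mm):",
      mean_absolute_error(xy_test, pred, multioutput="raw_values"))
```

In practice, the training pairs would come from a calibration procedure in which the magnet is held at known positions while the smartphone logs its magnetic field sensor, in the spirit of the fingerprinting approach mentioned above.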

Highlights

  • Direct touch and touch gestures still represent the predominant interaction modalities for contemporary mobile devices

  • Each supplementary video file corresponds to a specific part of the research: magnet-based text entry, free-move pointing along a straight line, and free-move pointing based on the neural network (NN) approach

  • We demonstrated how the interaction space around the mobile device could be successfully augmented by intentional distortion of the ambient magnetic field

Introduction

Direct touch (tapping) and touch gestures (dragging and swiping) still represent the predominant interaction modalities for contemporary mobile devices. Touch-based interaction, however, comes with specific drawbacks due to the limited form factor of touchscreen devices and the effects of the well-known fat finger syndrome. Wearable devices, such as smartwatches, have even more limited input capabilities, so the suitability of touch-based interaction can be questioned in that context as well. An accelerometer can be used to detect smartphone tilting, and the corresponding movements (pitch, roll, and yaw) can be interpreted as input commands. This means that specific tasks, such as adjusting the screen orientation, navigating the user interface, or controlling a mobile game, can be performed without touching the device display.
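
To make the tilt-as-input idea concrete, the sketch below derives pitch and roll angles from a single accelerometer sample and maps sufficiently large tilts to discrete commands. The axis convention, threshold, and command names are illustrative assumptions; yaw is omitted because it cannot be recovered from the accelerometer alone.

```python
# Minimal sketch of tilt-as-input: deriving pitch and roll from a raw
# accelerometer reading and mapping large tilts to coarse commands.
# Axis conventions and thresholds are illustrative, not taken from the paper.
import math

def tilt_angles(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Return (pitch, roll) in degrees from an accelerometer sample in m/s^2."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def tilt_command(pitch: float, roll: float, threshold: float = 20.0) -> str:
    """Interpret sufficiently large tilts as discrete navigation commands."""
    if roll > threshold:
        return "scroll_right"
    if roll < -threshold:
        return "scroll_left"
    if pitch > threshold:
        return "scroll_up"
    if pitch < -threshold:
        return "scroll_down"
    return "idle"

# Example: device tilted to the right (gravity mostly along the +y and +z axes).
p, r = tilt_angles(ax=0.0, ay=4.9, az=8.5)
print(tilt_command(p, r))   # -> "scroll_right" under the assumed convention
```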
