Human-machine interfaces have evolved, benefiting from growing access to devices with powerful embedded signal-processing capabilities and from new sensors that allow the estimation of movements and gestures, resulting in increasingly intuitive interfaces. In this context, sensor fusion for estimating the spatial orientation of body segments enables more robust solutions, overcoming specific disadvantages of isolated sensors, such as the sensitivity of magnetic-field sensors to external interference in uncontrolled environments. In this work, a method that combines image-processing data with angular-velocity measurements from a 3D MEMS gyroscope through a discrete-time Kalman filter is proposed and deployed as an alternative user interface for mobile devices, in which an on-screen pointer is controlled with head movements. Results on the general performance of the method are presented, along with a comparative analysis, under a dedicated test application, against a previous version of this system in which the relative-orientation information was acquired directly from MEMS sensors (3D magnetometer-accelerometer). These results show an improved response for the new version of the pointer, in both precision and response time, while preserving many of the benefits highlighted for its predecessor, yielding a complementary signal-acquisition method that can serve as an alternative input device and in accessibility solutions.
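To illustrate the predict/correct structure described above, the following minimal Python sketch fuses gyroscope angular rate (prediction step) with a vision-derived orientation angle (correction step) for a single rotation axis using a discrete-time Kalman filter. The function name, sampling rate, and noise variances are illustrative assumptions, not the authors' implementation, which operates on full 3D head orientation.

```python
import numpy as np

def kalman_fuse(gyro_rates, vision_angles, dt=0.02,
                q_var=1e-4, r_var=1e-2):
    """Illustrative single-axis fusion of gyroscope rate and
    vision-derived angle with a discrete-time Kalman filter.

    gyro_rates    : angular velocities (rad/s), one per time step
    vision_angles : angles from image processing (rad);
                    np.nan marks steps with no vision measurement
    dt            : sampling period (s), assumed value
    q_var, r_var  : process / measurement noise variances, assumed values
    """
    angle = 0.0        # state estimate: orientation angle
    p = 1.0            # estimate error covariance
    estimates = []

    for omega, z in zip(gyro_rates, vision_angles):
        # Prediction: integrate the gyroscope rate over one sample period
        angle += omega * dt
        p += q_var

        # Correction: apply the vision measurement when one is available
        if not np.isnan(z):
            k = p / (p + r_var)        # Kalman gain
            angle += k * (z - angle)   # innovation-weighted update
            p *= (1.0 - k)

        estimates.append(angle)

    return np.array(estimates)


# Hypothetical usage: 1 s of simulated head yaw at 50 Hz,
# with vision measurements available at 10 Hz
rng = np.random.default_rng(0)
true_rate = 0.5                                   # rad/s constant yaw rate
gyro = true_rate + rng.normal(0, 0.05, 50)        # noisy gyro samples
vision = np.full(50, np.nan)
vision[::5] = true_rate * 0.02 * np.arange(50)[::5] + rng.normal(0, 0.02, 10)
fused = kalman_fuse(gyro, vision)
print(fused[-1])  # fused yaw estimate after 1 s (close to 0.5 rad)
```

The gyroscope drives the high-rate prediction, while the slower, drift-free vision measurements correct accumulated integration error, which is the general motivation for this kind of fusion.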