Abstract
We describe the use of transformations of Gaussian Process (GP) priors to improve the context-sensing capability of a system composed of a Kinect sensor and mobile inertial sensors. The Bayesian nonparametric model provides a principled mechanism for incorporating the low-sampling-rate position measurements and the high-sampling-rate derivatives in multi-rate sensor fusion, taking account of the uncertainty of each sensor type. The complementary properties of these sensors enable the GP model to compute the likelihood of the observed Kinect skeletons and inertial data in order to identify individual users. We conducted three experiments to test the performance of the proposed GP model: (1) subtle hand movements, (2) walking with a mobile device in the trouser pocket, and (3) walking with a mobile device held in the hand. We compared the GP approach with the direct acceleration comparison method. Experimental results show that the GP approach achieves successful matches (with mean accuracy µ ≥ 90%) in all three contexts, including when there are only subtle hand movements, where the acceleration comparison method performs poorly (µ < 20%).

Highlights
- We explore the complementary properties of the Kinect sensor and inertial sensors.
- We describe a GP model to improve context sensing in proxemic interactions.
- The model incorporates positions and accelerations in multi-rate sensor fusion.
- We identify users by matching Kinect skeletons with the sensed data from devices.
- User matching and identification through people's everyday movements is feasible.
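To make the multi-rate fusion idea concrete, the sketch below (not the authors' code; kernel choice, hyperparameters, and all names are illustrative assumptions) places one GP prior over a position trajectory and treats accelerometer samples as noisy observations of its second derivative. Low-rate Kinect positions and high-rate accelerations then share a single joint Gaussian likelihood, which can be used to score how well a given device's inertial data explains a given skeleton track.

```python
# Minimal sketch of GP fusion of low-rate positions and high-rate accelerations
# (one spatial axis). Squared-exponential kernel and its derivatives give the
# cross-covariances between f(t) and f''(t). Hyperparameters are illustrative.
import numpy as np

def se_kernel_blocks(tp, ta, sigma_f=1.0, ell=0.3):
    """Covariance blocks for position obs at times tp and acceleration obs at times ta."""
    def k(r):     # cov(f(t), f(t')) = sigma_f^2 exp(-r^2 / (2 ell^2)), r = t - t'
        return sigma_f**2 * np.exp(-r**2 / (2 * ell**2))
    def k_pa(r):  # cov(f(t), f''(t')) = d^2 k / dr^2
        return k(r) * (r**2 / ell**4 - 1.0 / ell**2)
    def k_aa(r):  # cov(f''(t), f''(t')) = d^4 k / dr^4
        return k(r) * (3.0 / ell**4 - 6.0 * r**2 / ell**6 + r**4 / ell**8)
    Rpp = tp[:, None] - tp[None, :]
    Rpa = tp[:, None] - ta[None, :]
    Raa = ta[:, None] - ta[None, :]
    return k(Rpp), k_pa(Rpa), k_aa(Raa)

def log_marginal_likelihood(tp, yp, ta, ya, noise_p=0.01, noise_a=0.5):
    """Joint log-likelihood of positions yp and accelerations ya under one GP.
    A higher value indicates a better skeleton/device match."""
    Kpp, Kpa, Kaa = se_kernel_blocks(tp, ta)
    K = np.block([[Kpp + noise_p**2 * np.eye(len(tp)), Kpa],
                  [Kpa.T, Kaa + noise_a**2 * np.eye(len(ta))]])
    y = np.concatenate([yp, ya])
    L = np.linalg.cholesky(K + 1e-9 * np.eye(len(y)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.sum(np.log(np.diag(L))) - 0.5 * len(y) * np.log(2 * np.pi)

# Toy usage: low-rate Kinect-like positions, high-rate noisy accelerations of the
# same synthetic motion; the returned score would be compared across candidate pairs.
tp = np.linspace(0, 2, 15)                 # ~7.5 Hz position samples
ta = np.linspace(0, 2, 200)                # ~100 Hz accelerometer samples
yp = np.sin(2 * np.pi * tp)                # synthetic position track
ya = -(2 * np.pi)**2 * np.sin(2 * np.pi * ta) + 0.3 * np.random.randn(len(ta))
print(log_marginal_likelihood(tp, yp, ta, ya))
```

Under this kind of model, identifying which user holds which device amounts to evaluating the joint likelihood for every candidate skeleton/device pairing and choosing the highest-scoring assignment; the noise terms let the less certain sensor stream carry less weight.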