Abstract
Wearable devices and smart sensors are increasingly adopted to monitor the behavior of human and artificial agents. Many applications rely on the capability of such devices to recognize the daily-life activities performed by monitored users in order to tailor their behavior to the occurring situations. Despite the constant evolution of smart sensing technologies and extensive research in this field, accurate recognition of in-the-wild situations remains an open research challenge. This work proposes a novel approach for situation identification capable of recognizing activities, and the situations in which they occur, across different environments and behavioral contexts by processing data acquired from wearable and environmental sensors.

An architecture for a situation-aware wearable computing system is proposed, inspired by Endsley’s situation-awareness model and consisting of a two-step approach to situation identification. The approach first identifies daily-life activities via a learning-based technique. Simultaneously, the context in which the activities are performed is recognized using Context Space Theory. Finally, the fusion of the context state and the recognized activities identifies the complex situations in which the user is acting. This knowledge of situations forms the basis on which novel, smarter applications can be realized.

The approach has been evaluated on the public ExtraSensory dataset and compared with state-of-the-art techniques, achieving an accuracy of 96% in situation recognition with low computational time, demonstrating the efficacy of the two-step situation identification approach.
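The two-step pipeline described above can be sketched in outline as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature names, thresholds, and situation labels are all hypothetical, and the learning-based activity stage is reduced to simple rules for readability.

```python
# Hypothetical sketch of a two-step situation identification pipeline:
# (1) recognize the activity from wearable features, (2) recognize the
# context state from environmental readings, (3) fuse both into a situation.
# All names and thresholds below are illustrative assumptions.

def recognize_activity(accel_magnitude: float, step_rate: float) -> str:
    """Step 1: stand-in for the learning-based activity recognizer.

    In the paper this stage is a trained classifier over wearable-sensor
    features; here it is reduced to threshold rules for illustration.
    """
    if step_rate > 1.5:          # steps per second: assumed threshold
        return "walking"
    if accel_magnitude > 2.0:    # high motion energy: assumed threshold
        return "exercising"
    return "sitting"


def recognize_context(sensor_state: dict) -> str:
    """Step 2: stand-in for Context Space Theory reasoning.

    A context state is a point in a space of context attributes; this
    sketch discretizes two attributes (GPS fix, ambient noise) into a
    coarse context label.
    """
    indoors = not sensor_state["gps_fix"]
    quiet = sensor_state["noise_db"] < 40
    if indoors and quiet:
        return "home"
    if indoors:
        return "office"
    return "outdoors"


def identify_situation(activity: str, context: str) -> str:
    """Step 3: fuse the activity and the context state into a situation."""
    return f"{activity}@{context}"


# Example: low motion, indoors, quiet environment.
situation = identify_situation(
    recognize_activity(accel_magnitude=0.5, step_rate=0.2),
    recognize_context({"gps_fix": False, "noise_db": 30}),
)
print(situation)  # -> sitting@home
```

The fusion step is deliberately trivial here; the point is the separation of concerns the abstract describes, with activity and context recognized in parallel and combined only at the end.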