Abstract

Multi-modal context-aware systems can provide user-adaptive services, but they require complex recognition models and substantial computational resources. The difficulty of building optimal models and inferring context efficiently makes it hard to develop practical context-aware systems. We developed a multi-modal context-aware system with various wearable sensors, including accelerometers, gyroscopes, physiological sensors, and data gloves. The system uses probabilistic models to handle the uncertain and noisy time-series sensor data. To construct efficient probabilistic models, this paper applies an evolutionary algorithm to learn the model structure and the EM algorithm to estimate the parameters. The trained models are selectively inferred based on a semantic network that describes the semantic relations between contexts and sensors. Experiments with real collected data show the usefulness of the proposed method.
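As a rough illustration of the selective-inference idea summarized above, the Python sketch below uses a toy semantic network that links each context to the sensors it depends on, and runs only those probabilistic models whose sensors are currently active. All names (the example contexts, sensor labels, and the `log_likelihood` model interface) are assumptions for illustration, not code or identifiers from the paper.

```python
# Hypothetical sketch: selective inference driven by a semantic network.
# Each context is linked to the sensors it semantically depends on; only
# models whose required sensors are active are evaluated.

SEMANTIC_NETWORK = {
    "walking":   {"accelerometer", "gyroscope"},
    "stressed":  {"physiological"},
    "gesturing": {"data_glove", "accelerometer"},
}

def select_models(active_sensors):
    """Return contexts whose required sensors are all currently available."""
    return [ctx for ctx, sensors in SEMANTIC_NETWORK.items()
            if sensors <= active_sensors]

def infer_context(active_sensors, models, observations):
    """Evaluate only the selected probabilistic models (e.g. trained HMMs/DBNs)
    on the observed time-series window and return the best-scoring context."""
    best_ctx, best_score = None, float("-inf")
    for ctx in select_models(active_sensors):
        score = models[ctx].log_likelihood(observations)  # assumed model API
        if score > best_score:
            best_ctx, best_score = ctx, score
    return best_ctx
```

In this sketch, restricting inference to the contexts reachable from the active sensors is what keeps the runtime cost low when only a subset of the wearable sensors is worn or reporting data.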
