Abstract

This paper investigates the fusion of wearable and ambient sensors for recognizing activities of daily living in a smart home setting using an ontology-based approach. The proposed approach exploits the advantages of both types of sensing to resolve uncertainties caused by missing sensor data. The resulting system can infer activities that cannot be inferred from a single type of sensing alone. The methodology for ontologically modeling the wearable and ambient sensors, fusing the contexts captured from them, and inferring the corresponding activities is investigated and described. The proposed system is compared with a system that uses ambient sensors without wearable sensors on both an internally collected dataset and a publicly available dataset. The experimental results show that the proposed system is more robust in handling uncertainties. It is also more capable of inferring additional information about activities, which is not possible with environmental sensing alone, achieving overall recognition accuracies of 91.5% and 90% on the internal and public datasets, respectively.

