Abstract

We discuss in this work the opportunity of employing lifelogging devices, applications, and systems, such as systems that collect, process, and store video captured with mobile and wearable cameras, to run queries about objects and concepts of interest from everyday life. The outcome is an instance of opportunistic mobile crowdsensing, which we implement with lifelogging technology, mobile video cameras, and camera glasses. We describe the implementation of our concept, which builds on top of Life-Tags, a wearable system for abstracting life in the form of clouds of concepts automatically extracted from videos captured by lifeloggers. We show how Life-Tags can be extended with a mobile application and cloud-based services, the Firebase Realtime Database and Cloud Storage, toward integrated lifelogging and mobile crowdsensing, where the life tags of mobile and wearable users are queried for potential matches regarding specific objects and concepts of interest. We conclude with implications for the future integration of lifelogging technology, mobile and wearable computing, and crowdsensing.
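As a rough illustration of the querying step described above, the sketch below matches a requested concept against the tag clouds of several lifeloggers. All names here (`match_concept`, the user identifiers, the sample tags) are hypothetical; in the system described in the abstract, the clouds of concepts would be stored in the Firebase Realtime Database rather than held in memory.

```python
def match_concept(query, tag_clouds):
    """Return the ids of users whose tag cloud contains the queried concept.

    tag_clouds maps a user id to the set of concepts (life tags)
    abstracted from that user's lifelog videos.
    """
    query = query.lower()
    return [user for user, tags in tag_clouds.items()
            if query in (tag.lower() for tag in tags)]

# Illustrative tag clouds for two hypothetical lifeloggers
clouds = {
    "user-1": {"bicycle", "street", "coffee"},
    "user-2": {"laptop", "coffee", "book"},
}

print(match_concept("coffee", clouds))   # users whose clouds contain "coffee"
print(match_concept("bicycle", clouds))  # users whose clouds contain "bicycle"
```

A deployed version would run such queries server-side against the shared tag store, returning only the matching users to the requesting mobile application.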
