Abstract

With the rise of portable wearable devices, users can easily record lifelog data. Because lifelogs are usually disorganized, multi-modal, and sometimes noisy, an interactive search engine is crucial for users to review and explore them. Unlike traditional search, lifelog search involves multi-modal information (images, text, and other sensor data), which complicates both data organization and retrieval. Users' information needs are likewise multi-level, so a single interaction mechanism may not satisfy all of them. Moreover, because the dataset is highly personalized, the search engine should incorporate user interaction and feedback. In this paper, we therefore present an interactive multi-modal lifelog search engine that helps users manage and retrieve their lifelog data. To this end, lifelog data is first clustered and processed at multiple levels. We then build an interactive search engine with text-as-query, image-as-query, and timeline-view modules. The system also supports user feedback mechanisms over multi-round queries. Our system shows promising experimental results on the LSC'20 dataset and its development topics: the text-based search module returns correct results on more than 60% of the LSC'20 development topics.
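To make the multi-modal retrieval and feedback loop concrete, the following is a minimal sketch, not the authors' implementation: it assumes lifelog moments and queries are represented as embedding vectors (here stand-in random vectors; a real system would use pretrained text and image encoders), fuses text-query and image-query scores by a weighted sum, and updates the query across rounds with Rocchio-style relevance feedback. All names, weights, and the fusion/feedback scheme are illustrative assumptions.

```python
# Hedged sketch of multi-modal lifelog search with relevance feedback.
# Embeddings are random stand-ins; real ones would come from text/image encoders.
import numpy as np

rng = np.random.default_rng(0)
DIM = 128  # hypothetical embedding dimensionality

# Hypothetical lifelog index: one embedding per captured image "moment".
index = rng.normal(size=(1000, DIM))
index /= np.linalg.norm(index, axis=1, keepdims=True)

def cosine_scores(query: np.ndarray, items: np.ndarray) -> np.ndarray:
    """Cosine similarity of one query vector against every indexed item."""
    q = query / np.linalg.norm(query)
    return items @ q

def fuse(text_q, image_q, items, alpha=0.5):
    """Late fusion of text-as-query and image-as-query scores (alpha is a guess)."""
    return alpha * cosine_scores(text_q, items) + (1 - alpha) * cosine_scores(image_q, items)

def rocchio(query, relevant, nonrelevant, a=1.0, b=0.75, c=0.15):
    """Rocchio-style feedback: pull the query toward results the user marked relevant."""
    q = a * query
    if len(relevant):
        q += b * relevant.mean(axis=0)
    if len(nonrelevant):
        q -= c * nonrelevant.mean(axis=0)
    return q

# One feedback round: search, take (simulated) user judgments, re-query.
text_q = rng.normal(size=DIM)
image_q = rng.normal(size=DIM)
top = np.argsort(fuse(text_q, image_q, index))[::-1][:10]
relevant = index[top[:3]]      # pretend the user marked the first 3 hits relevant
nonrelevant = index[top[3:]]   # and the remaining 7 non-relevant
text_q = rocchio(text_q, relevant, nonrelevant)
print("top-5 after feedback:", np.argsort(fuse(text_q, image_q, index))[::-1][:5])
```

Late fusion with a fixed weight and Rocchio feedback are standard baselines chosen here only for clarity; the paper's actual modules may differ.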
