Abstract

Some content in multimedia resources can depict or evoke certain emotions in users. The aim of Emotional Information Retrieval (EmIR) and of our research is to identify knowledge about emotion-laden documents and to use these findings in a new kind of World Wide Web information service that allows users to search and browse by emotion. Our prototype, called Media EMOtion SEarch (MEMOSE), is largely based on the results of research regarding emotive music pieces, images and videos. In order to index both evoked and depicted emotions in these three media types and to make them searchable, we work with a controlled vocabulary, slide controls to adjust the emotions’ intensities, and broad folksonomies to identify and separate the correct resource-specific emotions. This separation of so-called power tags is based on a tag distribution which follows either an inverse power law (only one emotion was recognized) or an inverse-logistic shape (two or three emotions were recognized). Both distributions are well known in information science. MEMOSE consists of a tool for tagging basic emotions with the help of slide controls, a processing device to separate power tags, and a retrieval component consisting of a search interface (for any topic in combination with one or more emotions) and a results screen. The latter shows two separately ranked lists of items for each media type (depicted and felt emotions), displaying thumbnails of resources ranked by the mean values of intensity. In the evaluation of the MEMOSE prototype, study participants described our EmIR system as an enjoyable Web 2.0 service.
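The power-tag separation and ranking described above can be pictured with a small sketch. The snippet below is a minimal illustration, not MEMOSE's actual implementation: it assumes per-resource emotion tag counts from a broad folksonomy and slider intensities on a 0–100 scale, and it uses simple, hypothetical thresholds to decide whether the tag distribution looks power-law-like (one dominant emotion) or inverse-logistic-like (two or three emotions) before ranking resources by mean intensity.

```python
# Hypothetical sketch of power-tag separation and intensity ranking.
# Thresholds and data structures are illustrative assumptions, not the
# parameters used by MEMOSE.
from statistics import mean

def power_tags(tag_counts, dominance=0.8, plateau=0.5):
    """Pick the resource-specific power tags from a broad folksonomy.

    tag_counts: dict mapping emotion -> number of taggers who chose it.
    If one emotion clearly dominates (power-law-like distribution), only
    that emotion is kept; otherwise the top two or three emotions above
    a plateau cut-off are kept (inverse-logistic-like distribution).
    """
    ranked = sorted(tag_counts.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(count for _, count in ranked)
    top_tag, top_count = ranked[0]
    if top_count / total >= dominance:       # power-law shape: one emotion
        return [top_tag]
    plateau_cut = plateau * top_count         # inverse-logistic shape: 2-3 emotions
    return [tag for tag, count in ranked[:3] if count >= plateau_cut]

def rank_by_mean_intensity(resources, emotion):
    """Rank resources for one emotion by the mean slider intensity."""
    scored = [
        (resource["id"], mean(resource["intensities"][emotion]))
        for resource in resources
        if emotion in power_tags(resource["tag_counts"])
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Usage: two tagged images queried for the emotion "joy".
resources = [
    {"id": "img-1", "tag_counts": {"joy": 18, "sadness": 2},
     "intensities": {"joy": [80, 90, 70]}},
    {"id": "img-2", "tag_counts": {"joy": 9, "surprise": 8, "fear": 2},
     "intensities": {"joy": [60, 55], "surprise": [75, 70]}},
]
print(rank_by_mean_intensity(resources, "joy"))
```

In this toy example, the first image is dominated by a single emotion and keeps only "joy" as its power tag, while the second shows the plateau typical of an inverse-logistic distribution and keeps both "joy" and "surprise"; the result list is then ordered by the mean slider intensity for the queried emotion.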

Highlights

  • Certain content in documents may represent or provoke emotions in their users

  • Such documents can take the form of music, images and movies as well as text

  • The aim of Emotional Information Retrieval (EmIR) is to identify these emotions, to make them searchable, and to utilize them for retrieval in a variety of areas

  • To understand what EmIR and emotional indexing mean, and what problems they entail, one has to start with the basics of information retrieval


Summary

Introduction

Certain content in documents may represent or provoke emotions in their users; such documents are called “emotive”. In concept-based retrieval, users work with text-based terms such as keywords; in principle, these terms could even be extracted automatically from the content. A central problem of concept-based image, music and video retrieval is indexer inconsistency. We do not yet have any experience with the intellectual indexing of emotions. Such an undertaking on the web also faces a practical problem: it is impossible for professional indexers to describe the millions of images, videos and pieces of music available online, most of which are user-generated. Our survey of the research ends with a rather negative result: at present, neither content- nor concept-based retrieval provides useful results for the emotional retrieval of images, videos and music on the internet.

Emotional Basics
Emotional Multimedia Indexing and Retrieval
Emotional Image Indexing and Retrieval
Emotional Music Indexing and Retrieval
Emotional Video Indexing and Retrieval
Slide Control Emotional Tagging
MEMOSE Indexing Sub-System
MEMOSE Retrieval Sub-System
Evaluation
Conclusions
