Abstract

We propose an intuitive image-to-music retrieval (IMR) framework to improve the user experience on music streaming platforms. The proposed method extracts mood and theme tags by searching a pre-built database for images similar to a query image and then retrieves music with matching tag information. We investigated the system's effectiveness by comparing satisfaction, intention to use, and valence between participants who interacted with the system and those who did not. We also examined whether using mood or theme attributes affected the user-perceived suitability of the retrieved music. Results showed that all three variables were significantly higher in the interaction group than in the non-interaction group and that perceived suitability of the music did not differ between the mood and theme attributes. We conclude that image attributes are effective for music retrieval and that interaction is a crucial factor in designing IMR systems.
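To make the two-stage pipeline concrete, the following is a minimal Python sketch of the tag-mediated retrieval idea, not the authors' implementation: it assumes a pre-built database of image embeddings with associated mood/theme tags and a music catalog annotated with the same tag vocabulary, and all names and toy data (embed_image, db_embeddings, music_catalog) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Pre-built image database: embeddings plus mood/theme tags (toy data).
db_embeddings = rng.normal(size=(5, 8))
db_tags = [{"calm", "nature"}, {"happy", "party"}, {"sad"},
           {"energetic", "sport"}, {"calm", "romantic"}]

# Music catalog annotated with the same tag vocabulary (toy data).
music_catalog = {"song_a": {"calm", "romantic"},
                 "song_b": {"happy", "party"},
                 "song_c": {"sad", "rainy"}}

def embed_image(image):
    # Placeholder for a real image-embedding model (e.g. a CNN backbone).
    return rng.normal(size=8)

def retrieve_music(query_image, k=2):
    q = embed_image(query_image)
    # Stage 1: cosine similarity between the query and the image database.
    sims = db_embeddings @ q / (
        np.linalg.norm(db_embeddings, axis=1) * np.linalg.norm(q) + 1e-9)
    top_k = np.argsort(sims)[::-1][:k]
    # Pool the mood/theme tags of the k most similar database images.
    query_tags = set().union(*(db_tags[i] for i in top_k))
    # Stage 2: rank songs by tag overlap with the pooled query tags.
    ranking = sorted(music_catalog.items(),
                     key=lambda item: len(item[1] & query_tags),
                     reverse=True)
    return query_tags, ranking

tags, ranking = retrieve_music(query_image=None)
print("query tags:", tags)
print("ranking:", ranking)

In a deployed system the toy arrays would be replaced by learned embeddings and an approximate nearest-neighbor index, but the structure (image similarity search, then tag matching) follows the framework described above.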
