Abstract

Impression-based music retrieval helps users find musical pieces that suit their preferences, feelings, or mental states in a large music database. Users input their impressions by selecting one or more pairs of impression words from among the pairs presented by the system and rating each selected pair on a seven-step scale. For instance, if they want to locate musical pieces that create a happy impression, they check the radio button ‘‘Happy’’ on the impression scale: Very happy–Happy–A little happy–Neutral–A little sad–Sad–Very sad. In this paper, a pair of impression words with a seven-step scale is called an impression scale. The system calculates the distance between the impressions of each musical piece in a user-specified music database and the impressions input by the user, and then selects candidate musical pieces to present as retrieval results. The impressions of musical pieces are expressed numerically as vectors generated, using n-gram statistics, from the pitch, strength, and length of every tone in each piece.
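The retrieval step described above can be sketched as follows. This is an illustrative approximation, not the paper's implementation: the numeric encoding of the seven scale steps, the Euclidean distance, and the precomputed per-piece impression values are all assumptions for the sake of the example (the paper derives piece impressions from n-gram statistics over tone pitch, strength, and length, a step omitted here).

```python
import math

# Assumed encoding of the seven-step scale as integers +3 .. -3.
SCALE = {
    "Very happy": 3, "Happy": 2, "A little happy": 1, "Neutral": 0,
    "A little sad": -1, "Sad": -2, "Very sad": -3,
}

def distance(user_impression, piece_impression):
    """Euclidean distance between impression vectors, compared only
    on the impression scales the user actually selected."""
    return math.sqrt(sum(
        (user_impression[scale] - piece_impression.get(scale, 0)) ** 2
        for scale in user_impression
    ))

def retrieve(user_impression, database, n=3):
    """Rank pieces by ascending distance to the user's input
    impressions and return the top-n titles as candidates."""
    ranked = sorted(database.items(),
                    key=lambda item: distance(user_impression, item[1]))
    return [title for title, _ in ranked[:n]]

# Example: the user checked "Happy" on the happy-sad scale.
user = {"happy-sad": SCALE["Happy"]}
# Hypothetical piece impressions, assumed precomputed from
# n-gram statistics of each piece's tones.
database = {
    "piece_a": {"happy-sad": 2.5},
    "piece_b": {"happy-sad": -1.0},
    "piece_c": {"happy-sad": 0.5},
}
print(retrieve(user, database))  # nearest (happiest-matching) pieces first
```

Because only the scales the user selected contribute to the distance, pieces are compared solely on the impressions the user cared to express.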
