Abstract
Recent advances in music signal processing demand effective music information retrieval (MIR) techniques. In this paper, we propose a query by humming (QBH) MIR system that retrieves a desired song from an enhanced perceptual feature set, context drift information (CDI), and a humming query (HQ). The proposed system extracts eight perceptual features corresponding to four perceptual properties. The CDI of these features is then analysed and estimated through a perceptual transfer entropy (PTE) measure, and the perceptual feature space and PTE trajectory of the target music database are matched against the HQ. The effectiveness of the proposed approach is substantiated by a series of experiments on a target database of 1,200 songs and 200 HQs. The target database is converted into 1,495 fragments by splitting each song into several short segments. The results show that the proposed method effectively finds the target song given an HQ as input.
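The PTE measure described above builds on transfer entropy, which quantifies how much the history of one time series (here, a perceptual feature sequence) improves prediction of another. As a minimal sketch of that underlying idea, the snippet below estimates discrete transfer entropy between two binned feature sequences; the function name and the plug-in probability estimates are illustrative assumptions, not the authors' exact PTE formulation.

```python
# Hedged sketch: plug-in estimate of transfer entropy TE(X -> Y) for two
# equal-length discrete (binned) feature sequences. This illustrates the
# generic transfer-entropy idea behind PTE; it is NOT the paper's exact
# perceptual transfer entropy definition.
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """TE(X -> Y) = sum over (y_{t+1}, y_t, x_t) of
       p(y_{t+1}, y_t, x_t) * log2( p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t) )."""
    n = len(y) - 1  # number of transitions
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))        # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))         # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                      # y_t
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n                            # p(y_{t+1}, y_t, x_t)
        p_cond_yx = c / pairs_yx[(y0, x0)]         # p(y_{t+1} | y_t, x_t)
        p_cond_y = pairs_yy[(y1, y0)] / singles[y0]  # p(y_{t+1} | y_t)
        te += p_joint * log2(p_cond_yx / p_cond_y)
    return te

# Usage: when y simply lags x by one step, x's past is informative about
# y's future, so TE(X -> Y) is positive; for a constant pair it is zero.
x = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
y = [0] + x[:-1]
print(transfer_entropy(x, y))
```

In a QBH setting, such a measure could be evaluated along the extracted feature trajectories of the query and each database fragment, yielding a drift signature that complements direct feature-space matching.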