Abstract
Research and services for automatic music classification and recommendation have been active in recent years. However, it is often unclear which metadata and acoustic features contribute most to the feasibility of music classification and recommendation. Motivated by this question, we are working on visualizing music pieces using metadata, acoustic features, machine learning methods, and visualization techniques that are effective for music classification tasks, and exploring whether new relationships between acoustic features and metadata can be discovered through visualization. Specifically, we calculated the acoustic features of a set of songs using music analysis tools and machine learning techniques, and visualized the distribution of those features together with the metadata. In this paper, we present experimental results visualizing the relationship between acoustic features and metadata, including release year, composer name, and artist name.
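As a minimal sketch of the kind of pipeline the abstract describes: once acoustic feature vectors have been extracted for a set of songs (the specific features and tools are not detailed here), they can be projected to two dimensions and plotted against a metadata attribute such as release year. The PCA projection below is an illustrative stand-in, not necessarily the paper's visualization method, and the feature matrix is synthetic.

```python
import numpy as np

def project_features_2d(features: np.ndarray) -> np.ndarray:
    """Project high-dimensional acoustic feature vectors to 2-D with PCA
    (computed via SVD), so each song becomes a point that can be colored
    by a metadata attribute such as release year or artist name."""
    centered = features - features.mean(axis=0)
    # Right singular vectors of the centered matrix are the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T

# Hypothetical example: 100 songs with 20-dimensional feature vectors.
rng = np.random.default_rng(0)
features = rng.normal(size=(100, 20))
coords = project_features_2d(features)
print(coords.shape)  # one 2-D point per song: (100, 2)
```

The resulting `coords` array could then be passed to any scatter-plot routine, with point color or shape driven by the metadata field under study.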