In this paper, we propose an emotion-based music classification system that uses features derived from physiological signals. The proposed system integrates two functions: the first uses physiological sensors to recognize the emotions of users listening to music, and the second classifies music according to the feelings it evokes in listeners, without requiring physiological sensors. To predict a user's emotions directly from data acquired through wearable physiological sensors, we developed and implemented a deep neural network based on a hierarchical inner attention mechanism. To spare users from wearing physiological sensors every time they receive content recommendations, a regression neural network learns the relation between emotion-specific features extracted from previously recorded physiological signals and musical features extracted from the music itself. Based on these models, the proposed system automatically classifies input music according to users' emotional reactions without measuring physiological signals. The experimental results not only demonstrate the accuracy of the proposed automatic music classification framework but also offer a new perspective in which emotion-related characteristics of human experience are applied to artificial-intelligence-based content classification.
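To make the two-stage design concrete, the following is a minimal sketch in PyTorch of the pipeline described above: a hierarchical inner-attention network that pools each physiological signal over time and then attends across signals to predict emotion, and a regression network that maps musical features to emotion-specific physiological features so that the classifier can be applied without sensors. All class names, feature dimensions, and signal counts here are illustrative assumptions, not the paper's actual implementation.

    import torch
    import torch.nn as nn

    class InnerAttention(nn.Module):
        """Attention pooling: score each step, return the weighted sum."""
        def __init__(self, dim):
            super().__init__()
            self.score = nn.Linear(dim, 1)

        def forward(self, x):                          # x: (batch, steps, dim)
            w = torch.softmax(self.score(x), dim=1)    # attention weights over steps
            return (w * x).sum(dim=1)                  # (batch, dim)

    class HierarchicalEmotionNet(nn.Module):
        """Per-signal GRU + inner attention, then attention across signals (hypothetical)."""
        def __init__(self, n_signals=4, hidden=64, n_emotions=4):
            super().__init__()
            self.encoders = nn.ModuleList(
                [nn.GRU(1, hidden, batch_first=True) for _ in range(n_signals)])
            self.time_attn = nn.ModuleList(
                [InnerAttention(hidden) for _ in range(n_signals)])
            self.signal_attn = InnerAttention(hidden)
            self.classifier = nn.Linear(hidden, n_emotions)

        def forward(self, signals):                    # signals: (batch, n_signals, steps)
            pooled = []
            for i, (enc, attn) in enumerate(zip(self.encoders, self.time_attn)):
                h, _ = enc(signals[:, i, :].unsqueeze(-1))  # (batch, steps, hidden)
                pooled.append(attn(h))                      # lower level: within-signal attention
            stacked = torch.stack(pooled, dim=1)            # (batch, n_signals, hidden)
            fused = self.signal_attn(stacked)               # higher level: across-signal attention
            return self.classifier(fused)                   # emotion logits

    class MusicToEmotionRegressor(nn.Module):
        """Regression net: musical features -> emotion-specific physiological features."""
        def __init__(self, n_music_feats=40, n_emotion_feats=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_music_feats, 128), nn.ReLU(),
                nn.Linear(128, n_emotion_feats))

        def forward(self, music_feats):
            return self.net(music_feats)

    if __name__ == "__main__":
        emo_net = HierarchicalEmotionNet(n_signals=4, hidden=64, n_emotions=4)
        regressor = MusicToEmotionRegressor(n_music_feats=40, n_emotion_feats=64)
        music = torch.randn(8, 40)                     # batch of music feature vectors
        logits = emo_net.classifier(regressor(music))  # sensor-free classification path
        print(logits.shape)                            # torch.Size([8, 4])

In this sketch, the sensor-free path simply reuses the emotion classifier's final layer on the regressed features; whether the actual system shares parameters in this way is an assumption made only for illustration.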