Abstract

The teaching of ideological and political theory courses and daily ideological and political education are two important components of college student education. With the iterative advance of information technology, the individualized development of students, and the reform and innovation of ideological and political education, higher goals and requirements have been set for this field. Some universities have developed new teaching models, but they have neglected the evaluation module and paid little attention to their own development: they noted that the new models injected fresh blood into the reform of the education model and of ideological education, but ignored the improvement of their own quality. Owing to these limitations, the learning effect has been unsatisfactory. With these issues in view, this article takes the concepts of deep learning and of college students' ideological and political education as its starting point, and then analyzes the new precise and personalized concepts, the new forms of intelligent teaching and evaluation, and the new models of intelligent learning that deep learning brings to college students' ideological and political education. This constitutes a new path of intelligent linkage among subject, object, and mediator. It can deepen the reform of education and teaching toward individualized, accurate, interactive, and vivid ideological and political education for college students, and improve its evaluation and management. The experimental results demonstrate the effectiveness of the proposed approach.

Highlights

  • Music is an abstract art that uses sound as a means of expression to reflect human emotions in real life [1].

  • Classifying a large digital music resource library by manual labeling would consume a great deal of manpower and time; moreover, manual labels are subjective, and labeling standards cannot be fully unified, since they depend on the professionals who label the music. Therefore, the automatic classification of music [5,6,7] has gradually become a research hotspot. The automatic classification of music genres can effectively solve the problems of costly and time-consuming human labeling: through an algorithm, a unified classification standard can be formulated, the algorithm can be continuously optimized, and highly accurate and objective classification results can be obtained.

  • For the recognition and classification of musical instruments, this study proposes a traditional Chinese musical instrument recognition and classification algorithm based on the deep belief network in deep learning. The deep belief network is used for the feature extraction task on traditional Chinese instrumental music [21], which reduces the work of manually extracting and identifying features.
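The deep-belief-network feature extraction mentioned above is conventionally realized as greedy layer-wise pretraining of stacked restricted Boltzmann machines (RBMs). The toy sketch below illustrates that idea only; the random input standing in for binarized audio features, the layer sizes, learning rate, and training-step counts are all illustrative assumptions, not the configuration used in the study.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Restricted Boltzmann Machine trained with one-step contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible bias
        self.b_h = np.zeros(n_hidden)    # hidden bias
        self.lr = lr
        self.rng = rng

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def train_step(self, v0):
        # Positive phase: hidden activations driven by the data
        h0 = self.hidden_probs(v0)
        # Negative phase: sample hidden units, reconstruct, re-infer hidden units
        h_sample = (self.rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h_sample)
        h1 = self.hidden_probs(v1)
        # CD-1 parameter updates (data statistics minus model statistics)
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)

# Greedy layer-wise pretraining of two stacked RBMs, DBN-style.
rng = np.random.default_rng(1)
X = (rng.random((200, 64)) < 0.3).astype(float)  # stand-in for binarized audio features

rbm1 = RBM(64, 32)
for _ in range(50):
    rbm1.train_step(X)
H1 = rbm1.hidden_probs(X)          # layer-1 features become layer-2 input

rbm2 = RBM(32, 16)
for _ in range(50):
    rbm2.train_step(H1)
features = rbm2.hidden_probs(H1)   # final DBN features for a downstream classifier
print(features.shape)              # (200, 16)
```

In a real system, the resulting `features` would feed a supervised classifier (and the stack would typically be fine-tuned with backpropagation), replacing hand-engineered feature extraction.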


Summary

Introduction

Music is an abstract art that uses sound as a means of expression to reflect human emotions in real life [1]. Music genre classification is an important research topic in the field of music information retrieval. Genre classification has mostly relied on manual labeling; with the continuous emergence of new music and online uploads, however, the digital music resource library on the Internet has grown increasingly large, and manual labeling can no longer meet the demand. The automatic classification of music genres can effectively solve the problems of costly and time-consuming human labeling. Different genres have distinct origins and instrumentation. The blues, for example, originated from the amateur music of poor black slaves in the southern United States; it began as unaccompanied, emotionally expressive solo singing and later combined with the European chord structure to form music in which singing and guitar alternate. Classical music, by contrast, uses many types of instruments, including woodwind, brass, percussion, keyboard, bowed, and plucked stringed instruments.

