Abstract

Traditional Chinese painting (TCP) is culturally significant, reflecting China’s rich history and aesthetics. In recent years, TCP classification has achieved impressive performance, but obtaining accurate annotations for these tasks is time-consuming and expensive, requiring professional art experts. To address this challenge, we present a semi-supervised learning (SSL) method for traditional painting classification that achieves strong results even with a limited number of labels. To improve global representation learning, we employ the self-attention-based MobileViT model as the backbone network. Furthermore, we introduce a data augmentation strategy, Random Brushwork Augment (RBA), which integrates brushwork to enhance performance. Comparative experiments confirm the effectiveness of the proposed method, TCP-RBA, in Chinese painting classification, achieving an accuracy of 88.27% on the test dataset with only 10 labeled samples, one per class.
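The sketch below is a minimal, illustrative example of the kind of pipeline the abstract describes: a MobileViT backbone (here taken from the timm library) trained with a pseudo-labeling SSL objective, plus a placeholder brushwork-style augmentation. It is not the authors' implementation; the `RandomBrushwork` transform, the FixMatch-style thresholding, and all hyperparameters are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the paper's code): pseudo-labeling SSL with a
# MobileViT backbone and a hypothetical brushwork-style augmentation.
import random

import timm
import torch
import torch.nn.functional as F
from PIL import Image, ImageDraw
from torchvision import transforms


class RandomBrushwork:
    """Hypothetical stand-in for RBA: overlays a few random ink-like strokes."""

    def __init__(self, num_strokes: int = 3, p: float = 0.5):
        self.num_strokes, self.p = num_strokes, p

    def __call__(self, img: Image.Image) -> Image.Image:
        if random.random() > self.p:
            return img
        img = img.copy()
        draw = ImageDraw.Draw(img)
        w, h = img.size
        for _ in range(self.num_strokes):
            xy = [(random.uniform(0, w), random.uniform(0, h)) for _ in range(2)]
            draw.line(xy, fill=(30, 30, 30), width=random.randint(2, 8))
        return img


# Weak augmentation for generating pseudo-labels, strong augmentation
# (with the brushwork perturbation) for the consistency branch.
weak_aug = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
strong_aug = transforms.Compose([
    transforms.Resize((256, 256)),
    RandomBrushwork(num_strokes=3, p=0.8),
    transforms.ToTensor(),
])

# MobileViT-S classifier; 10 classes assumed for illustration.
model = timm.create_model("mobilevit_s", pretrained=False, num_classes=10)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)


def ssl_step(labeled_x, labeled_y, unlabeled_weak, unlabeled_strong,
             threshold: float = 0.95, lambda_u: float = 1.0) -> float:
    """One pseudo-labeling update: supervised loss on the few labeled images
    plus a consistency loss on confidently pseudo-labeled unlabeled images."""
    sup_loss = F.cross_entropy(model(labeled_x), labeled_y)

    with torch.no_grad():
        probs = F.softmax(model(unlabeled_weak), dim=1)
        conf, pseudo = probs.max(dim=1)
        mask = conf >= threshold  # keep only high-confidence pseudo-labels

    unsup_loss = (F.cross_entropy(model(unlabeled_strong), pseudo,
                                  reduction="none") * mask.float()).mean()

    loss = sup_loss + lambda_u * unsup_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```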
