Abstract
As a form of cultural art, calligraphy and painting are not only an important part of traditional culture but also carry significant value for art collection and trade. The existence of forgeries has seriously affected the fair trade, protection, and inheritance of calligraphy and painting, so an efficient, accurate, and intelligent technical identification method is urgently needed. Combining the material-attribute recognition and imaging capabilities of hyperspectral imaging with the powerful feature-expression and classification abilities of convolutional neural networks can greatly improve the overall efficiency of calligraphy and painting identification. Meanwhile, to reduce the redundancy and the number of parameters incurred by using hyperspectral images directly, an objective dimensionality reduction method should be applied to compress the original hyperspectral image before deep learning. On this basis, we propose a deep learning method that classifies author and authenticity from the multichannel images obtained by applying minimum noise fraction (MNF) dimensionality reduction to calligraphy and painting hyperspectral data; its core is a 2D-CNN or 3D-CNN model with a basic network of "4 convolution layers + 4 pooling layers + 2 fully connected layers." The experimental results show that 2D-CNN identification with an MNF pseudocolor image mosaic as input and 2D-CNN identification with the multichannel MNF-reduced images as direct input both achieve high accuracy, while 3D-CNN identification with the multichannel MNF-reduced images as direct input not only maintains excellent identification accuracy but also converges in fewer steps and more stably than the 2D-CNN model.
In particular, the 3D-CNN identification accuracy for a work's author and authenticity on the test set reaches 93.2% and 95.2%, respectively.
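The MNF transform mentioned above can be understood as a noise-whitened principal component analysis: estimate the noise covariance (here from horizontal neighbor differences), whiten the data with it, and then take the leading eigenvectors of the whitened signal covariance so that components are ordered by signal-to-noise ratio. The following is a minimal sketch of that procedure, assuming a cube laid out as (height, width, bands); the noise-estimation scheme and all names are illustrative, not taken from the paper:

```python
import numpy as np

def mnf(cube, n_components):
    """Minimum noise fraction transform of a (H, W, B) hyperspectral cube.

    Returns a (H, W, n_components) array whose channels are ordered by
    decreasing signal-to-noise ratio.
    """
    H, W, B = cube.shape
    X = cube.reshape(-1, B).astype(float)
    X -= X.mean(axis=0)

    # Noise estimate from horizontal neighbor differences (a common,
    # assumed choice; the paper does not specify its estimator).
    noise = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, B) / np.sqrt(2.0)
    Sn = np.cov(noise, rowvar=False)   # noise covariance (B x B)
    S = np.cov(X, rowvar=False)        # signal covariance (B x B)

    # Whiten with respect to the noise: Wh.T @ Sn @ Wh = I.
    evals_n, evecs_n = np.linalg.eigh(Sn)
    Wh = evecs_n / np.sqrt(np.maximum(evals_n, 1e-12))

    # Eigendecompose the noise-whitened signal covariance; the largest
    # eigenvalues correspond to the highest-SNR components.
    evals, evecs = np.linalg.eigh(Wh.T @ S @ Wh)
    order = np.argsort(evals)[::-1]
    T = Wh @ evecs[:, order[:n_components]]

    return (X @ T).reshape(H, W, n_components)
```

Feeding the resulting few-channel image to the CNN, instead of the full band stack, is what keeps the network's input redundancy and parameter count down.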
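To see why MNF compression matters for the network size, one can do a quick shape-and-parameter accounting for the "4 convolution layers + 4 pooling layers + 2 fully connected layers" backbone. The patch size (128x128), MNF channel count (10), filter counts, 3x3 "same"-padded kernels, and 2x2 pooling below are illustrative assumptions, not values reported by the paper:

```python
# Hypothetical parameter accounting for a 2D-CNN with 4 conv + 4 pool
# + 2 fully connected layers. All sizes are assumptions for illustration.
def count_2dcnn(h=128, w=128, in_ch=10,
                filters=(32, 64, 128, 256), fc_hidden=256, n_classes=2):
    params = 0
    ch = in_ch
    for f in filters:
        params += (3 * 3 * ch + 1) * f   # 3x3 conv weights + bias
        ch = f
        h, w = h // 2, w // 2            # 2x2 pooling halves each spatial dim
    flat = h * w * ch                    # flattened feature vector length
    for a, b in ((flat, fc_hidden), (fc_hidden, n_classes)):
        params += (a + 1) * b            # fully connected weights + bias
    return (h, w, flat, params)

print(count_2dcnn())  # -> (8, 8, 16384, 4585506)
```

Under these assumptions the fully connected layers dominate the parameter count, and that count scales with the number of input channels only through the first convolution, which is why compressing hundreds of spectral bands to a handful of MNF channels keeps the model small without discarding discriminative material information.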