Abstract

Ancient Chinese paintings suffer varying degrees of damage during long-term preservation. Manual restoration is inefficient, demands a high level of expertise, and risks causing secondary damage. To address these problems, UGAN, an ancient painting restoration model based on an improved Generative Adversarial Network (GAN), is proposed. The model uses a GAN as the overall framework; the generator adopts a U-shaped network (U-Net) with a dilated convolution-gated residual block (DCGR-Block) added at its middle stage, which strengthens the model's ability to extract both shallow and deep feature information from the image. In the experiments, the test results are compared with current mainstream methods. The comparison shows that, when repairing overall missing regions, the peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) of UGAN improve by 8.14% and 4.79% on average over globally and locally consistent image completion (GLCIC). When the missing rate is higher, the proposed network also outperforms comparable mainstream algorithms.
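
The abstract does not give layer-level details of the DCGR-Block, but one plausible reading is a residual block whose dilated-convolution features are modulated by a learned gate. The sketch below is a minimal PyTorch illustration under assumed choices (3x3 kernels, dilation 2, sigmoid gating, ReLU activation); the class name DCGRBlock and its internals are hypothetical, not the authors' implementation.

```python
import torch
import torch.nn as nn


class DCGRBlock(nn.Module):
    """Hypothetical dilated convolution-gated residual block (DCGR-Block).

    Assumed structure: a dilated-convolution feature branch is weighted by a
    sigmoid gate branch, and the gated result is added back to the input as a
    residual, preserving shallow features while enlarging the receptive field.
    """

    def __init__(self, channels: int, dilation: int = 2):
        super().__init__()
        # Dilated convolution enlarges the receptive field without downsampling.
        self.feature = nn.Conv2d(channels, channels, kernel_size=3,
                                 padding=dilation, dilation=dilation)
        # Parallel gate branch producing per-pixel weights in (0, 1).
        self.gate = nn.Conv2d(channels, channels, kernel_size=3,
                              padding=dilation, dilation=dilation)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Gated features: activation of the feature branch weighted by the gate.
        gated = self.act(self.feature(x)) * torch.sigmoid(self.gate(x))
        # Residual connection keeps the block easy to optimize at the bottleneck.
        return x + gated


if __name__ == "__main__":
    block = DCGRBlock(channels=256, dilation=2)
    y = block(torch.randn(1, 256, 32, 32))
    print(y.shape)  # torch.Size([1, 256, 32, 32])
```

In this reading, several such blocks would sit between the U-Net encoder and decoder; the channel count, dilation rate, and number of blocks are illustrative placeholders rather than values reported by the paper.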
