Abstract

Historical Chinese character recognition faces challenges such as low image quality and a lack of labeled training samples. We propose TH-GAN, a transfer learning method based on a generative adversarial network (GAN), to alleviate these problems. The TH-GAN architecture consists of a discriminator and a generator. The discriminator is built on a convolutional neural network (CNN), and, inspired by the Wasserstein GAN, its loss measures the distance between the distributions of the generated images and the target images. The generator is a CNN-based encoder-decoder whose loss aims to minimize the distribution distance between the real samples and the generated samples. To preserve the complex glyph structure of historical Chinese characters, a weighted mean squared error (MSE) criterion that incorporates both edge and skeleton information from the ground truth image is proposed as the weighted pixel loss of the generator. These losses are used to jointly train the discriminator and the generator. Experiments on two tasks evaluate the performance of the proposed TH-GAN: the first is style transfer mapping for multi-font printed traditional Chinese character samples, and the second is transfer learning for historical Chinese character samples augmented with samples generated by TH-GAN. Experimental results show that the proposed TH-GAN is effective.
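
To make the loss terms concrete, the sketch below shows one way a weighted pixel loss and Wasserstein-style adversarial objectives could be written in PyTorch. This is a minimal illustration under stated assumptions, not the paper's implementation: the function names, the masks `edge_mask` and `skeleton_mask`, and the weights `w_edge` and `w_skel` are placeholders, and the WGAN terms follow the standard formulation for orientation only.

```python
import torch

def weighted_mse_loss(generated, target, edge_mask, skeleton_mask,
                      w_edge=2.0, w_skel=2.0):
    # Per-pixel weights: 1 everywhere, boosted on edge and skeleton pixels
    # of the ground-truth glyph. The masks are assumed to be 0/1 tensors
    # with the same shape as the images; the weight values are placeholders.
    weights = 1.0 + w_edge * edge_mask + w_skel * skeleton_mask
    return torch.mean(weights * (generated - target) ** 2)

def wgan_losses(d_real, d_fake):
    # Standard Wasserstein GAN objectives: the discriminator (critic)
    # minimizes d_loss, the generator minimizes g_loss.
    d_loss = torch.mean(d_fake) - torch.mean(d_real)
    g_loss = -torch.mean(d_fake)
    return d_loss, g_loss
```

In joint training, the generator would typically be optimized on the sum of its adversarial loss and the weighted pixel loss, with the relative weighting treated as a hyperparameter.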
