Data scarcity in paleographic image datasets poses a significant challenge to researchers and scholars in the field. Unlike modern printed texts, historical manuscripts and documents are scarce and fragile, which makes them difficult to digitize into comprehensive datasets. Recently, many advances have been made in single-image generative models for natural images, but none of them focus on paleographic character images or other handwritten datasets. In paleographic images such as stone-inscription characters, preserving the exact shape and structure of each character is essential, unlike in natural images. In this paper, we propose CharGAN, an unconditional single-image generative model for isolated paleographic character images. In the proposed system, augmented images are generated from a single image using generative adversarial networks while preserving the character's structure. Specifically, an external augmentation inducer is used to create higher-level augmentations in the generated images. In addition, the input to the generator is replaced with dynamic sampling from a Gaussian mixture model to introduce variation in the low-level features. Our experimental results indicate that these two enhancements make single-image generative models suitable not only for natural images but also for paleographic character images and other handwritten character datasets, such as AHCD and EMNIST, where the global structure is important. Both the qualitative and quantitative results show that our approach is effective and superior in single-image generative tasks, particularly in isolated character image generation.
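As a rough illustration of the second modification, the sketch below shows how generator inputs could be drawn from a Gaussian mixture model rather than a single standard normal. The function name, latent dimension, and mixture parameters are assumptions for illustration only; the abstract does not specify the paper's actual mixture configuration.

```python
import numpy as np

def sample_gmm_latents(n_samples, latent_dim, n_components=3, rng=None):
    """Draw generator inputs from a Gaussian mixture instead of a single
    standard normal. All mixture parameters here are illustrative
    placeholders, not values taken from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    # Hypothetical per-component means and scales (assumptions).
    means = rng.uniform(-1.0, 1.0, size=(n_components, latent_dim))
    scales = rng.uniform(0.5, 1.0, size=(n_components, latent_dim))
    weights = np.full(n_components, 1.0 / n_components)
    # Pick a mixture component per sample, then draw from that Gaussian.
    comps = rng.choice(n_components, size=n_samples, p=weights)
    noise = rng.standard_normal((n_samples, latent_dim))
    return means[comps] + scales[comps] * noise

# Example: a batch of 16 latent vectors fed to the generator.
z = sample_gmm_latents(16, latent_dim=100)
print(z.shape)  # (16, 100)
```

Resampling the mixture parameters over time (the "dynamic" aspect mentioned in the abstract) would vary the low-level features of the generated characters while the augmentation inducer handles higher-level changes.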