Abstract

Writer identification has progressed steadily in recent decades owing to its widespread applications. Scenarios with extensive handwriting data, such as page-level or sentence-level identification, have achieved satisfactory accuracy; however, word-level offline writer identification remains challenging because of the difficulty of learning good feature representations from scant handwriting data. This paper proposes a new Residual Swin Transformer Classifier (RSTC), which comprehensively aggregates local and global handwriting styles and yields robust feature representations from single-word images. Local information is modeled by the Transformer Block through interactions between strokes. Global information is featurized through holistic encoding by the Identity Branch and the Global Block. Moreover, a pre-training technique is exploited to transfer reusable knowledge learned from a task similar to writer identification, strengthening the model's representation of handwriting features. The proposed method is tested on the IAM and CVL benchmark datasets and achieves state-of-the-art performance, demonstrating RSTC's superior modeling capability for word-level writer identification.
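The abstract does not include an implementation, but the local/global aggregation idea it describes can be illustrated with a toy NumPy sketch: patch tokens from a word image interact through a residual self-attention block (local style), while a pooled identity shortcut and a pooled attended encoding stand in for the Identity Branch and Global Block. All dimensions, pooling choices, and function names here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def transformer_block(tokens, d):
    # single-head self-attention with a residual connection:
    # lets stroke patches of one word image interact (local style modeling)
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(q @ k.T / np.sqrt(d))
    return tokens + attn @ v  # residual connection

# a word image split into N patch tokens of dimension d (toy numbers)
N, d = 16, 32
tokens = rng.standard_normal((N, d))

local = transformer_block(tokens, d)  # local: stroke interactions
identity = tokens.mean(axis=0)        # identity branch: holistic shortcut (assumed mean-pooling)
global_feat = local.mean(axis=0)      # global block: pooled attended encoding (assumed)

# aggregated per-word writer descriptor combining global and identity features
writer_feature = np.concatenate([global_feat, identity])
print(writer_feature.shape)  # (64,)
```

In this sketch the residual connection plays the role the "Residual" in RSTC suggests: raw patch features survive alongside the attended ones, so the final descriptor mixes holistic and interaction-based information even for a single short word.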
