Abstract
Stroke extraction and matching are critical for structural-interpretation-based applications of handwritten Chinese characters, such as Chinese character education and calligraphy analysis. Stroke extraction from offline handwritten Chinese characters is difficult because of the loss of temporal information, multi-stroke structures, and the distortion of handwritten shapes. In this paper, we propose a comprehensive scheme for solving the stroke extraction problem for handwritten Chinese characters. The method consists of three main steps: (1) fully convolutional network (FCN) based skeletonization; (2) query-pixel-guided stroke extraction; (3) model-based stroke matching. Specifically, based on a recently proposed FCN architecture, the stroke skeletons and the cross regions are first extracted from the character image by the proposed SkeNet and CrossNet, respectively. Stroke extraction is solved by simulating the human perception that, given a pixel from the non-cross region of a stroke, the whole stroke containing that pixel can be traced. To realize this idea, we formulate stroke extraction as the problem of pairing and connecting skeleton-wise stroke segments adjacent to the same cross region, where the pairing consistency between stroke segments is measured using a PathNet [1]. To reduce the ambiguity of stroke extraction, the extracted candidate strokes are matched by tree search against a character model consisting of standard strokes to identify the correct strokes. To verify the effectiveness of the proposed method, we train and test our models on character images with stroke segmentation annotations generated from the online handwriting datasets CASIA-OLHWDB and ICDAR13-Online, as well as a dataset of regularly written online handwritten characters (RW-OLHWDB). The experimental results demonstrate the effectiveness of the proposed method and provide several benchmarks.
In particular, the stroke extraction precisions on ICDAR13-Online and RW-OLHWDB are 89.0% and 94.9%, respectively.