Abstract

With the increasing effectiveness of one-/few-shot learning techniques for handwritten character generation and recognition, the call to extend the commonly associated Omniglot challenge is becoming more pressing. However, the sequential Omniglot dataset contains unrealistically written characters. Therefore, we present new data, a new challenge, and a new model as follows: On the data side, we introduce DigiLeTs, a dataset containing 23,870 new trajectories of Latin letters and Arabic numerals from 77 participants, with high natural variance within character types. On the challenge side, we introduce the task of imitating handwriting styles in a one-shot manner. On the model side, we extend a generative, recurrent neural network model equipped with a one-shot inference mechanism that allows previously extracted compositional encodings to be reused and that has already shown promise on the original Omniglot challenge. The new model is able to reassemble previously learned components into new characters in new styles in a one-shot manner. Most surprisingly, in a strictly forward manner, it often generates plausible, unknown characters in an already known style. With this work, we hope to inspire future research into how compositional structures develop and are employed for rapid, concept-oriented learning, imitation, and understanding.
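
To make the described setting concrete, the following is a minimal, illustrative sketch (in PyTorch) of one-shot style imitation on pen trajectories: a recurrent encoder compresses a single observed trajectory of (dx, dy, pen) steps into a latent code, and a recurrent decoder re-generates a trajectory conditioned on that code. All module names, dimensions, and the trajectory format are assumptions for illustration; this is not the authors' model.

    # Minimal sketch (assumed setup, not the paper's implementation):
    # encode one example trajectory into a latent code, then let a
    # recurrent decoder regenerate a pen trajectory conditioned on it.
    import torch
    import torch.nn as nn

    class TrajectoryEncoder(nn.Module):
        """Summarizes one (dx, dy, pen) trajectory into a fixed-size code."""
        def __init__(self, in_dim=3, hidden=128, code_dim=64):
            super().__init__()
            self.rnn = nn.GRU(in_dim, hidden, batch_first=True)
            self.to_code = nn.Linear(hidden, code_dim)

        def forward(self, traj):                 # traj: (B, T, 3)
            _, h = self.rnn(traj)                # h: (1, B, hidden)
            return self.to_code(h[-1])           # (B, code_dim)

    class TrajectoryDecoder(nn.Module):
        """Generates pen steps autoregressively, conditioned on the code."""
        def __init__(self, code_dim=64, hidden=128, out_dim=3):
            super().__init__()
            self.rnn = nn.GRU(out_dim + code_dim, hidden, batch_first=True)
            self.to_step = nn.Linear(hidden, out_dim)

        def forward(self, code, steps=60):
            B = code.size(0)
            prev = torch.zeros(B, 1, 3)          # start step: zero offset, pen up
            h, outputs = None, []
            for _ in range(steps):
                inp = torch.cat([prev, code.unsqueeze(1)], dim=-1)
                out, h = self.rnn(inp, h)
                prev = self.to_step(out)         # next (dx, dy, pen) step
                outputs.append(prev)
            return torch.cat(outputs, dim=1)     # (B, steps, 3)

    # One-shot usage: infer a code from a single example, then regenerate.
    encoder, decoder = TrajectoryEncoder(), TrajectoryDecoder()
    example = torch.randn(1, 60, 3)              # one observed trajectory
    with torch.no_grad():
        code = encoder(example)
        imitation = decoder(code)
    print(imitation.shape)                       # torch.Size([1, 60, 3])

In this toy setup, one-shot imitation amounts to running the encoder on a single example and decoding from the resulting code; the compositional reuse of encodings described in the abstract would require additional structure beyond this sketch.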
