Abstract

Pantun is a traditional Malay poem consisting of four lines: two lines of deliverance and two lines of message. The ending word of each line in a pantun forms an ABAB rhyme pattern. In this work, we automatically generated Indonesian pantun by applying two existing generative models: Sequential GAN (SeqGAN) and Generative Pre-trained Transformer 2 (GPT-2). We also created a 13K Indonesian pantun dataset by collecting pantun from various sources. We evaluated how well each model produced pantun in terms of well-formedness, measured by two aspects: structure and rhyme. GPT-2 outperforms SeqGAN by a margin of 27.57% in forming the structure and by 22.79% in producing the rhyme pattern.
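The abstract does not specify how the structure and rhyme criteria are operationalized, so the following is only a minimal sketch of how an ABAB well-formedness check could look. It assumes a hypothetical criterion in which two line-ending words rhyme when their last two characters match; the function names (`ending_word`, `rhymes`, `is_abab`) and the suffix-based rule are illustrative assumptions, not the paper's actual evaluation method.

```python
# Hypothetical sketch: check a four-line pantun for structure and ABAB end-rhyme.
# Assumption: two ending words "rhyme" if their last two characters match,
# a rough stand-in for whatever syllable-based criterion the paper uses.

def ending_word(line: str) -> str:
    """Return the last word of a line, lowercased and stripped of punctuation."""
    words = line.strip().rstrip(".,!?;:").split()
    return words[-1].lower() if words else ""

def rhymes(a: str, b: str, suffix_len: int = 2) -> bool:
    """Crude rhyme test: the last `suffix_len` characters must match."""
    return a != "" and a[-suffix_len:] == b[-suffix_len:]

def is_abab(pantun: str) -> bool:
    """Structure check (exactly four non-empty lines) plus ABAB rhyme check."""
    lines = [l for l in pantun.strip().splitlines() if l.strip()]
    if len(lines) != 4:
        return False
    ends = [ending_word(l) for l in lines]
    return rhymes(ends[0], ends[2]) and rhymes(ends[1], ends[3])

if __name__ == "__main__":
    sample = """Kalau ada jarum yang patah
Jangan disimpan di dalam peti
Kalau ada kata yang salah
Jangan disimpan di dalam hati"""
    print(is_abab(sample))  # True: patah/salah and peti/hati share endings
```

A generated pantun that passes both checks would count as well-formed under this simplified criterion; a real evaluator would likely use phonetic syllables rather than raw character suffixes.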
