Abstract

Text generation has recently gained considerable importance in the field of Natural Language Processing (NLP). In particular, the quality of generated text has reached high levels with the emergence of transformer-based models, and controllable text generation has accordingly become an important research area. Various methods have been applied for controllable text generation, but because most of them target the Recurrent Neural Network (RNN) based encoder-decoder models that were previously common, studies using transformer-based models remain few. Transformer-based models are very successful on long sequences thanks to their ability to process tokens in parallel. This study aimed to generate Turkish reviews on desired topics using a transformer-based language model. We added the topic information to the sequential input by concatenating the input token embedding with the topic (control) embedding at each time step during training. As a result, we were able to generate Turkish reviews on the specified topics.
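The conditioning scheme described in the abstract — concatenating a topic (control) embedding onto the token embedding at every time step — can be sketched as follows. This is a minimal illustration with NumPy; all dimensions, table sizes, and names are assumptions for the sketch, not values from the paper:

```python
import numpy as np

# Illustrative sizes (assumptions; the abstract does not specify dimensions).
vocab_size, num_topics = 1000, 5
d_token, d_topic = 64, 8

rng = np.random.default_rng(0)
token_table = rng.normal(size=(vocab_size, d_token))  # token embedding table
topic_table = rng.normal(size=(num_topics, d_topic))  # topic (control) embedding table

def conditioned_inputs(token_ids, topic_id):
    """Build the model input by concatenating each token's embedding
    with the same topic embedding at every time step."""
    tokens = token_table[token_ids]                        # (seq_len, d_token)
    topic = np.broadcast_to(topic_table[topic_id],
                            (len(token_ids), d_topic))     # topic repeated per step
    return np.concatenate([tokens, topic], axis=-1)        # (seq_len, d_token + d_topic)

# Example: a 3-token sequence conditioned on topic 2.
x = conditioned_inputs([3, 17, 42], topic_id=2)
```

The resulting matrix of shape `(seq_len, d_token + d_topic)` would then be fed to the transformer in place of the plain token embeddings, so every position carries the topic signal.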
