Abstract

Neural text generation is the process of training a neural network to generate human-understandable text (poems, stories, articles). Recurrent Neural Networks and Long Short-Term Memory networks are powerful sequence models that are well suited to this kind of task. In this paper, we developed two types of language models, one generating news articles and the other generating poems in the Macedonian language. We developed and tested several different model architectures, including a transfer-learning model, since text generation requires a lot of processing time. As the evaluation metric we used ROUGE-N (Recall-Oriented Understudy for Gisting Evaluation), where the generated text was tested against a reference text written by an expert. The results showed that even though the generated text had flaws, it was human-understandable and consistent across sentences. To the best of our knowledge, this is the first attempt at automatic text generation (poems and articles) in the Macedonian language using Deep Learning.
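For readers unfamiliar with the metric named in the abstract, ROUGE-N recall is the fraction of reference n-grams that also appear in the generated text. A minimal sketch of this computation (whitespace tokenization and function names are illustrative assumptions, not the paper's implementation):

```python
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def rouge_n(candidate, reference, n=1):
    """ROUGE-N recall: clipped count of reference n-grams found in the
    candidate, divided by the total number of reference n-grams."""
    cand = Counter(ngrams(candidate.split(), n))
    ref = Counter(ngrams(reference.split(), n))
    if not ref:
        return 0.0
    overlap = sum(min(count, cand[gram]) for gram, count in ref.items())
    return overlap / sum(ref.values())
```

For example, `rouge_n("the cat sat on the mat", "the cat is on the mat", n=1)` gives 5/6, since every reference unigram except "is" occurs in the candidate.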
