Abstract

This article presents the impact of GPT natural language processing models, and of the evolution of AI more broadly, on computational literature. Analyzing the structure and functioning of these neural networks, which rest on observation, learning (pre-training), and NLP mechanisms, we describe the features of GPT models, the range of tasks they can perform, and the advantages and risks of applying them to literary creation. By processing enormous amounts of text to learn how the relationships between words in natural language are structured, GPT models can generate both scientific and literary texts. Examples of such AI-generated works, presented in this study, include 1 the Road by Ross Goodwin and Sunspring by Oscar Sharp and Ross Goodwin. The use of GPT models in the creative process is also manifested in a set of tools such as Talk to Transformer, GPT-3 Creative Fiction, Copy.ai, AI Dungeon, and others. Researchers such as Thomas Hornigold and Mark Riedl warn that GPT models, although well-versed in many fields, cannot simulate human emotional intelligence, creativity, or narrative intelligence, remaining creative tools rather than perfect creators.
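The "learn word relationships from text, then generate" idea mentioned above can be illustrated with a deliberately tiny sketch. The toy bigram model below is an assumption-laden simplification: real GPT models are transformer networks trained over subword tokens, not word-count tables, but the autoregressive loop (predict the next token, append it, repeat) is the same in spirit.

```python
from collections import defaultdict

# Toy illustration of the autoregressive principle behind GPT-style text
# generation: learn which words tend to follow which from a corpus, then
# generate by repeatedly predicting the next word.
# NOTE: this is a bigram-count sketch, not how GPT models actually work.

def train_bigrams(corpus):
    """Count word-to-next-word transitions in the corpus."""
    counts = defaultdict(lambda: defaultdict(int))
    words = corpus.split()
    for w1, w2 in zip(words, words[1:]):
        counts[w1][w2] += 1
    return counts

def generate(counts, start, length=4):
    """Greedily emit the most frequent successor at each step."""
    out = [start]
    for _ in range(length - 1):
        successors = counts.get(out[-1])
        if not successors:
            break  # dead end: no observed continuation
        out.append(max(successors, key=successors.get))
    return " ".join(out)

# Hypothetical miniature "corpus" for demonstration only.
model = train_bigrams("the road goes on and the road goes on and on")
print(generate(model, "the", length=4))  # → the road goes on
```

A real model replaces the count table with billions of learned parameters and greedy selection with probabilistic sampling, which is what lets it produce novel rather than merely memorized continuations.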
