Abstract

GPT-2 is a state-of-the-art transfer-learning model for NLP tasks such as text generation, text summarization, and many other applications. The text generation application programming interface is backed by a large-scale unsupervised language model capable of generating paragraphs of text. This transformer-based language model, based on OpenAI's GPT-2, accepts a sentence or partial sentence as input and predicts the text that follows. GPT-2 uses the input text to set the initial context for further generation. The input can range from a few words up to a maximum sequence length of 1024 tokens; the longer the initial input, the more subject context is provided to the model, and longer inputs generally produce more coherent output. The model was trained specifically to predict the next word in a sentence. GPT-2 is a massive model that encodes a large amount of compressed knowledge drawn from many parts of the internet, and it can be used to estimate the likelihood of a sentence. The model learns an internal representation of the English language, which it can then use to extract features useful for downstream tasks. We use tkinter, a Python GUI toolkit, to display the generated output.

Keywords—Text generation, Graphical User Interface, GPT-2 transformer, OpenAI, tkinter
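As a minimal sketch of the pipeline the abstract describes (not the authors' exact implementation), GPT-2 text generation can be driven from a tkinter window using the Hugging Face transformers library; the "gpt2" checkpoint, generation parameters, and widget layout below are illustrative assumptions.

# Minimal sketch: GPT-2 text generation displayed in a tkinter window.
# Assumes the Hugging Face `transformers` package; the "gpt2" checkpoint,
# sampling parameters, and GUI layout are illustrative choices.
import tkinter as tk
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def generate_text():
    prompt = entry.get()
    # Encode the prompt; GPT-2's context window is at most 1024 tokens.
    input_ids = tokenizer.encode(prompt, return_tensors="pt")[:, :1024]
    output_ids = model.generate(
        input_ids,
        max_length=100,            # total length of prompt plus continuation
        do_sample=True,            # sample instead of greedy decoding
        top_k=50,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    output_box.delete("1.0", tk.END)
    output_box.insert(tk.END, text)

root = tk.Tk()
root.title("GPT-2 Text Generator")
entry = tk.Entry(root, width=80)
entry.pack(padx=10, pady=5)
tk.Button(root, text="Generate", command=generate_text).pack(pady=5)
output_box = tk.Text(root, height=15, width=80, wrap="word")
output_box.pack(padx=10, pady=5)
root.mainloop()

A longer prompt fills more of the 1024-token context and, as noted above, generally yields more coherent continuations.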

