#NewsBytesExplainer: GPT-3, the powerful AI that generates stories, poems, code

Sep 02, 2020
04:41 pm

What's the story

Back in May, OpenAI, the AI start-up co-founded by Elon Musk, announced an advanced language-processing algorithm called 'GPT-3'. The artificial intelligence model was released to select users last week and has since been making headlines for its mind-boggling text-generating capabilities. Now, the question is: what makes GPT-3 so special, and can it really produce useful content? Let's find out.

GPT

First, some background on the GPT model

Generative Pre-trained Transformer, or GPT, is an AI-based language model built on the transformer neural network architecture, which predicts the next word in a sequence to produce text from a given excerpt. The model was announced in 2018 and upgraded to GPT-2 a year later. That system was more sophisticated at predicting words and could write full paragraphs of content.

Risk

GPT-2's code was restricted over concerns of fake news

GPT-2 was so quick and effective at generating text that the team at OpenAI worried it could be used to spread fake news and hoaxes. So, instead of releasing the full code of the AI model, they offered the public a watered-down version of the algorithm, which could still power things like self-rewriting text games.

GPT-3

Now, GPT-3, the most powerful text generator, is here

Now, building on that work, OpenAI has released GPT-3 to select users through an API. The new model beats its predecessor and is the most powerful language model built to date. You feed it some text, it picks up the pattern in that text, and, depending on how the API is configured, it continues in the same style to write an entire article or poem.
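
To make "configuring the API" concrete, here is a minimal sketch of a completion request using the openai Python library as it shipped in 2020 (v0.x). The API key placeholder, prompt text, and parameter values are illustrative assumptions, not taken from the article.

import openai

openai.api_key = "YOUR_API_KEY"  # issued to users accepted into the beta

response = openai.Completion.create(
    engine="davinci",       # the largest GPT-3 model exposed by the API
    prompt="Once upon a midnight dreary, ",  # the text to be continued
    max_tokens=100,         # how much new text to generate
    temperature=0.7,        # higher values -> more varied, less predictable
)

print(response["choices"][0]["text"])  # the model's continuation of the prompt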

Training

What is GPT-3 capable of doing?

In the few days since GPT-3's release, a number of samples of text generated by the AI model have started circulating on the internet. If appropriately configured by a human, the model can generate some really good stuff, ranging from creative fiction and handy business memos to working JavaScript code and even news stories.

Twitter Post

Here's an example of GPT-3's prowess

Working

How does the AI model work so effectively?

GPT-3 achieves these mind-blowing results thanks to the trove of data fed into it. The algorithm was trained on effectively all of the text available on the internet, some 500 billion words (GPT-2, by comparison, was trained on roughly 40GB of text). From this ginormous dataset, it picks up statistical patterns in language that humans cannot easily see and uses them to predict what should come next in a given text.
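
For intuition, the core idea, predicting the next word from patterns seen in training text, can be sketched with a toy frequency model. This is only an analogy: GPT-3 uses a 175-billion-parameter transformer, not word counts, and the tiny corpus below is made up.

from collections import Counter, defaultdict

def train(corpus: str):
    """Count which word follows each word in the training text."""
    successors = defaultdict(Counter)
    words = corpus.split()
    for current, nxt in zip(words, words[1:]):
        successors[current][nxt] += 1
    return successors

def predict_next(successors, word: str) -> str:
    """Return the successor seen most often after `word` during training."""
    if word not in successors:
        return "<unknown>"
    return successors[word].most_common(1)[0][0]

model = train("the cat sat on the mat and the cat ate the fish")
print(predict_next(model, "the"))  # -> 'cat' (follows 'the' twice; others once)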

Repetition

Then, it repeats the process

Once it generates an output for a given text, the model can combine the generated and original content and treat the result as a new input to produce the next batch of text. This way, it can keep going, turning out long articles, poems, or news stories that hold together with proper punctuation and context.
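
A sketch of that feed-back loop is below. The generate function is a hypothetical stand-in for a GPT-3 API call, not a real library function.

def generate(prompt: str) -> str:
    """Placeholder: would return the model's continuation of `prompt`."""
    raise NotImplementedError("stand-in for an API call")

def write_long_piece(opening: str, rounds: int = 5) -> str:
    text = opening
    for _ in range(rounds):
        continuation = generate(text)  # model continues the running text
        text += continuation           # generated output joins the next input
    return text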

Caveats

But, there are some caveats too

While GPT-3 makes for a powerful text generator, it comes with certain caveats one should know about. First, it must be noted that the model generates its word-by-word output from patterns in the data it was fed; it has no understanding of its own of the text it takes in or puts out. It cannot grasp the meaning of an input or reason like a human.

Information

Kevin Lacker demonstrated how GPT-3 can be stumped

In a blog post, former Parse co-founder Kevin Lacker shared a Q&A session with GPT-3 to show that while the AI can answer general trivia questions, whose answers are available on the internet and, presumably, included in its training data, it fails to reject basic nonsensical queries.
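
The Q&A sessions in Lacker's post use a simple prompt format along these lines; the exact wording here is an illustrative reconstruction, and GPT-3 completes the final "A:" line.

prompt = """Q: Who was president of the United States in 1955?
A: Dwight D. Eisenhower was president of the United States in 1955.

Q: How many eyes does my foot have?
A:"""
# Instead of rejecting the nonsense question, GPT-3 tends to answer it
# in kind (e.g., "Your foot has two eyes.").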

Coherence

Coherence can also be a problem

Along with reasoning, the AI language model can also struggle with coherence. Unlike humans, who can keep writing with the same narrative and tone, GPT-3 can go off-track. Because it generates each word from the surrounding text alone, it may fail to maintain a consistent narrative or meaning across long passages; its sentences can even contradict each other.
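
One source of this drift is that the model attends to only a fixed window of recent text, about 2,048 tokens for GPT-3. A rough sketch of that constraint, using whitespace-split words as a crude stand-in for real tokens:

CONTEXT_LIMIT = 2048  # GPT-3's context window, measured in tokens

def visible_context(full_text: str) -> str:
    """Keep only the most recent chunk the model can actually 'see'."""
    words = full_text.split()  # crude stand-in for real tokenization
    return " ".join(words[-CONTEXT_LIMIT:])

# Anything written before this window falls out of view, so facts set up
# early in a long story can be contradicted later.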

Twitter Post

OpenAI CEO Sam Altman also acknowledged GPT-3's problems

Possibilities

Still, a tool of this kind is no minor feat

Despite the downsides, a text-generation tool of this kind can be put to a variety of uses by entrepreneurs and developers. It could lead us to a future where text generation is automated in most, if not all, cases. But do not worry (about your job) just yet, as the ability to create original and truly meaningful text still appears to be far away.