GPT-3: Language Barrier Between Human and AI Diminished?

Generative Pre-trained Transformer version 3. Or GPT-3. Yeah, I know, quite a mouthful, isn’t it? And what does that even mean? Let’s break it down.

In this article, we’re going to discuss some exciting topics: AI, natural language processing, the Turing Test, and the human brain’s capacity for language.

Background and History

It all started not so long ago. In 1950, the legendary mathematician and one of the major pioneers of computing, Alan Turing, proposed a method to determine whether a machine can exhibit human-like intelligence. The idea for this Turing Test, or Imitation Game as he called it, was relatively simple: if a machine can engage in conversation with a human without being detected as a machine, it has passed. Not a very fool-proof plan, perhaps. But given the insight it had to offer, the method still holds up in a lot of ways.

Fast forward six decades. Artificial intelligence has just started its ascent. Possibilities once dismissed as impossible by the standards of Turing’s day have begun to look achievable.

Natural Language Processing, or NLP, came into the spotlight over the last decade. For the first time in history, it became clear that, fed with enough data, a machine can learn to meddle in the linguistics department with something resembling a human thought process.

Alan Turing predicted that a machine would be able to pass his test by the year 2000.

He was close. In 2014, a chatbot named Eugene Goostman was accepted by some as the first to pass the Turing Test. That win, however, was hugely controversial, as many of the judging peers did not agree. The efforts didn’t stop, though. The technology kept evolving, and the criteria for passing a Turing Test kept getting harder and harder. You may have noticed that the images you have to click on to pass the “I am not a robot” CAPTCHA are getting more and more challenging, but that’s a different story for a different day.

GPT Comes into the Spotlight

In 2018, OpenAI, a company founded by Elon Musk, Sam Altman, and other visionaries, published its original paper on generative pre-training, describing deep learning methods for natural language processing. They showed how effective language models can be trained from large amounts of unlabeled text, with only light task-specific fine-tuning, greatly reducing the need for human supervision.

That was a landmark in language processing history, because the next year, in 2019, they released GPT-2: a language model with 1.5 billion machine learning parameters. Now, what do they mean by parameters here? Let’s explore that a little.

Parameter is essentially a synonym for weight. Weights are the values a learning algorithm learns through training. A neural network’s learning mechanism vaguely mimics the activity of a human brain, and in a human brain, information is passed and retained through synapses. These synapses, the connections between neurons, can be thought of as the brain’s parameters. A human brain has more than 100 trillion synapses. So even though today’s largest and most complex neural network models are nowhere near our brain, they have already started to show some amazingly impressive results.
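To make “parameter” concrete, here’s a minimal sketch in PyTorch that builds a tiny network and counts everything it would learn during training. The layer sizes are made up purely for illustration:

```python
import torch.nn as nn

# A tiny, made-up two-layer network (sizes chosen arbitrarily).
model = nn.Sequential(
    nn.Linear(768, 768),  # 768*768 weights + 768 biases
    nn.ReLU(),
    nn.Linear(768, 768),  # another 768*768 weights + 768 biases
)

# Every learnable value counts as one "parameter".
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")  # 1,181,184 -- GPT-2 has roughly 1,270x more
```

Coming back to GPT-2 now.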

With 1.5 billion parameters, GPT-2 turned out to be a pretty good language model and autocomplete tool. It can generate a whole article from just a few input sentences. You can check out a demo of GPT-2-based autocomplete here.
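If you’d rather try this locally than in a browser demo, GPT-2’s weights are public. Here’s a minimal sketch using the Hugging Face transformers library; the model name picks the small 124M-parameter checkpoint, and the generation settings are just illustrative defaults:

```python
from transformers import pipeline

# Download the small public GPT-2 checkpoint and set up text generation.
generator = pipeline("text-generation", model="gpt2")

# Give it a short input sentence and let it autocomplete the rest.
prompt = "Artificial intelligence has just started its ascent."
outputs = generator(prompt, max_length=60, num_return_sequences=1)
print(outputs[0]["generated_text"])
```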

And then along comes GPT-3

The world could guess what would come next, but how soon GPT-2’s successor arrived was quite astonishing. After a gap of just one year, OpenAI introduced GPT-3 in May 2020, and it was in beta testing as of July 2020. Within that gap, two big language models were released that surpassed GPT-2, the biggest being Microsoft’s Turing-NLG with 17 billion parameters.

However, GPT-3 is 10 times larger than that: its full version has 175 billion parameters. And the quality of text it can generate is so high that OpenAI’s researchers have been wary of releasing it openly, on account of it being potentially “too dangerous”. It is hard to distinguish between GPT-3-generated text and something written by a human, and this concern only became more evident when a college student used GPT-3 to write fake blog posts and ended up at the top of Hacker News.

Applications 

Generative Pre-trained Transformer, or GPT, is a series of methods used to train and develop a language model. A language model’s goal is to read any given text input, comprehend it, and produce the intended response. This includes text autocomplete, answering questions, holding a conversation, visualization, summarization, and more. From today’s tech perspective, GPT-3 seems to excel at all of these applications. There is a GPT-3 demo website that curates its tremendous capabilities. Here are some notable ones.
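For a sense of how such applications are built, here’s roughly what calling GPT-3 looked like through the 2020-era openai Python client. Treat the prompt and parameter values below as illustrative only; you’d need your own beta API key:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # beta access key from OpenAI

# Ask the largest GPT-3 engine, "davinci", to answer a question.
response = openai.Completion.create(
    engine="davinci",
    prompt="Q: What did Alan Turing propose in 1950?\nA:",
    max_tokens=64,
    temperature=0.3,  # low temperature keeps answers focused
    stop=["\n"],      # stop at the end of the answer line
)
print(response.choices[0].text.strip())
```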

Conversing with People

LearnFromAnyone

This particular app can supposedly answer users’ questions in the style of their preferred personality as a teacher. For example, if you had a question for Elon Musk about rockets, for Einstein about the theory of relativity, or for Aristotle about philosophy, the AI would generate a response by examining the said persona’s characteristics and answering accordingly.
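LearnFromAnyone’s actual prompts aren’t public, so this is only a guess at the general technique: frame the request as a dialogue with the chosen persona and let the model continue it. A hypothetical sketch, reusing the beta Completion API from above:

```python
import openai

def ask_persona(persona: str, question: str) -> str:
    # Frame the question as a dialogue with the chosen persona;
    # GPT-3 continues the pattern in that persona's voice.
    prompt = (
        f"The following is a conversation with {persona}.\n"
        f"Student: {question}\n"
        f"{persona}:"
    )
    response = openai.Completion.create(
        engine="davinci",
        prompt=prompt,
        max_tokens=100,
        temperature=0.7,
        stop=["Student:"],  # keep the model from writing the next question
    )
    return response.choices[0].text.strip()

print(ask_persona("Albert Einstein", "What is the theory of relativity?"))
```

The stop sequence is the only real trick here: it keeps the model from inventing the student’s next question. Everything else is ordinary prompt design.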

AI|Writer

Almost the same as the previous one. It’s an experiment to sustain hypothetical conversations between users and famous personalities, both real and fictitious.

AI|Writer tries to make a best-guess response based upon what is generally known about that person, but it obviously should not be considered a reliable source nor an accurate reflection of their actual views and opinions. What’s interesting, though, is how good it is at mimicking their personalities.

Replika – The AI Companion

Replika is the most impressive AI chatbot I’ve come across so far. After you sign up and start conversing, the AI’s responses get more and more sophisticated, as if it’s befriending you. Replika currently A/B tests multiple natural language models to find the responses that get the most engagement, and GPT-3, being the most advanced in that race, is definitely among its recent methods.

Writing

As previously mentioned, GPT-3 is shockingly good at generating text from relatively little input as dictation. The Guardian recently published an article written entirely by GPT-3, although numerous claims suggest that the editors tweaked it quite a lot. Still, the text it can generate is often scarily human-like. In fact, there’s a whole website dedicated to GPT-3 Generated Fiction.

Designing and Coding

The immense capability of GPT-3 finally gives new meaning to visualization from simple text. Entire website or application designs now seem possible from just a few instructive sentences.

Figma Designer Tool

Figma is a prototyping tool for designing the layouts and contents of applications and the like. Designers usually rely on drag-and-drop or hand-coded options for designing. Recently, with the help of GPT-3, a plugin called “Designer” was introduced; a tweet shared by Jordan Singer showed how he was able to generate a functional prototype from raw text.

Debuild.Co

A web application builder that almost “works like magic”. You just describe what you want your app to do in plain English, and within a very short time, it generates the app along with its source code. It might ease the pain of coding for many.

There are a ton of other possibilities for this marvelous piece of technology, some being explored and some yet to be explored. Some have even referred to it as “the biggest thing to happen since bitcoin!”

At Nascenia, we are enthusiastic about this whole natural language processing era. In fact, we’ve had a small share in this domain ourselves, having built two chatbots: one for biyeta.com called biyebot, and another called NSDA for A2i. We welcome you to read about them.
