A new system called GPT-3 is shocking experts with its ability to use and understand language just like humans
Among industry insiders, the murmur has become a roar: the world changed this summer with the launch of an artificial intelligence system known as GPT-3. Its ability to interact in English and generate coherent writing has amazed even the most seasoned experts, to the point that people speak of a "GPT-3 shock".
Where typical AI systems are trained for specific tasks (e.g. classifying images), GPT-3 manages tasks for which it has never been specifically trained.
Research published by its creator, OpenAI, found that GPT-3 can understand text and generate news articles that readers have a hard time distinguishing from those written by people.
The incredible capabilities of GPT-3
The AI can carry out tasks, as noted, that its creators never anticipated. Beta testers in recent weeks have found that it can complete a half-written report, produce stories and letters in the style of famous people, generate business ideas, and even write some types of software code from a simple description of the desired software.
OpenAI has announced that after the trial period, GPT-3 will be released as a commercial product.
Its name, GPT-3, stands for Generative Pre-trained Transformer, third generation. Like other AI systems today, GPT-3 relies on a large collection of numerical parameters that determine how it behaves. Broadly speaking, the more parameters there are, the more capable the artificial intelligence is.
GPT-3 has 175 billion parameters, more than 100 times that of its predecessor, GPT-2 (1.5 billion), and 10 times that of its closest rival, Microsoft's Turing-NLG (17 billion).
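As a quick sanity check on those ratios, here is a minimal Python calculation using the publicly reported parameter counts (GPT-2's 1.5 billion and Turing-NLG's 17 billion):

```python
# Publicly reported parameter counts
gpt3 = 175_000_000_000        # GPT-3
gpt2 = 1_500_000_000          # GPT-2, its predecessor
turing_nlg = 17_000_000_000   # Microsoft's Turing-NLG

print(gpt3 / gpt2)        # ~116.7, i.e. "more than 100 times"
print(gpt3 / turing_nlg)  # ~10.3,  i.e. roughly "10 times"
```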
Road tests
To give a practical example, here is what a beta tester reports from the site Simplify.so (which lets you try this artificial intelligence): you paste in an English text and ask GPT-3 to simplify it.
Someone copied and pasted the first paragraph of George Washington's 1796 farewell address: "The period for a new election of a citizen to administer the executive government of the United States being not far distant, and the time actually arrived when your thoughts must be employed in designating the person who is to be clothed with that important trust, it appears to me proper, especially as it may conduce to a more distinct expression of the public voice, that I should now apprise you of the resolution I have formed, to decline being considered among the number of those out of whom a choice is to be made."
GPT-3 provided its translation: "I have no intention of running for president." It doesn't get more concise than that.
When the famous opening of Jane Austen's "Pride and Prejudice" was entered: "It is a truth universally acknowledged that a single man in possession of a good fortune must be in want of a wife", the artificial intelligence truly surprised.
In the first four attempts it gave answers that were not entirely on target (e.g., "A man with a lot of money must look for a wife."). On the fifth attempt came the surprise: "It is a truth universally acknowledged, that a single man with a good fortune must be desirous of a wife, for men are very vain and want to be seen as rich, and women are very greedy and want to be seen as beautiful."
Is GPT-3 somehow "human"?
The remarkable behavior GPT-3 displays creates a real temptation to anthropomorphize it, but we shouldn't. It is a statistical model: it has no mental states and does not reason as it acts. It is not a general artificial intelligence like HAL 9000 or Tony Stark's JARVIS.
Shreya Shankar, a machine learning engineer at artificial intelligence company Viaduct, says more advanced users can teach the system to perform new tasks by presenting it with examples. From there, the AI generalizes what the task is. For example, when she wanted GPT-3 to translate equations from English into mathematical symbols, she started by providing it with some equations in English and their equivalents written in symbols.
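The "teaching by example" technique Shankar describes amounts to building a prompt that lists a few input/output pairs before the new query. Here is a minimal sketch in Python; the example equations and the "English:"/"Symbols:" labels are hypothetical, chosen only to illustrate the pattern, not taken from her actual session:

```python
# Hypothetical English-to-symbols example pairs shown to the model
examples = [
    ("the sum of x and y", "x + y"),
    ("x squared minus one", "x^2 - 1"),
    ("the square root of z", "sqrt(z)"),
]

def build_prompt(query: str) -> str:
    """Concatenate the example pairs, then the new query with a
    trailing 'Symbols:' label for the model to complete."""
    lines = [f"English: {eng}\nSymbols: {sym}" for eng, sym in examples]
    lines.append(f"English: {query}\nSymbols:")
    return "\n\n".join(lines)

print(build_prompt("two times x plus three"))
```

The model, given this text, tends to continue the pattern and fill in the symbolic form of the final line.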
What changes can GPT-3 bring? Of all sorts
If progress continues at this pace, there is a good chance GPT-3 will bring major changes to our working lives.
For a number of professionals (journalists, lawyers, programmers and others) the introduction of systems like GPT-3 will likely shift their activities from production to review. On the plus side, the end of writer's block: it will be quite simple to keep clicking GPT-3's "generate" button until something good appears that just needs refining (and enriching with human creativity, of course).
Risks are also around the corner
As with GPS navigation, which began as a convenience but weakened our ability to orient ourselves, AI language generators may start out saving us work but could eventually erode the underlying skill.
And that's just the beginning: AI language models are likely to get even stronger. Creating a more powerful rival to GPT-3 is within the reach of other tech companies. The methods behind machine learning are widely known, and the data OpenAI used for training is publicly available.
As GPT-3 has shown the potential of very large models, its 175 billion parameters may soon be surpassed.
Google researchers announced in June that they had built a 600 billion-parameter model for language translation, and Microsoft's researchers said they have their eyes on trillion-parameter models, though not necessarily for language.