A new system called GPT-3 is shocking experts with its ability to use and understand language just like humans
Among insiders, the buzz has turned to thunder: the world changed this summer with the launch of an artificial intelligence system known as GPT-3. Its ability to interact in English and generate coherent writing has astounded even the most experienced observers, to the point that some speak of a "GPT-3 shock".
Where typical AI systems are trained for specific tasks (e.g. classifying images), GPT-3 manages tasks for which it has never been specifically trained.
Research published by its creator, OpenAI, found that GPT-3 can understand text and generate news articles that readers have a hard time distinguishing from those written by people.
The incredible capabilities of GPT-3
As mentioned, the AI can perform tasks that its creators never thought of. Beta testers in recent weeks have found that it can complete a half-written report, produce stories and letters in the style of famous people, generate business ideas, and even write some kinds of software code from a simple description of the desired software.
OpenAI has announced that after the trial period, GPT-3 will be released as a commercial product.
Its name, GPT-3, stands for Generative Pre-trained Transformer, third generation. Like other AI systems today, GPT-3 relies on a large collection of parameters that determine its operation: the more parameters, the more capable the AI.
GPT-3 has 175 billion parameters, more than 100 times as many as its predecessor, GPT-2, and 10 times as many as its closest rival, Microsoft's Turing NLG.
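Those ratios can be checked with simple arithmetic, taking the publicly reported counts (GPT-2 at 1.5 billion parameters, Turing NLG at 17 billion):

```python
# Publicly reported parameter counts, in billions
gpt3_params = 175
gpt2_params = 1.5        # GPT-2, the predecessor
turing_nlg_params = 17   # Microsoft's Turing NLG

ratio_gpt2 = gpt3_params / gpt2_params          # ~116.7 -> "more than 100 times"
ratio_turing = gpt3_params / turing_nlg_params  # ~10.3  -> "10 times"
print(round(ratio_gpt2, 1), round(ratio_turing, 1))
```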
For a practical example, here is what a beta tester of the site Simplify.so (which lets you try this artificial intelligence out) reports. You insert a text in English and ask GPT-3 to simplify it.
Someone copied and pasted the first paragraph of George Washington's 1796 Farewell Address: "The period for a new election of a citizen to administer the executive government of the United States being not far distant, and the time actually arrived when your thoughts must be employed in designating the person who is to be clothed with that important trust, it appears to me proper, especially as it may conduce to a more distinct expression of the public voice, that I should now apprise you of the resolution I have formed, to decline being considered among the number of those out of whom a choice is to be made".
GPT-3's simplification: "I'm not going to run for president". It doesn't get more concise than that.
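Simplify.so's internals aren't public, but tools like it typically just wrap the user's text in an instruction prompt before sending it to the model. A minimal sketch of how such a wrapper might build its request (the prompt wording and function name are illustrative, not Simplify.so's actual code):

```python
def build_simplify_prompt(text: str) -> str:
    """Wrap a passage in a plain-language instruction, the kind of prompt
    a tool like Simplify.so might send to GPT-3, whose completion is then
    shown to the user as the "simplified" text."""
    return (
        "Rewrite the following passage in simple, modern English.\n\n"
        f"Passage: {text}\n"
        "Simplified:"
    )

prompt = build_simplify_prompt(
    "The period for a new election of a citizen to administer the "
    "executive government of the United States being not far distant..."
)
print(prompt)
```

The model's reply to such a prompt is what the site displays as the simplified version.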
When the famous opening of Jane Austen's "Pride and Prejudice" was entered: "It is a truth universally acknowledged, that a single man in possession of a good fortune, must be in want of a wife", the artificial intelligence truly surprised.
On the first four attempts it gave answers that were not entirely on the mark (for example, "A man with a lot of money must look for a wife."). On the fifth attempt, the flash of brilliance: "It is a universally acknowledged truth that a single man with a good fortune must be in want of a wife, because men are very vain and want to be seen as rich, and women are very greedy and want to be seen as beautiful".
Is GPT-3 somehow "human"?
The incredible range of activity GPT-3 exhibits makes the temptation to anthropomorphize concrete, but we shouldn't. It is a statistical model that has no mental states and does not engage in reasoning as it acts. It is not a general AI like HAL 9000 or Tony Stark's JARVIS.
Shreya Shankar, a machine learning engineer at artificial intelligence company Viaduct, says more advanced users can teach the system new tasks by presenting it with examples. From there, the AI generalizes what the task is. For example, when she wanted GPT-3 to translate equations from English into mathematical symbols, she started by providing it with a few equations in English alongside their equivalents written in symbols.
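Teaching by example works by stacking a handful of input/output pairs into the prompt and letting the model continue the pattern, a technique now known as few-shot prompting. A rough sketch for the English-to-symbols task (the example pairs below are illustrative, not the ones Shankar used):

```python
# Few-shot prompt: a handful of English -> symbol pairs, then the new
# query; the model is expected to continue the pattern it has seen.
examples = [
    ("x squared plus y squared equals one", "x^2 + y^2 = 1"),
    ("the sum of a and b is ten", "a + b = 10"),
    ("half of n minus three equals seven", "n/2 - 3 = 7"),
]

prompt = ""
for english, symbols in examples:
    prompt += f"English: {english}\nSymbols: {symbols}\n\n"

# The unanswered query goes last; the model's completion is the answer.
prompt += "English: two times m plus five equals nine\nSymbols:"
print(prompt)
```

No weights are updated: the "teaching" lives entirely in the prompt, which is why ordinary users can do it.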
What changes can GPT-3 bring? Of all sorts
If things keep moving at this pace, there is a good chance GPT-3 will bring major changes to our working lives.
For a number of professionals (journalists, lawyers, programmers, and others) the introduction of systems like GPT-3 will likely shift their work from production to review. On the plus side: the end of writer's block. It will be easy enough to keep clicking GPT-3's "generate" button until something good appears that just needs to be refined, and enriched with that specifically human ingredient, creativity.
Risks are also around the corner
As with GPS navigation, which started out as a tool but reduced our ability to orient ourselves, AI language generators may start by saving us the work of writing, but they could end up relieving us of the thinking as well.
And that's just the beginning: AI language models are likely to get even stronger. Building a rival more powerful than GPT-3 is within reach of other tech companies. The methods behind machine learning are widely known, and the data OpenAI used for training is publicly available.
As GPT-3 has shown the potential of very large models, its 175 billion parameters may soon be surpassed.
Google researchers announced in June that they had built a 600 billion-parameter model for language translation, and Microsoft's researchers said they have their eyes on trillion-parameter models, though not necessarily for language.