If the emergence of generative AI has left you dismayed, know that we haven't seen anything yet. The genie is out of the bottle, and it will be hard to put it back in. Quite the opposite: the race is wilder than ever, and six projects are vying to create AI models that exceed 2 trillion parameters. Yes, you read that correctly: trillion.
The titans of AI with 2 trillion parameters
The six major projects competing for the 2-trillion-parameter milestone are OpenAI, Anthropic, Google/DeepMind, Meta, a British government project, and one that is still secret. And it is a brutal race: keeping up takes money. A lot of it. Between 1 and 2 billion dollars a year to constantly upgrade the hardware (ever more voracious for compute and energy), hire hundreds of specialists, and retain the best team members with seven-figure salaries and stock options.
GPT-5: The Return of the King

After taking half the world by surprise, firing a shot across Google's bow, and cashing in a fortune from Microsoft, OpenAI already has a possible knockout blow in the works, one that could secure supremacy for Sam Altman's company. GPT-5 is expected to be completed by the end of 2023 and released in early 2024, with a parameter count between 2 and 5 trillion.
At the moment, we cannot even imagine its capabilities.
Claude-Next: Anthropic and its ambitious project

Anthropic, the team founded by former OpenAI employees, is working on a model called Claude-Next, which aims to be 10 times more powerful than today's AIs. With $1 billion in funding already raised and $5 billion more being sought, Anthropic expects to meet its goals over the next 18 months.
Their flagship model will require on the order of 10^25 FLOPs of training compute, using clusters of tens of thousands of GPUs. Google is one of Anthropic's backers, playing on multiple tables at once.
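To put that compute budget in perspective, here is a minimal back-of-envelope sketch in Python. The GPU model, its peak throughput, and the utilization figure are illustrative assumptions (A100-class hardware at 40% utilization), not details Anthropic has disclosed:

```python
# Back-of-envelope: wall-clock time for a 1e25-FLOP training run.
# The figures below are illustrative assumptions, not reported numbers.

TOTAL_FLOPS = 1e25           # training budget quoted for the model
NUM_GPUS = 10_000            # "tens of thousands of GPUs" (lower bound)
PEAK_FLOPS_PER_GPU = 312e12  # NVIDIA A100 BF16 dense peak (assumption)
UTILIZATION = 0.40           # realistic hardware utilization (assumption)

effective_throughput = NUM_GPUS * PEAK_FLOPS_PER_GPU * UTILIZATION
seconds = TOTAL_FLOPS / effective_throughput
print(f"~{seconds / 86_400:.0f} days of continuous training")
# -> ~93 days
```

Even under these generous assumptions, a single run of that size monopolizes the entire cluster for roughly three months.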
Gemini: Google seeks redemption with DeepMind

Google and DeepMind are collaborating to develop a competitor to GPT-4 called Gemini. The project began recently, after Bard proved unable to compete with ChatGPT. Gemini will be a large language model with trillions of parameters, similar to GPT-4 or GPT-5, and will be trained on tens of thousands of Google's TPU chips. It is not yet known whether it will be multimodal.
DeepMind has also developed Sparrow, an internet-connected chatbot optimized for safety and similar to ChatGPT. DeepMind researchers found that Sparrow's citations are helpful and accurate 78% of the time. Another top DeepMind model is Chinchilla, a 70-billion-parameter model trained on 1.4 trillion tokens.
The parameters of an unthinkable future
If you want an idea of what 2 trillion parameters means, consider that the total amount of usable text data in the world is estimated at between 4.6 trillion and 17.2 trillion tokens.
All books, scientific articles, news stories, the entirety of Wikipedia, publicly available code, and much of the rest of the internet, filtered for quality. The sum of digitized human knowledge.
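To see why that comparison matters, the Chinchilla scaling heuristic (roughly 20 training tokens per parameter for compute-optimal training, per Hoffmann et al., 2022) gives a rough sense of the data appetite of a 2-trillion-parameter model. The sketch below is a simplification; real training runs need not follow the compute-optimal recipe:

```python
# Chinchilla rule of thumb: ~20 training tokens per parameter
# for compute-optimal training (Hoffmann et al., 2022).

PARAMS = 2e12                    # a 2-trillion-parameter model
TOKENS_PER_PARAM = 20            # Chinchilla heuristic
USABLE_TEXT = (4.6e12, 17.2e12)  # estimated usable tokens worldwide

tokens_needed = PARAMS * TOKENS_PER_PARAM
print(f"Tokens needed: {tokens_needed:.1e}")  # -> 4.0e+13
low, high = USABLE_TEXT
print(f"Even the high estimate covers {high / tokens_needed:.0%}")
# -> 43%: compute-optimal training at this scale would exhaust
#    every filtered token of text humanity has digitized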
As larger models arrive, new capabilities will emerge. Over the next six years, improvements in computing power and algorithms will scale models a thousandfold, and likely far more.
Nvidia CEO Jensen Huang has predicted AI models a million times more powerful than ChatGPT within 10 years.
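Taken at face value, those forecasts imply surprisingly similar year-over-year growth. A quick compound-growth calculation, assuming steady annual scaling (a simplification, not a roadmap):

```python
# Implied annual growth for "1,000x in 6 years" and "1,000,000x in 10 years",
# assuming a constant compound growth rate.

for factor, years in [(1_000, 6), (1_000_000, 10)]:
    annual = factor ** (1 / years)
    print(f"{factor:>9,}x over {years} years = {annual:.1f}x per year")
# ->     1,000x over 6 years = 3.2x per year
# -> 1,000,000x over 10 years = 4.0x per year
```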
Can you imagine what these artificial intelligences will do with us, and what they will do to the planet? Think it over quickly. We are already there.