The energy consumed by Google's artificial intelligence could reach levels comparable to that of the whole of Ireland. You read it in the headline, and I'll repeat it here: the figures speak for themselves. As AI evolves and expands, the industry faces a prohibitive energy challenge.
As technology companies continue to develop and integrate AI into an ever wider range of services, from simple search engines to the most complex applications, the associated energy consumption will become an increasingly crucial issue. The question is: how do we get out of this?
The rise of custom AI chips
The trend is clear and growing: AI companies are developing their own chips to meet increasingly demanding workloads. Giants like Google and Amazon already have their own custom AI chips. And they are not the only ones: persistent rumors suggest that Microsoft may unveil its own AI chip next month.
Microsoft itself has also invested heavily in OpenAI which, according to some sources, is in the early stages of developing its own chips, or is considering acquiring a semiconductor company to make them.
But what does all this mean for our planet? It means a significant increase in the energy footprint of the AI industry.
The energy footprint of the AI industry
I'll put it simply: if generative AI is integrated into every Google search, the company's energy demand will reach incredible heights. In an article published in Joule (I link it here), researchers estimate that integrating a chatbot similar to ChatGPT into every Google search would require as many as 512,820 NVIDIA A100 HGX servers. Translated into numbers? That means over 4 million GPUs. Do the math: with an energy demand of 6.5 kW per server, the daily electricity consumption would be 80 GWh, and the annual consumption 29.2 TWh. That is comparable to the consumption of an entire nation like Ireland.
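The arithmetic above is easy to verify yourself. Here is a back-of-the-envelope sketch, using only the figures reported from the Joule article (512,820 servers, 8 GPUs per HGX server, 6.5 kW per server); the variable names are mine:

```python
# Back-of-the-envelope check of the Joule estimate for AI-powered Google search.
# All input figures come from the article; nothing else is assumed.

servers = 512_820          # NVIDIA A100 HGX servers
gpus_per_server = 8        # an A100 HGX server hosts 8 GPUs
power_per_server_kw = 6.5  # power draw per server, in kilowatts

total_gpus = servers * gpus_per_server            # over 4 million GPUs
daily_gwh = servers * power_per_server_kw * 24 / 1e6   # kWh -> GWh
annual_twh = daily_gwh * 365 / 1e3                     # GWh -> TWh

print(f"GPUs: {total_gpus:,}")          # ~4.1 million
print(f"Daily:  {daily_gwh:.1f} GWh")   # ~80 GWh
print(f"Annual: {annual_twh:.1f} TWh")  # ~29.2 TWh
```

Running it confirms the figures in the text: about 4.1 million GPUs, roughly 80 GWh per day, and about 29.2 TWh per year.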
When AI "drinks" more
AI tools have an initial training phase followed by an inference phase. While the training phase is the most energy intensive and has been the focus of AI sustainability research thus far, the inference phase is when these tools generate output based on the data they were trained on.
This phase, often overlooked, deserves absolute attention, because it will grow dramatically and will end up outweighing the previous one. Estimates of the "energy hunger" of the various artificial intelligence systems need to be revised accordingly.
We cannot afford to ignore the energy consumed by these systems. Technological progress and environmental responsibility must be balanced: only in this way can we give technology a real chance to improve our future.