Fears about the exponential growth of artificial intelligence systems are rapidly reaching fever pitch, to the point that one industry expert believes the only way to stop it is to bomb data centers. This is the shock proposal of Eliezer Yudkowsky, a researcher in the field of machine learning, in an op-ed for Time magazine.
Who is Eliezer Yudkowsky?

Yudkowsky has spent over twenty years studying Artificial General Intelligence (AGI) and warning of its possible dire consequences. This is why he goes far beyond the alarm raised by figures such as Elon Musk, Steve Wozniak and Andrew Yang in the open letter from the Future of Life Institute, which calls for a six-month pause in AI development. Yudkowsky argues that this is not enough.
The expert says that the only way to avoid a catastrophe is to radically halt AI. His idea? "Shut down all large GPU clusters (the computer farms where the most powerful artificial intelligences are trained) and impose a limit on the computing power that can be used for AI training, reducing it over time. No exceptions, not even for governments and military entities."
What if someone were to break these rules? Yudkowsky has no doubts: "Be ready to destroy a rogue data center with an airstrike."

Well-founded fears or psychosis?
The tone of Yudkowsky's concerns perplexes me: while I agree that ethics and caution are needed in approaching these technologies, I find his alarm hysterical.
Yudkowsky says he fears that his daughter Nina won't live to adulthood if ever more intelligent AIs continue to be developed.
He therefore invites all those who share these concerns to take a firm stand, because otherwise their children could also be in danger.
"We'll All Die"
"The most likely outcome of building a superhuman artificial intelligence, in anything remotely like current circumstances, is that literally everyone on Earth will die."
I have no difficulty calling him a catastrophist. Still, when someone who has truly dedicated his life to studying the dangers of a dystopian AI future says we are getting close to the very thing he warned about, his view may be worth hearing.
We must not give in to rejection, however. We must support progress: as mentioned, with ethics and caution. No fear and, above all, no obscurantism.