Fears about the exponential growth of AI systems are quickly reaching fever pitch, so much so that one industry expert believes the only way to stop it is to bomb data centers. This is the shock proposal from Eliezer Yudkowsky, an artificial intelligence researcher, in an editorial for Time magazine.
Who is Eliezer Yudkowsky?
Yudkowsky has spent over two decades studying Artificial General Intelligence (AGI) and warning of its possible disastrous consequences. He goes far beyond the alarm raised by people like Elon Musk, Steve Wozniak and Andrew Yang in the Future of Life Institute's open letter, which calls for a six-month pause in AI development. Yudkowsky argues that this is not enough.
The expert says that the only way to avoid a catastrophe is to radically block AI. His idea? "Shut down all the large GPU clusters (the computer farms where the most powerful artificial intelligences are trained) and impose a limit on the computing power that can be used for training AI, reducing it over time. No exceptions, not even for governments and military entities."
What if someone were to break these rules? Yudkowsky has no doubts: "Be ready to destroy a rogue data center with an air strike."
Well-founded fears or psychosis?
The tone of Yudkowsky's concerns puzzles me: as much as I agree that ethics and care are needed in steering these technologies, I find his alarmism close to hysteria.
Yudkowsky says he fears his daughter Nina will not survive to adulthood if ever more intelligent AIs continue to be developed.
He therefore invites all who share these concerns to take a firm stand, because otherwise their children could be in danger too.
“We will all die”
In his words: "The most likely outcome of building a superhuman artificial intelligence, in anything remotely like current circumstances, is that literally everyone on Earth will die."
I have no difficulty calling him a catastrophist. Of course, when someone who has genuinely dedicated his life to studying the dangers of a dystopian AI future says we are getting closer to the very thing he warned about, his opinion may be worth listening to.
We must not give in to rejection, however. We must support progress: with, as mentioned, ethics and care. No fear and, above all, no obscurantism.