According to the latest UN report on the Libyan civil war, the year now ending saw, for the first time, an autonomous weapons system, a killer robot, kill a human being. In the future we may remember it as the starting point of a new arms race that could hurt us, very badly.
In recent days (from 13 to 17 December) the UN Convention on Certain Conventional Weapons met on the subject, as it does every five years. Since 1983 this assembly has chosen to limit certain cruel weapons, such as anti-personnel mines. This time it discussed a possible ban on autonomous weapons. And it did not reach a consensus to limit them.

It may have been a catastrophic mistake.
Autonomous weapons systems are killer robots. Throughout this post you will always hear me call them that: killer robots. Because that is what they are: weapons systems that can operate on their own. Systems made for war, to kill people. Governments around the world are investing heavily in this trend.
On the other side, obviously, humanitarian organizations (one of them, Stop The Killer Robots, you can find here) are dedicated to demanding rules and bans on the development of these weapons. And rightly so. If we don't limit the use of autonomous weapons, it will end very badly. Disruptive technologies in this field could destabilize every military strategy, increasing the risk of pre-emptive attacks. Including chemical, biological and nuclear ones.
Given the pace at which killer robots are developing, the UN meeting that has just ended may have been one of the last chances to avoid a new arms race. Or the very last.
There are four dangers that should be obvious to anyone when it comes to killer robotic systems.
First problem of killer robots: identification.
Will autonomous weapons ALWAYS be able to distinguish between a hostile soldier and a child with a toy gun? The difference between a single human error (which is also possible) and the incorrect setting of an algorithm could take the problem to an incredible scale. The autonomous weapons expert Paul Scharre uses a metaphor: a faulty machine gun that keeps firing even after you take your finger off the trigger, and keeps firing until the ammunition runs out, because it is just a machine. It doesn't know it is making a mistake.
Whatever anyone says, artificial intelligence does not yet have a morality of its own (and it probably never will; it will never be able to learn one).
The problem isn't just that an AI can be wrong, as when it identifies asthma as a factor that reduces the risk of pneumonia, or labels people of color as gorillas. It's that when it makes a mistake, its creators don't know why it made it, and don't know how to correct it. For this reason I believe it is impossible for killer robots to be developed with any kind of "moral" criterion.
Second problem with killer robots: low-end proliferation.
The armed forces developing autonomous weapons assume they will be able to contain and control them. Have you ever heard a stupider idea? If there is one thing the history of weapons technology teaches us, one thing only, it is that weapons spread. In this case too, it was entirely predictable.
What happened in the past with the Kalashnikov, an assault rifle that became so accessible it ended up in anyone's hands, can happen again with killer robots. Market pressures could lead to autonomous weapons that are effective, cheap and virtually impossible to stop. Above all: widespread. In the hands of governments, rogue actors, organized crime or terrorist groups.
It may have already happened. The Kargu-2, made by a Turkish defense contractor, is a cross between a drone and a bomb, with artificial intelligence to find and track targets. It is a killer robot, and it has already acted autonomously in the theater of the Libyan civil war to attack people.
Third problem of killer robots: high-end proliferation.
If we then consider the "high-end" risks, we reach the zenith. Nations could compete to develop ever more devastating versions of autonomous weapons, including ones capable of mounting chemical, biological, radiological and nuclear arms. The moral dangers of escalating weapon lethality would be amplified by escalating weapon use.
Sure, these killer robots will probably come with expensive ethical controllers designed to minimize collateral damage, chasing the myth of the "surgical" strike. Good only for public opinion, in short. The truth is that autonomous weapons will alter even the most banal cost-benefit analysis done before planning a war. They will become dice in a deadly game, thrown with less and less worry.
Asymmetric wars, fought on the ground by nations lacking competing technologies, will become more common. The result: enormously widespread instability.
Fourth and final problem: the laws of war.
Killer robots (whose proliferation is a certainty) will undermine humanity's last palliative against war crimes and atrocities: the international laws of war. These laws, codified in treaties starting with the first Geneva Convention, are the fine line that separates war from something even worse, which I can hardly imagine.
The laws of war are fundamental because they also impose responsibility on those who wage war. Slobodan Milosevic was the president of a country, and he had to answer for his actions: he was tried before the United Nations International Criminal Tribunal for the former Yugoslavia.
And now? If a killer robot commits war crimes, who is to blame? Who gets prosecuted? The weapon? The soldier? The soldier's commanders? The company that made the weapon? NGOs and experts in international law fear that autonomous weapons will lead to a serious liability gap.
To convict a soldier of a crime committed with an autonomous weapon, it would have to be proven that the soldier committed a guilty act with the specific intent to do so. A rather complicated thing to do, in a world of killer robots.
A world of killer robots is a world without rules that impose meaningful human control over weapons. It is a world where war crimes will be committed with no war criminals to hold accountable. The structure of the laws of war, together with their deterrent value, will be considerably weakened.
A new global arms race
Imagine that anyone can use all the force they want, whenever they want, with fewer consequences for anyone. Imagine a planet where national and international militaries, insurgent groups and terrorists can deploy theoretically unlimited lethal force at theoretically zero risk, at times and places of their choosing, with no resulting legal liability.