Artificial intelligence (AI) will transform our lives. It will affect almost every aspect of society: business, education, transport, medicine, even politics. In most of these areas it will be a good thing, removing drudgery and improving productivity. But there is one place where I fear its arrival, and that is the military.
(Read Sherry Turkle’s Britannica essay on robots and humanity.)
The world will be a worse place if, in 20 years’ time, militaries are using lethal autonomous weapons systems (LAWS) because there are no laws about LAWS. The media like to call them “killer robots.” The problem with calling them “killer robots” is that it conjures up an image of the Terminator. But it is not the Terminator that worries me, or thousands of my colleagues working in AI. It is much simpler technologies that are, at best, a decade or so away. Take an existing Predator drone and replace the human pilot with a computer: this is technically possible today.
The attraction of such technologies is obvious. The weakest link in a drone is the radio link back to base. Drones have been sabotaged by jamming their radio link. Have the drone fly, track, and target by itself, and you have the perfect weapon from a technological perspective. It will never sleep. It will keep watch 24/7. It will have superhuman accuracy and reflexes.
There are, however, many reasons why this would be a terrible development for warfare. It would be a revolution in war. The first revolution in warfare was the invention of gunpowder. The second was the invention of nuclear weapons. This would be the third. Each was a step change in the speed and efficiency with which we could kill our opponents.
These will be weapons of mass destruction. Previously, if you wanted to do harm, you needed an army of soldiers to wage war. Now you would need just a single programmer. Like every other weapon of mass destruction before them, such as chemical, biological, and nuclear weapons, we will need to ban them.
These will be weapons of terror. They will fall into the hands of terrorists and rogue states who will have no qualms about turning them on civilian populations. They will be an ideal weapon with which to suppress a civilian population. Unlike humans, they will not hesitate to commit atrocities, even genocide.
They will not be more ethical than human soldiers. We do not know today how to build autonomous weapons that will follow international humanitarian law, and we know of no computer system that cannot be hacked. And there are plenty of bad actors who will override whatever safeguards might be put in place.
These weapons will destabilize an already fragile geopolitical order. It will take only a modest bank balance to field a powerful army. They will lower the barriers to war. We could even see “flash” wars when opposing robots get caught in unexpected feedback loops.
They will be the Kalashnikovs of the future. Unlike nuclear weapons, they will be cheap and easy to produce. That does not mean they cannot be banned. Chemical weapons are cheap and easy to produce, yet they have been banned. Nor do we need to develop autonomous weapons as a deterrent against those who might ignore a ban; we do not develop chemical weapons to deter those who might occasionally use chemical weapons. We already have plenty of deterrents, military, economic, and diplomatic, to use against those who choose to ignore international treaties.
Above all, there is a deep moral argument that we give up an essential part of our humanity if we hand over to machines the decision of whether someone lives or dies.
Let us not go down this road.
This essay was originally published in 2018 in Encyclopædia Britannica Anniversary Edition: 250 Years of Excellence (1768–2018).