Killer robots’ liability issues
Last month, the United Nations hosted an informal meeting of experts in Geneva to talk about killer robots. The fact that we need an international meeting to talk about killer robots should scare the Asimov out of us, and the TV networks and papers pointed out it was happening, then promptly lost interest. Too bad, because diplomats, lawyers and various Professor Frinks who could likely make your Roomba hurt you all got together to discuss important stuff about Lethal Autonomous Weapons Systems (LAWS).
Pick your favorite sci-fi book or movie premise for a scenario. What if M.O.N.T.E. malfunctions and goes berserk? (You know, The Big Bang Theory’s killer robot, the Mobile Omnidirectional Neutralization and Termination Eradicator.) What if he goes on a killing spree of civilians instead of combatants? If he’s autonomous, who’s responsible for the war crime?
The possibilities worry Human Rights Watch and the Harvard Law School’s International Human Rights Clinic, which jointly released a report on how scary, bad things could happen and no one might be held accountable. Bonnie Docherty, senior arms division researcher at Human Rights Watch, was its lead author. She points out that killer robots could even drive straight through some loopholes in products liability law, at least in the U.S. “Product liability requires victims to show that there was a defect, there was a problem in the manufacturing or a problem in the design of the product—in this case, the fully autonomous weapon. But today, courts have found it difficult to adapt those rules to very high-tech technology…”
We offered our own premise to Docherty. Modern computers, the Internet and many other innovations started as defence industry applications, and given that some U.S. and Canadian police forces have lately gone on a macho buying spree of armoured cars, sound cannons and other gear, it’s entirely conceivable that killer robots could be the clanking deputies used to break up a riot in the homeland. Say Robocop blasts a hole in the nearby gas station—blowin’ it up real good—and kills five or 20 people. Who’s culpable?
Docherty says our scenario exemplifies the organization’s concern, and “in our research, we found that no human would be held responsible for such acts, whether these weapons are used on the battlefield or at home in law enforcement situations.” In such cases, she says, human rights law would apply, not the laws of war, triggering a whole new area of international jurisprudence, where multiple standards would be violated.
At last month’s huddle in Geneva, the experts were just talking. Another meeting will take place in Geneva in November, and Docherty hopes the participating countries will pay greater attention to the issue, with the ultimate goal of negotiating an international protocol (while the rest of us may hope Sarah Connor will show up as the keynote).
But here’s another big problem. Just as biotech outpaced the law, from IVF to cloning, weapons technology can outpace it too: who’s to say Vladimir Putin doesn’t have Cylons running around tunnels under the Kremlin? Programming moves faster than the gavel. Docherty concedes the point, but counters that “the fact that this technology is moving so rapidly, and we don’t even know everything that’s out there, is all the more reason that countries need to act now. They shouldn’t be waiting.” She says that by the standards of international law, discussion on this issue has actually moved at a relatively quick pace: “But it’s important to keep the pace up.”
Think it’s all too much? Remember: Siri’s on your friend’s phone. You can buy a drone at Canadian Tire. The tech has seemingly endless nightmare possibilities, and we already live in dread of the human suicide bomber—imagine a cyborg martyr who isn’t a martyr at all. Imagine Robocop stomping along in the streets and suddenly compromised by a virus. Address these issues now, or that could be life in the big city.
Copyright 2015 Rogers Publishing Ltd. This article first appeared in the May 2015 edition of Corporate Risk Canada magazine.