Ethics of Military Robots February 10, 2010

Posted by emiliekopp in industry robot spotlight.

I recently became familiar with the DoD’s Unmanned Systems Integrated Roadmap, a document that forecasts the evolution and adoption of robot technologies in modern warfare. For a government document, it was actually a pretty interesting read.

Many people are timid when discussing military robots, and justifiably so. While most of the robots are meant to perform tasks that are simply too dull, dirty or dangerous to warrant the risk of human life (for instance, MULE robots or robots that dispose of IEDs), most of the mainstream media attention is geared towards the robots with guns. And that’s when the references to Skynet come rolling in.

What happens when robots have guns? If something goes wrong, who is ultimately held responsible? The robot? The operator? The designer? The supplier of electromechanical parts? The chain of responsible parties could go on and on.

We haven’t found the answer yet. But initiating the conversation is a good start.

P.W. Singer’s Wired for War has brought the conversation to the mainstream. Singer’s main point is that once you move humans away from the battlefield (i.e., give the guns to robots), those waging war become more willing to use force. So as you reduce the risk to human life on one side, you become more willing to shed human life on the other. Understandably scary.

Another resource I found incredibly interesting is a report prepared for the US Department of the Navy’s Office of Naval Research by California Polytechnic State University: Autonomous Military Robotics: Risk, Ethics, and Design. This report takes a more technical approach to understanding the ethics of robots on the battlefield. While it addresses many of the concerns that Singer has brought to light, it also entertains the point that robots are “unaffected by the emotions, adrenaline, and stress that cause soldiers to overreact or deliberately overstep the Rules of Engagement and commit atrocities, that is to say, war crimes.” Of course, this assumes that humans are capable of programming robots to make ethically sound decisions on their own, which in turn warrants a walk down memory lane with Asimov’s Three Laws of Robotics.

Bottom line: technology is a double-edged sword (thank you, Ray Kurzweil). There will be pros and cons to exponentially advancing technologies, especially battlefield robots. Yet we shouldn’t feel like we need to tiptoe around the issue. The more we talk about it, the better equipped we’ll be when decisions must be made.