Treaties banning biological and chemical weapons are in place, and the path is clear to remove nuclear weapons too. Lethal autonomous weapons (killer robots) should be next.
The ethics and psychology of trust suggest ways we might learn to trust self-driving cars, but they also show why doing so might be harder than we expect.
Rebel fighters in the latest Star Wars movie are helped by a droid that was captured from the enemy and reprogrammed. Could that happen in real life with today’s autonomous weapons?
We need to ban lethal autonomous weapons, or “killer robots”, as we have done with biological weapons, land mines and blinding lasers, and Australia should take a leading role in making that happen.
Autonomous submarines might do for naval warfare what drones are doing for air warfare. So should Australia consider autonomous subs as a replacement for the Collins class?
The moral and ethical dilemmas of future warfare are depicted in this tight British thriller. But what will happen when humans become more removed from the weapons of war?
Science fiction has long warned of technology taking over the world. We’re increasingly connected to a digital world that keeps growing and becoming more automated. So what happens if it starts to evolve?
The thousands of people who signed an open letter calling for a ban on autonomous killer weapons and robots are misguided. We already have such killing machines and we should embrace them.
The debate over whether lethal autonomous weapon systems (LAWS) – often called ‘killer robots’ – should be banned continues, and it is far from settled.
Should future wars be fought by autonomous systems? Or do they pose such a threat that they should be banned? These issues are being debated this week by diplomats from around the world.
Lecturer on Law and Associate Director of Armed Conflict and Civilian Protection, International Human Rights Clinic, Harvard Law School, Harvard University