Recent advances in a host of technologies, including robotics and artificial intelligence, have made possible the development of machines with rapidly increasing capacities for autonomous operation: the ability to operate without human supervision or control. Military interest in these developments has been strong, as they enable the development of weapon systems that can operate by themselves in hostile environments, selecting and engaging targets without human intervention. Such systems promise significant performance, safety and cost advantages over conventional manually operated weapon systems.

Our Research

At PREMT, Tim McFarland and Natalie Nunn are writing doctoral theses that assess the international legal implications of developing and using increasingly autonomous weapon systems in armed conflict.

Tim’s work investigates the nature of autonomous weapon systems, including those that exist today and those under development, with the aim of understanding whether, and in what way, the use of these advanced weapon systems might present challenges to the law of armed conflict. The essence of automation is replacing a human with a machine. What does that mean in the context of weapon systems? Does the weapon become a ‘soldier’ of sorts, for legal purposes? Is it correct to say an autonomous weapon makes decisions? Or is decision-making power reassigned to other personnel?

Based on that analysis, Tim’s thesis investigates possible legal challenges to the use of increasingly autonomous weapon systems in three areas of the law of armed conflict. In relation to weapons law, the thesis discusses legal limitations on autonomous capabilities, as well as challenges in reviewing autonomous weapons and the emerging notion of a requirement for ‘meaningful human control’ over weapon system operation. In relation to targeting law, the thesis investigates the legal obligations attached to the use of an autonomous weapon system in an attack during an armed conflict. In relation to accountability, the thesis discusses the challenges in determining who is responsible when an autonomous weapon system commits a proscribed act.