Autonomous weapons are increasingly used by militaries around the world. Unlike conventional unmanned weapons such as drones, autonomous weapons involve a machine deciding whether to deploy lethal force. Yet, because a machine cannot have the requisite mental state to commit a war crime, legal scrutiny falls on the decision to deploy an autonomous weapon. This Article focuses on the dual questions arising from that decision: how to regulate autonomous weapon use, and who should be held criminally liable for an autonomous weapon's actions. On the first question, this Article concludes that regulations expressly limiting autonomous weapon use to non-human targets are preferable to a complete ban on autonomous weapons. On the second question, this Article concludes that, in light of the legal constraints on autonomous weapon use and criminal punishment, the appropriate entities to hold criminally liable for an autonomous weapon's actions are the combatant who deployed the weapon and the commander who either supervised the combatant or ordered the deployment. Ultimately, this Article emphasizes that although the Law of War already restricts the legal use of autonomous weapons to non-human targets through the principle of distinction, both the International Criminal Court and individual states should clarify how they will enforce limitations on autonomous weapon use before the technology advances further.
Autonomous Weapons and Accountability: Seeking Solutions in the Law of War, 48 Loy. L.A. L. Rev. 1017
Available at: http://digitalcommons.lmu.edu/llr/vol48/iss3/11