{"title":"The Accountability of Software Developers for War Crimes Involving Autonomous Weapons","authors":"E. Winter","doi":"10.5195/lawreview.2021.822","DOIUrl":null,"url":null,"abstract":"This Article considers the extent to which the joint criminal enterprise doctrine could be invoked to hold software developers criminally accountable for violations of international humanitarian law involving autonomous weapons. More specifically, it considers whether the third part of the concept—which concerns common criminal purposes—might be brought to bear to achieve this end. The doctrine is deconstructed into five components, and each component is analyzed both in abstract and in terms of practical application. The Article establishes that, in certain contexts, software developers can and should be held accountable through this mechanism. Thus, it demonstrates that it is possible to avoid the emergence of a “responsibility gap” if, or more likely when, autonomous weapons with offensive capabilities are finally deployed on the battlefield. * The author is a Lecturer (Assistant Professor) in International Law at Newcastle University Law School in the United Kingdom. U N I V E R S I T Y O F P I T T S B U R G H L A W R E V I E W P A G E | 5 2 | V O L . 8 3 | 2 0 2 1 ISSN 0041-9915 (print) 1942-8405 (online) ● DOI 10.5195/lawreview.2021.822 http://lawreview.law.pitt.edu INTRODUCTION The International Committee of the Red Cross (ICRC) defines an autonomous weapon as any weapon system with autonomy in its critical functions that can select and attack targets without human intervention.1 The extent to which the use of autonomous weapons might be compatible with substantive obligations in international humanitarian law (IHL) is a complex issue. The author has written previously on the intersection of these “killer robots” with key humanitarian law principles such as distinction,2 proportionality,3 and precaution.4 The present Article represents something of a departure because instead of considering whether the use of autonomous weapons would comply with the law, it focuses on how international criminal law secures individual accountability for violations of IHL involving such weapons. In other words, it considers potential criminal accountability where, for example, a machine targets a civilian, acts in a disproportionate manner, or fails to issue the appropriate warning. This issue is important because the value of any substantive legal rule is dependent, at least in part, on how amenable that rule is to enforcement. As the United Nations (UN) Special Rapporteur, Christof Heyns, noted: “Without the promise of accountability, deterrence and prevention are reduced, resulting in lower protection of civilians and potential victims of war crimes.”5 Thus, if there are no clear consequences for misusing autonomous weapons, individuals who wish to operate them may see this as a license to deploy machines that are not capable of complying with the law. The effect of this would be the deterioration of real-world protections for civilians. Of course, “robots have no moral agency” and cannot be 1 INT’L COMM. RED CROSS, AUTONOMOUS WEAPON SYS.: IMPLICATIONS OF INCREASING AUTONOMY IN THE CRITICAL FUNCTIONS OF WEAPONS 8 (2016), https://icrcndresourcecentre.org/wp-content/uploads/ 2017/11/4283_002_Autonomus-Weapon-Systems_WEB.pdf. 2 Elliot Winter, The Compatibility of Autonomous Weapons with the Principle of Distinction in the Law of Armed Conflict, 69 INT’L & COMPAR. L.Q. 845 (2020). 
3 Elliot Winter, Autonomous Weapons in Humanitarian Law: Understanding the Technology, Its Compliance with the Principle of Proportionality and the Role of Utilitarianism, 6 GRONINGEN J. INT’L L. 183 (2018). 4 Elliot Winter, The Compatibility of the Use of Autonomous Weapons with the Principle of Precaution in the Law of Armed Conflict, 58 MIL. L. & L. WAR REV. 240 (2020). 5 Christof Heyns (Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions), Rep. on the Extrajudicial, Summary, or Arbitrary Executions, ¶ 75, U.N. Doc. A/HRC/23/47 (Apr. 9, 2013), https:// www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf [hereinafter Heyns]. T H E A C C O U N T A B I L I T Y O F S O F T W A R E D E V E L O P E R S","PeriodicalId":44686,"journal":{"name":"University of Pittsburgh Law Review","volume":" ","pages":""},"PeriodicalIF":0.2000,"publicationDate":"2021-10-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"University of Pittsburgh Law Review","FirstCategoryId":"90","ListUrlMain":"https://doi.org/10.5195/lawreview.2021.822","RegionNum":4,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"LAW","Score":null,"Total":0}
Citations: 1
Abstract
This Article considers the extent to which the joint criminal enterprise doctrine could be invoked to hold software developers criminally accountable for violations of international humanitarian law involving autonomous weapons. More specifically, it considers whether the third part of the concept—which concerns common criminal purposes—might be brought to bear to achieve this end. The doctrine is deconstructed into five components, and each component is analyzed both in the abstract and in terms of practical application. The Article establishes that, in certain contexts, software developers can and should be held accountable through this mechanism. Thus, it demonstrates that it is possible to avoid the emergence of a "responsibility gap" if, or more likely when, autonomous weapons with offensive capabilities are finally deployed on the battlefield.

* The author is a Lecturer (Assistant Professor) in International Law at Newcastle University Law School in the United Kingdom.

INTRODUCTION

The International Committee of the Red Cross (ICRC) defines an autonomous weapon as any weapon system with autonomy in its critical functions that can select and attack targets without human intervention.1 The extent to which the use of autonomous weapons might be compatible with substantive obligations in international humanitarian law (IHL) is a complex issue. The author has written previously on the intersection of these "killer robots" with key humanitarian law principles such as distinction,2 proportionality,3 and precaution.4 The present Article represents something of a departure because, instead of considering whether the use of autonomous weapons would comply with the law, it focuses on how international criminal law secures individual accountability for violations of IHL involving such weapons. In other words, it considers potential criminal accountability where, for example, a machine targets a civilian, acts in a disproportionate manner, or fails to issue the appropriate warning.

This issue is important because the value of any substantive legal rule depends, at least in part, on how amenable that rule is to enforcement. As the United Nations (UN) Special Rapporteur, Christof Heyns, noted: "Without the promise of accountability, deterrence and prevention are reduced, resulting in lower protection of civilians and potential victims of war crimes."5 Thus, if there are no clear consequences for misusing autonomous weapons, individuals who wish to operate them may see this as a license to deploy machines that are not capable of complying with the law. The effect would be the deterioration of real-world protections for civilians. Of course, "robots have no moral agency" and cannot be . . .

Footnotes

1 INT'L COMM. RED CROSS, AUTONOMOUS WEAPON SYS.: IMPLICATIONS OF INCREASING AUTONOMY IN THE CRITICAL FUNCTIONS OF WEAPONS 8 (2016), https://icrcndresourcecentre.org/wp-content/uploads/2017/11/4283_002_Autonomus-Weapon-Systems_WEB.pdf.

2 Elliot Winter, The Compatibility of Autonomous Weapons with the Principle of Distinction in the Law of Armed Conflict, 69 INT'L & COMPAR. L.Q. 845 (2020).

3 Elliot Winter, Autonomous Weapons in Humanitarian Law: Understanding the Technology, Its Compliance with the Principle of Proportionality and the Role of Utilitarianism, 6 GRONINGEN J. INT'L L. 183 (2018).

4 Elliot Winter, The Compatibility of the Use of Autonomous Weapons with the Principle of Precaution in the Law of Armed Conflict, 58 MIL. L. & L. WAR REV. 240 (2020).

5 Christof Heyns (Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions), Rep. on Extrajudicial, Summary or Arbitrary Executions, ¶ 75, U.N. Doc. A/HRC/23/47 (Apr. 9, 2013), https://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf [hereinafter Heyns].
About the Journal
The Law Review is a student-run journal of legal scholarship that publishes quarterly. Our goal is to contribute to the legal community by featuring pertinent articles that highlight current legal issues and changes in the law. The Law Review publishes articles, comments, book reviews, and notes on a wide variety of topics, including constitutional law, securities regulation, criminal procedure, family law, international law, and jurisprudence. The Law Review has also hosted several symposia, bringing scholars into one setting for lively debate and discussion of key legal topics.