Algorithmic Warfare: Taking Stock of a Research Programme

Ingvild Bode, Hendrik Huelss, Anna Nadibaidze, Guangyu Qiao-Franco, Tom F. A. Watts

Global Society, 2023. DOI: 10.1080/13600826.2023.2263473
ABSTRACT
This article takes stock of the ongoing debates on algorithmic warfare in the social sciences. It seeks to equip scholars in International Relations and beyond with a critical review of both the empirical context of algorithmic warfare and the different theoretical approaches to studying practices related to the integration of algorithms (including automated, autonomous, and artificial intelligence (AI) technologies) into international armed conflict. The review focuses on discussions about (1) the implications of algorithmic warfare for strategic stability, (2) the morality and ethics of algorithmic warfare, (3) how algorithmic warfare relates to the laws and norms of war, and (4) popular imaginaries of algorithmic warfare. The article foregrounds a set of open research questions capable of moving the field toward a more interdisciplinary research agenda and introduces the contributions made by the other articles in this Special Issue.

KEYWORDS: algorithms; artificial intelligence (AI); war; security

Disclosure statement
No potential conflict of interest was reported by the author(s).

Notes
1. Both automated and autonomous technologies denote systems that, once activated, can perform some tasks without human input. In robotics, automation implies less "sophistication" than autonomy because automated systems follow a pre-programmed sequence of actions (Winfield 2012, 12). However, integrating automated or autonomous technologies into military decision-making and targeting triggers similarly problematic consequences for human control because such technologies increase system complexity.
2. Autonomous weapons systems (AWS) are defined as systems that are able to make targeting "decisions" without immediate human intervention. They may or may not be based on AI technologies (Garcia, forthcoming).
3. Such dynamics are not restricted to the study of algorithmic warfare, as the study of remote warfare, for instance, demonstrates (Biegon, Rauta, and Watts 2021).
4. These include, for example, the Realities of Algorithmic Warfare project (PI: Lauren Gould) at Utrecht University and the DILEMA project (PI: Berenice Boutin) at the Asser Institute in The Hague.
5. Loitering munitions manufacturers hold that such systems require human assessment and authorisation prior to the release of force. But their marketing material also appears to point to a latent technological capability of such systems to release force without prior human assessment (Bode and Watts 2023).
6. The Martens Clause first appeared in the preamble to the 1899 Hague Convention. It is said to "fill a gap" when existing international law fails to address a situation, by referring to the principles of humanity and the dictates of public conscience (Docherty 2018).
7. Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I), Art. 31(2), Art. 51(4)(b) and Art. 51(4)(c).
8. Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I). Distinction: Articles 48, 51(2) and 52(2). Proportionality: Articles 51(5)(b), 57(2)(a)(iii) and 57(2)(b). Precautions: Article 57 and customary international law.

Funding
This work was supported by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement number 852123). Dr. Tom F. A. Watts' contribution to this paper was funded by a Leverhulme Trust Early Career Research Fellowship (ECF-2022-135).
Journal description
Global Society covers the new agenda in global and international relations and encourages innovative approaches to the study of global and international issues from a range of disciplines. It promotes the analysis of transactions at multiple levels and, in particular, the way in which these transactions blur the distinction between the sub-national, national, transnational, international and global levels. An ever-integrating global society raises a number of issues for global and international relations which do not fit comfortably within established "paradigms". Among these are the international and global consequences of nationalism and struggles for identity, migration, racism, religious fundamentalism, terrorism and criminal activities.