{"title":"Emerging Scholars on Emerging Technologies in International Security: Introduction to Part 1","authors":"Anna Nadibaidze, Ingvild Bode","doi":"10.1111/1758-5899.70058","DOIUrl":null,"url":null,"abstract":"<p>Technologies such as artificial intelligence (AI), blockchain, cyber, quantum, and robotics have wide-ranging implications for world politics, international security, and warfare. These potential impacts, whether in terms of opportunities or challenges, deserve to be comprehensively examined not only across disciplines, but also across generations of scholars. The two-part special section “Emerging Scholars on Emerging Technologies in International Security” contributes to ongoing efforts of conceptualizing emerging technologies in international security by bringing together the research of early career scholars working in this space. It disseminates ideas from a new generation of thinkers in an area that is growing in importance and relevance for world politics. The first part of this special section includes five articles, each with different theoretical and empirical areas of focus, but all written by early career researchers in the spirit of exploring the diversity of topics under the umbrella term of “emerging technologies.”</p><p>Kwik draws attention to the risks of sycophancy, defined as a tendency exhibited by AI models to produce outputs that match the user's views, despite being factually misleading or wrong. As Kwik demonstrates, sycophancy in military applications of AI is an understudied phenomenon which deserves attention due to the risks it produces in warfare. To mitigate these risks, such as military personnel's over trust in AI models, Kwik recommends a combination of technical measures and educational tools to train users on operating AI systems.</p><p>Ølgaard explores the dynamics and interactions between particular imaginaries surrounding emerging technologies in global security. Drawing on Science and Technology Studies (STS), Ølgaard's contribution highlights the presence of these imaginaries in the discourses of both states (the United States) and international institutions (NATO). This analysis of what Ølgaard calls “the new technopolitics of war” convincingly shows that the coconstitution of technology and politics happening across these imaginaries is critical to understand both the formation of certain types of agency in human-machine interactions as well as of authority in public-private relations.</p><p>Imre-Millei investigates how Canadian operators of uncrewed aerial vehicles, or drones, view military identity and ethics of drone use. Based on ethnographic fieldwork and interviews, Imre-Millei's analysis finds that Canadian drone operators perceive their roles as part of their combat arms identity, while connecting this identity to how they understand the role of the Canadian army in the world. With this original empirical work, Imre-Millei contributes to scholarship on the use of drones by small- and medium-sized states, as well as to literature at the intersection between emerging military technologies, national identity, and ethics.</p><p>Nadibaidze demonstrates the importance of exploring the role of relatively new actors in the development and supply of military technologies: tech startups specializing in defense AI, especially those funded by venture capital. 
Inspired by STS scholarship, Nadibaidze's study maps out six key US-based startups in the sphere of AI development and analyzes key themes in the discourses promoted by representatives of these startups in the public space. It contributes to research on the integration of AI technologies into the military domain by highlighting the growing influence of startups and their role in normalizing particular visions of algorithmic warfare.</p><p>Zhang investigates the role of two China-based think tanks in shaping AI governance frameworks, arguing that these actors deserve to be examined due to their role as boundary brokers. Inspired by the literature on communities of practice (CoPs), Zhang's analysis reveals that think tanks in the Chinese context are influential actors that act as boundary brokers between the government and other domestic CoPs. They play a dual role of disseminating governmental policies on AI governance, while also using their positions to subtly influence these policies.</p><p>Together, these five articles encompass an exploration of ongoing trends (whether technological or political), various actors developing, using or regulating technologies (whether state or nonstate), as well as gradually evolving or already established norms addressing challenges related to emerging technologies in global security. They involve different theoretical approaches including STS and practice theories, as well as different methodologies including ethnographic studies, discourse analysis, and interviews. This diversity showcases the need to continue investigating technologies such as AI, drones, and robotics from various vantage points, to further enrich scholarship on emerging technologies.</p><p>This special section stems from an early career researcher workshop held in Copenhagen, Denmark on November 1, 2024, bringing together 11 early career researchers from around the world and across disciplines such as political science, security studies, international law, and STS. The authors presented their drafts and received discussant-type feedback from the other participants as well as three senior discussants: Ingvild Bode (University of Southern Denmark), Neil Renic (University of Copenhagen), and Alexander Blanchard (Stockholm International Peace Research Institute). The format focused on providing constructive feedback and an emphasis on mentorship, training, networking, and building a community. The workshop was a crucial milestone in the process of preparing the manuscripts for submission, as it allowed knowledge sharing about the academic publication process among participants and with input from senior discussants. We are grateful to the Carlsberg Foundation (grant CF24-0046), the European Research Council (grant no. 
852123, the AutoNorms project), and the Center for War Studies at the University of Southern Denmark for supporting the organization of this workshop.</p><p>The authors declare no conflicts of interest.</p>","PeriodicalId":51510,"journal":{"name":"Global Policy","volume":"16 3","pages":"465-466"},"PeriodicalIF":2.2000,"publicationDate":"2025-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/1758-5899.70058","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Global Policy","FirstCategoryId":"90","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/1758-5899.70058","RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"INTERNATIONAL RELATIONS","Score":null,"Total":0}
Abstract
Technologies such as artificial intelligence (AI), blockchain, cyber, quantum, and robotics have wide-ranging implications for world politics, international security, and warfare. These potential impacts, whether in terms of opportunities or challenges, deserve to be comprehensively examined not only across disciplines, but also across generations of scholars. The two-part special section “Emerging Scholars on Emerging Technologies in International Security” contributes to ongoing efforts to conceptualize emerging technologies in international security by bringing together the research of early career scholars working in this space. It disseminates ideas from a new generation of thinkers in an area that is growing in importance and relevance for world politics. The first part of this special section includes five articles, each with different theoretical and empirical areas of focus, but all written by early career researchers in the spirit of exploring the diversity of topics under the umbrella term of “emerging technologies.”
Kwik draws attention to the risks of sycophancy, defined as a tendency exhibited by AI models to produce outputs that match the user's views, even when those outputs are factually misleading or wrong. As Kwik demonstrates, sycophancy in military applications of AI is an understudied phenomenon that deserves attention due to the risks it produces in warfare. To mitigate these risks, such as military personnel's overtrust in AI models, Kwik recommends a combination of technical measures and educational tools to train users in operating AI systems.
Ølgaard explores the dynamics and interactions between particular imaginaries surrounding emerging technologies in global security. Drawing on Science and Technology Studies (STS), Ølgaard's contribution highlights the presence of these imaginaries in the discourses of both states (the United States) and international institutions (NATO). This analysis of what Ølgaard calls “the new technopolitics of war” convincingly shows that the co-constitution of technology and politics taking place across these imaginaries is critical to understanding both the formation of certain types of agency in human-machine interactions and the formation of authority in public-private relations.
Imre-Millei investigates how Canadian operators of uncrewed aerial vehicles, or drones, view military identity and the ethics of drone use. Based on ethnographic fieldwork and interviews, Imre-Millei's analysis finds that Canadian drone operators perceive their roles as part of their combat arms identity, while connecting this identity to how they understand the role of the Canadian Army in the world. With this original empirical work, Imre-Millei contributes to scholarship on the use of drones by small- and medium-sized states, as well as to literature at the intersection of emerging military technologies, national identity, and ethics.
Nadibaidze demonstrates the importance of exploring the role of relatively new actors in the development and supply of military technologies: tech startups specializing in defense AI, especially those funded by venture capital. Inspired by STS scholarship, Nadibaidze's study maps out six key US-based startups in the sphere of AI development and analyzes key themes in the discourses promoted by representatives of these startups in the public space. It contributes to research on the integration of AI technologies into the military domain by highlighting the growing influence of startups and their role in normalizing particular visions of algorithmic warfare.
Zhang investigates the role of two China-based think tanks in shaping AI governance frameworks, arguing that these actors deserve to be examined due to their role as boundary brokers. Inspired by the literature on communities of practice (CoPs), Zhang's analysis reveals that think tanks in the Chinese context are influential actors that act as boundary brokers between the government and other domestic CoPs. They play a dual role, disseminating governmental policies on AI governance while also using their positions to subtly influence those policies.
Together, these five articles explore ongoing trends (whether technological or political), the various actors developing, using, or regulating technologies (whether state or nonstate), and gradually evolving or already established norms addressing challenges related to emerging technologies in global security. They draw on different theoretical approaches, including STS and practice theories, as well as different methodologies, including ethnographic studies, discourse analysis, and interviews. This diversity showcases the need to continue investigating technologies such as AI, drones, and robotics from various vantage points, to further enrich scholarship on emerging technologies.
This special section stems from an early career researcher workshop held in Copenhagen, Denmark on November 1, 2024, bringing together 11 early career researchers from around the world and across disciplines such as political science, security studies, international law, and STS. The authors presented their drafts and received discussant-type feedback from the other participants as well as from three senior discussants: Ingvild Bode (University of Southern Denmark), Neil Renic (University of Copenhagen), and Alexander Blanchard (Stockholm International Peace Research Institute). The format emphasized constructive feedback, mentorship, training, networking, and community building. The workshop was a crucial milestone in the process of preparing the manuscripts for submission, as it enabled knowledge sharing about the academic publication process among participants, with input from the senior discussants. We are grateful to the Carlsberg Foundation (grant CF24-0046), the European Research Council (grant no. 852123, the AutoNorms project), and the Center for War Studies at the University of Southern Denmark for supporting the organization of this workshop.
The authors declare no conflicts of interest.