Title: Empirical Data on Attitudes Toward Autonomous Systems
Authors: Jai C. Galliott, Bianca Baggiarini, Sean Rupka
DOI: 10.1093/oso/9780197546048.003.0010
Published in: Lethal Autonomous Weapons
Citations: 1
Abstract
Combat automation, enabled by rapid technological advances in artificial intelligence and machine learning, is a guiding principle in the conduct of war today. Yet empirical data on the impact of algorithmic combat on military personnel remain limited. This chapter draws on data from a historically unprecedented survey of Australian Defence Force Academy cadets. Given that this generation of trainees will be the first to deploy autonomous systems (AS) in a systematic way, their views are especially important. The chapter focuses its analysis on five themes: the dynamics of human-machine teams; the perceived risks, benefits, and capabilities of AS; the changing nature of (and respect for) military labor and incentives; cadets' preferences for overseeing a robot versus carrying out a mission themselves; and the changing meaning of soldiering. We use the survey data to explore the interconnected consequences of neoliberal governing for cadets' attitudes toward AS, and for citizen-soldiering more broadly. Overall, the chapter argues that Australian cadets are open to working with and alongside AS, but only under the right conditions. Armed forces, in an attempt to capitalize on these technologically savvy cadets, have shifted from being institutional employers to occupational ones. However, in our concluding remarks, we caution against unchecked technological fetishism, highlighting the need to critically question the risk that AS pose of moral deskilling, as well as the application of market-based notions of freedom to the military domain.