{"title":"Human Morality Difference when Programming and Actually Operating Autonomous Machines","authors":"Wenfeng Yi;Wenhan Wu;Maoyin Chen;Xiaoping Zheng","doi":"10.26599/TST.2024.9010062","DOIUrl":null,"url":null,"abstract":"Autonomous machines (AMs) are poised to possess human-like moral cognition, yet their morality is often pre-programmed for safety. This raises the question of whether the morality intended by programmers aligns with their actions during actual operation, a crucial consideration for a future society with both humans and AMs. Investigating this, we use a micro-robot swarm in a simulated fire scenario, with 180 participants, including 102 robot programmers, completing moral questionnaires and participating in virtual escape trials. These exercises mirror common societal moral dilemmas. Our comparative analysis reveals a “morality gap” between programming presets and real-time operation, primarily influenced by uncertainty about the future and heightened by external pressures, especially social punishment. This discrepancy suggests that operational morality can diverge from programmed intentions, underlining the need for careful AM design to foster a collaborative and efficient society.","PeriodicalId":48690,"journal":{"name":"Tsinghua Science and Technology","volume":"30 4","pages":"1648-1658"},"PeriodicalIF":6.6000,"publicationDate":"2025-03-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10908660","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Tsinghua Science and Technology","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10908660/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Multidisciplinary","Score":null,"Total":0}
Citations: 0
Abstract
Autonomous machines (AMs) are poised to possess human-like moral cognition, yet their morality is often pre-programmed for safety. This raises the question of whether the morality intended by programmers aligns with machines' actions during actual operation, a crucial consideration for a future society shared by humans and AMs. To investigate this, we used a micro-robot swarm in a simulated fire scenario: 180 participants, including 102 robot programmers, completed moral questionnaires and took part in virtual escape trials. These exercises mirror common societal moral dilemmas. Our comparative analysis reveals a "morality gap" between programming presets and real-time operation, driven primarily by uncertainty about the future and heightened by external pressures, especially social punishment. This discrepancy shows that operational morality can diverge from programmed intentions, underlining the need for careful AM design to foster a collaborative and efficient society.
Journal Introduction:
Tsinghua Science and Technology (Tsinghua Sci Technol) began publication in 1996. It is an international academic journal sponsored by Tsinghua University and published bimonthly. The journal presents up-to-date scientific achievements in computer science, electronic engineering, and other IT fields. Contributions from all over the world are welcome.