Automation is not a moral deus ex machina: electrophysiology of moral reasoning toward machine and human agents

Federico Cassioli, L. Angioletti, M. Balconi
Journal: Medicina e Morale (JCR Q2, Arts and Humanities)
DOI: 10.4081/mem.2022.1217
Published: 2022-12-22
Citations: 0

Abstract

The diffusion of automated decision-making systems could represent a critical crossroads for future society. Automated technology could feasibly be involved in morally charged decisions, with major ethical consequences. In the present study, participants (n = 34) completed a task composed of moral dilemmas in which the agent (human vs. machine) and the type of behavior (action vs. inaction) were randomized. Responses in terms of the evaluation of the morality, consciousness, responsibility, intentionality, and emotional impact of the agent's behavior, reaction times (RTs), and EEG data (delta, theta, alpha, beta, and gamma band power) were collected. The data showed that participants apply different moral rules depending on the agent: humans are considered more moral, responsible, intentional, and conscious than machines. Interestingly, the emotional impact derived from moral behavior was judged as more severe for humans, with decreased RTs. For the EEG data, increased gamma power was detected when subjects evaluated the intentionality and the emotional impact of machines compared to humans. Higher beta power in the frontal and fronto-central regions was detected during the evaluation of the emotional impact derived from the machine's behavior. Moreover, right temporal activation was found when judging the emotional impact caused by humans. Lastly, a generalized alpha desynchronization occurred in the left occipital area when subjects evaluated the responsibility derived from inaction behaviors. The present results provide evidence for the existence of different norms when judging the moral behavior of machine and human agents, pointing to a possible asymmetry in moral judgment at the cognitive and emotional level.
Source journal: Medicina e Morale (Arts and Humanities, Philosophy)
CiteScore: 0.70 · Self-citation rate: 0.00% · Articles per year: 21