The Functional Morality of Robots

L. Johansson
{"title":"The Functional Morality of Robots","authors":"L. Johansson","doi":"10.4018/jte.2010100105","DOIUrl":null,"url":null,"abstract":"It is often argued that a robot cannot be held morally responsible for its actions. The author suggests that one should use the same criteria for robots as for humans, regarding the ascription of moral responsibility. When deciding whether humans are moral agents one should look at their behaviour and listen to the reasons they give for their judgments in order to determine that they understood the situation properly. The author suggests that this should be done for robots as well. In this regard, if a robot passes a moral version of the Turing Test-a Moral Turing Test MTT we should hold the robot morally responsible for its actions. This is supported by the impossibility of deciding who actually has semantic or only syntactic understanding of a moral situation, and by two examples: the transferring of a human mind into a computer, and aliens who actually are robots.","PeriodicalId":287069,"journal":{"name":"Int. J. Technoethics","volume":"354 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"26","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. J. Technoethics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4018/jte.2010100105","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 26

Abstract

It is often argued that a robot cannot be held morally responsible for its actions. The author suggests that one should use the same criteria for robots as for humans regarding the ascription of moral responsibility. When deciding whether humans are moral agents, one should look at their behaviour and listen to the reasons they give for their judgments in order to determine that they understood the situation properly. The author suggests that this should be done for robots as well. On this view, if a robot passes a moral version of the Turing Test, a Moral Turing Test (MTT), we should hold the robot morally responsible for its actions. This is supported by the impossibility of deciding who actually has semantic, rather than merely syntactic, understanding of a moral situation, and by two examples: the transferring of a human mind into a computer, and aliens who actually are robots.