{"title":"“道德机器”的黑暗面和自动驾驶汽车计算伦理决策的谬误","authors":"H. Etienne","doi":"10.1080/17579961.2021.1898310","DOIUrl":null,"url":null,"abstract":"ABSTRACT This paper reveals the dangers of the Moral Machine experiment, alerting against both its uses for normative ends, and the whole approach it is built upon to address ethical issues. It explores additional methodological limits of the experiment on top of those already identified by its authors and provides reasons why it is inadequate in supporting ethical and juridical discussions to determine the moral settings for autonomous vehicles. Demonstrating the inner fallacy behind computational social choice methods when applied to ethical decision-making, it also warns against the dangers of computational moral systems, such as the ‘voting-based system’ recently developed out of the Moral Machine’s data. Finally, it discusses the Moral Machine’s ambiguous impact on public opinion; on the one hand, laudable for having successfully raised global awareness with regard to ethical concerns about autonomous vehicles, and on the other hand pernicious, as it has led to a significant narrowing of the spectrum of autonomous vehicle ethics, de facto imposing a strong unidirectional approach, while brushing aside other major moral issues.","PeriodicalId":37639,"journal":{"name":"Law, Innovation and Technology","volume":"13 1","pages":"85 - 107"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/17579961.2021.1898310","citationCount":"15","resultStr":"{\"title\":\"The dark side of the ‘Moral Machine’ and the fallacy of computational ethical decision-making for autonomous vehicles\",\"authors\":\"H. 
Etienne\",\"doi\":\"10.1080/17579961.2021.1898310\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"ABSTRACT This paper reveals the dangers of the Moral Machine experiment, alerting against both its uses for normative ends, and the whole approach it is built upon to address ethical issues. It explores additional methodological limits of the experiment on top of those already identified by its authors and provides reasons why it is inadequate in supporting ethical and juridical discussions to determine the moral settings for autonomous vehicles. Demonstrating the inner fallacy behind computational social choice methods when applied to ethical decision-making, it also warns against the dangers of computational moral systems, such as the ‘voting-based system’ recently developed out of the Moral Machine’s data. Finally, it discusses the Moral Machine’s ambiguous impact on public opinion; on the one hand, laudable for having successfully raised global awareness with regard to ethical concerns about autonomous vehicles, and on the other hand pernicious, as it has led to a significant narrowing of the spectrum of autonomous vehicle ethics, de facto imposing a strong unidirectional approach, while brushing aside other major moral issues.\",\"PeriodicalId\":37639,\"journal\":{\"name\":\"Law, Innovation and Technology\",\"volume\":\"13 1\",\"pages\":\"85 - 107\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-01-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1080/17579961.2021.1898310\",\"citationCount\":\"15\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Law, Innovation and 
Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/17579961.2021.1898310\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Social Sciences\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Law, Innovation and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/17579961.2021.1898310","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Social Sciences","Score":null,"Total":0}
The dark side of the ‘Moral Machine’ and the fallacy of computational ethical decision-making for autonomous vehicles
ABSTRACT This paper reveals the dangers of the Moral Machine experiment, warning against both its use for normative ends and the whole approach on which it is built to address ethical issues. It explores methodological limits of the experiment beyond those already identified by its authors and explains why it is inadequate for supporting the ethical and juridical discussions needed to determine the moral settings for autonomous vehicles. Demonstrating the fallacy inherent in computational social choice methods when applied to ethical decision-making, it also warns against the dangers of computational moral systems, such as the ‘voting-based system’ recently developed from the Moral Machine’s data. Finally, it discusses the Moral Machine’s ambiguous impact on public opinion: on the one hand laudable, for having successfully raised global awareness of ethical concerns about autonomous vehicles; on the other hand pernicious, as it has significantly narrowed the spectrum of autonomous vehicle ethics, de facto imposing a strongly unidirectional approach while brushing aside other major moral issues.
Journal introduction:
Stem cell research, cloning, GMOs … how do regulations affect such emerging technologies? What impact do new technologies have on law? And can we rely on technology itself as a regulatory tool? The meeting of law and technology is rapidly becoming a significant (and controversial) topic. Law, Innovation and Technology is, however, the only journal to engage fully with it, setting an innovative and distinctive agenda for lawyers, ethicists and policy makers. Spanning ICTs, biotechnologies, nanotechnologies, neurotechnologies, robotics and AI, it offers a unique forum for the highest level of reflection on this essential area.