{"title":"Soft law for unintentional empathy: addressing the governance gap in emotion-recognition AI technologies","authors":"Andrew McStay , Vian Bakir","doi":"10.1016/j.jrt.2025.100126","DOIUrl":null,"url":null,"abstract":"<div><div>Despite regulatory efforts, there is a significant governance gap in managing emotion recognition AI technologies and those that emulate empathy. This paper asks: should international soft law mechanisms, such as ethical standards, complement hard law in addressing governance gaps in emotion recognition and empathy-emulating AI technologies? To argue that soft law can provide detailed guidance, particularly for research ethics committees and related boards advising on these technologies, the paper first explores how legal definitions of emotion recognition, especially in the EU AI Act, rest on reductive and physiognomic criticism of emotion recognition. It progresses to detail that systems may be designed to intentionally empathise with their users, but also that empathy may be unintentional – or effectively incidental to how these systems work. Approaches that are non-reductive and avoid labelling of emotion as conceived in the EU AI Act raises novel governance questions and physiognomic critique of a more dynamic nature. The paper finds that international soft law can complement hard law, especially when critique is subtle but significant, when guidance is anticipatory in nature, and when detailed recommendations for developers are required.</div></div>","PeriodicalId":73937,"journal":{"name":"Journal of responsible technology","volume":"23 ","pages":"Article 100126"},"PeriodicalIF":0.0000,"publicationDate":"2025-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of responsible technology","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666659625000228","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Despite regulatory efforts, there is a significant governance gap in managing emotion recognition AI technologies and those that emulate empathy. This paper asks: should international soft law mechanisms, such as ethical standards, complement hard law in addressing governance gaps in emotion recognition and empathy-emulating AI technologies? To argue that soft law can provide detailed guidance, particularly for research ethics committees and related boards advising on these technologies, the paper first explores how legal definitions of emotion recognition, especially in the EU AI Act, rest on criticism of emotion recognition as reductive and physiognomic. It then details that systems may be designed to intentionally empathise with their users, but also that empathy may be unintentional, or effectively incidental to how these systems work. Approaches that are non-reductive and avoid the labelling of emotion as conceived in the EU AI Act raise novel governance questions and invite physiognomic critique of a more dynamic nature. The paper finds that international soft law can complement hard law, especially when critique is subtle but significant, when guidance is anticipatory in nature, and when detailed recommendations for developers are required.