Where is Vincent? Expanding our emotional selves with AI
Authors: Minha Lee, L. Frank, Y. D. Kort, W. Ijsselsteijn
DOI: 10.1145/3543829.3543835 (https://doi.org/10.1145/3543829.3543835)
Published in: Proceedings of the 4th Conference on Conversational User Interfaces
Publication date: 2022-07-26
Citations: 6
Abstract
In what ways could the future of emotional bonds between humans and conversational AI change us? To explore this question in a multi-faceted manner, designers, engineers, and philosophers in separate focus groups were given a design fiction probe: a story of a chatbot's disappearance from a person's life. Though articulated in discipline-specific ways, participants expressed similar concerns and hopes: 1) caring for a machine could teach people to emotionally care for themselves and others, 2) the boundary between human and non-human emotions may become blurred when people project their own emotions onto AI, e.g., experiencing a bot's "breakdown" as one's own, and 3) people may then intertwine their identities with AI through emotions. We consider the ethical ramifications of socially constructed emotions between humans and conversational agents.