{"title":"Advancing Explainability Through AI Literacy and Design Resources","authors":"Patrick Gage Kelley, Allison Woodruff","doi":"10.1145/3613249","DOIUrl":null,"url":null,"abstract":"3 4 I N T E R A C T I O N S S E P T E M B E R – O C T O B E R 2 0 2 3 designed to teach good practices for explainability. Explainability, put simply, provides humanunderstandable reasons and context for decisions made by an AI system [1,2]. In so doing, explainability benefits users and society by helping individuals make informed decisions about their use of AI systems, empowering civic engagement with AI, informing policymakers about the impacts of AI, and more. For these reasons, as AI has become more central in people’s lives, the public, the tech industry, regulators, and others have increasingly recognized explainability as a key aspect of Britni expertly navigates her kayak down the narrow channel, creating a new route to share online (Figure 1). Navigator, her boating app, lets her post routes and gives her a percentage of the advertising revenue. As Britni rounds a bend, a notification comes in from Navigator: “Your route ‘Marshy Inlet Trek’ has been suspended due to safety concerns and will be hidden temporarily from users.” Shocked that her most popular and lucrative (and in her experience very safe!) route has been suspended, Britni begins the process of investigating and contesting the suspension... Navigator is a fictional app we Advancing Explainability Through AI Literacy and Design Resources Patrick Gage Kelley and Allison Woodruff, Google","PeriodicalId":73404,"journal":{"name":"Interactions (New York, N.Y.)","volume":"30 1","pages":"34 - 38"},"PeriodicalIF":0.0000,"publicationDate":"2023-08-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Interactions (New York, N.Y.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3613249","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Britni expertly navigates her kayak down the narrow channel, creating a new route to share online (Figure 1). Navigator, her boating app, lets her post routes and gives her a percentage of the advertising revenue. As Britni rounds a bend, a notification comes in from Navigator: “Your route ‘Marshy Inlet Trek’ has been suspended due to safety concerns and will be hidden temporarily from users.” Shocked that her most popular and lucrative (and in her experience very safe!) route has been suspended, Britni begins the process of investigating and contesting the suspension...

Navigator is a fictional app we designed to teach good practices for explainability. Explainability, put simply, provides human-understandable reasons and context for decisions made by an AI system [1,2]. In so doing, explainability benefits users and society by helping individuals make informed decisions about their use of AI systems, empowering civic engagement with AI, informing policymakers about the impacts of AI, and more. For these reasons, as AI has become more central in people’s lives, the public, the tech industry, regulators, and others have increasingly recognized explainability as a key aspect of...