Chatbots and Explainable Artificial Intelligence
Marcus R. Wigan, G. Adamson, P. Rani, Nick Dyson, Fabian Horton
2022 IEEE International Symposium on Technology and Society (ISTAS), published 2022-11-10. DOI: 10.1109/ISTAS55053.2022.10227122
Abstract
For many areas of artificial intelligence, explainability provides assurance that a decision sits within an acceptable range of possible decisions. In the field of chatbots, however, the function of the AI itself is to provide an explanation to the user. Users may therefore assume that the purpose of the chatbot is defined by this explanatory function. Before considering the explanatory function of an AI chatbot, we should examine this assumption of purpose. In this research we consider two chatbot cases: the first, where the purpose may not be to inform the user, and the second, where informing the user should be the purpose. In the commercial sphere we identify two perspectives on AI chatbot purpose: that of the provider and that of the user. No necessary commonality exists between these two perspectives. In the government services sphere, methods of increasing the alignment between requested information and appropriate response include “law as code” as a mechanism for simplifying the automation of regulation.
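The “law as code” mechanism referred to in the abstract can be pictured as regulation expressed directly as executable rules that a government-service chatbot evaluates, rather than paraphrases. The following is a minimal sketch under that reading; the eligibility rule, thresholds, field names, and the `RebateRule` class are illustrative assumptions and do not come from the paper.

```python
from dataclasses import dataclass


@dataclass
class Applicant:
    """Facts the chatbot has collected from the user (illustrative fields only)."""
    annual_income: float
    household_size: int


@dataclass
class RebateRule:
    """A hypothetical regulation clause encoded as data plus a test.

    In a "law as code" approach each clause carries its own citation,
    so the chatbot's response can point back to the authoritative text.
    """
    citation: str
    income_ceiling: float
    min_household_size: int

    def applies_to(self, a: Applicant) -> bool:
        # The eligibility test mirrors the clause, one condition per criterion.
        return (a.annual_income <= self.income_ceiling
                and a.household_size >= self.min_household_size)


def explain(rule: RebateRule, a: Applicant) -> str:
    """Produce the kind of response a chatbot could return: a decision plus its source."""
    decision = "eligible" if rule.applies_to(a) else "not eligible"
    return f"You appear {decision} under {rule.citation}."


if __name__ == "__main__":
    # Example interaction: the chatbot evaluates the encoded clause directly,
    # so the response stays aligned with the regulation rather than a paraphrase of it.
    rule = RebateRule(citation="Energy Rebate Regulation s 4(1) (hypothetical)",
                      income_ceiling=60_000, min_household_size=2)
    print(explain(rule, Applicant(annual_income=45_000, household_size=3)))
```

Because each encoded clause keeps its citation alongside its test, this style of encoding supports the alignment the authors describe: the requested information and the appropriate response are both derived from the same machine-readable statement of the regulation.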