Why should I trust you? Influence of explanation design on consumer behavior in AI-based services
Florence Nizette, Wafa Hammedi, Allard C.R. van Riel, Nadia Steils
Journal of Service Management, published 2024-12-20. DOI: 10.1108/josm-05-2024-0223
Citations: 0
Abstract
Purpose
This study explores how the format of explanations used in artificial intelligence (AI)-based services affects consumer behavior, specifically the effects of explanation detail (low vs high) and consumer control (automatic vs on demand) on trust and acceptance. The aim is to provide service providers with insights into how to optimize the format of explanations to enhance consumer evaluations of AI-based services.
Design/methodology/approach
Drawing on the literature on explainable AI (XAI) and information overload theory, a conceptual model is developed. To empirically test the conceptual model, two between-subjects experiments were conducted wherein the level of detail and level of control were manipulated, taking AI-based recommendations as a use case. The data were analyzed via partial least squares (PLS) regressions.
Findings
The results reveal significant positive correlations between the level of detail and perceived understanding, and between the level of detail and perceived assurance. The level of control negatively moderates the relationship between the level of detail and perceived understanding. Further analyses revealed that the perceived competence and perceived integrity of AI systems positively and significantly influence the acceptance and purchase intentions of AI-based services.
Practical implications
This research offers service providers key insights into how tailored explanations and maintaining a balance between detail and control build consumer trust and enhance AI-based service outcomes.
Originality/value
This article elucidates the nuanced interplay between the level of detail and control over explanations for non-expert consumers in high-credence service sectors. The findings offer insights into the design of more consumer-centric explanations to increase the acceptance of AI-based services.
About the journal:
The Journal of Service Management (JOSM) focuses on research in service management. It publishes double-blind reviewed papers that make distinctive and noteworthy contributions to service literature and theory, along with their practical applications, and serves as a communication platform for the service management field across disciplines, functional areas, sectors, and nationalities.