Assessing the impact of information on patient attitudes toward artificial intelligence-based clinical decision support (AI/CDS): a pilot web-based SMART vignette study.
{"title":"Assessing the impact of information on patient attitudes toward artificial intelligence-based clinical decision support (AI/CDS): a pilot web-based SMART vignette study.","authors":"Bohye Kim, Katie Ryan, Jane Paik Kim","doi":"10.1136/jme-2024-110080","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>It is increasingly recognised that the success of artificial intelligence-based clinical decision support (AI/CDS) tools will depend on physician and patient trust, but factors impacting patients' views on clinical care reliant on AI have been less explored.</p><p><strong>Objective: </strong>This pilot study explores whether, and in what contexts, detail of explanation provided about AI/CDS tools impacts patients' attitudes toward the tools and their clinical care.</p><p><strong>Methods: </strong>We designed a Sequential Multiple Assignment Randomized Trial vignette web-based survey. Participants recruited through Amazon Mechanical Turk were presented with hypothetical vignettes describing health concerns and were sequentially randomised along three factors: (1) the level of detail of explanation regarding an AI/CDS tool; (2) the AI/CDS result; and (3) the physician's level of agreement with the AI/CDS result. We compared mean ratings of comfort and confidence by the level of detail of explanation using t-tests. Regression models were fit to confirm conditional effects of detail of explanation.</p><p><strong>Results: </strong>The detail of explanation provided regarding the AI/CDS tools was positively related to respondents' comfort and confidence in the usage of the tools and their perception of the physician's final decision. The effects of detail of explanation on their perception of the physician's final decision were different given the AI/CDS result and the physician's agreement or disagreement with the result.</p><p><strong>Conclusions: </strong>More information provided by physicians regarding the use of AI/CDS tools may improve patient attitudes toward healthcare involving AI/CDS tools in general and in certain contexts of the AI/CDS result and physician agreement.</p>","PeriodicalId":16317,"journal":{"name":"Journal of Medical Ethics","volume":" ","pages":""},"PeriodicalIF":3.3000,"publicationDate":"2024-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Medical Ethics","FirstCategoryId":"98","ListUrlMain":"https://doi.org/10.1136/jme-2024-110080","RegionNum":2,"RegionCategory":"哲学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ETHICS","Score":null,"Total":0}
Abstract
Background: It is increasingly recognised that the success of artificial intelligence-based clinical decision support (AI/CDS) tools will depend on physician and patient trust, but factors impacting patients' views on clinical care reliant on AI have been less explored.
Objective: This pilot study explores whether, and in what contexts, detail of explanation provided about AI/CDS tools impacts patients' attitudes toward the tools and their clinical care.
Methods: We designed a web-based Sequential Multiple Assignment Randomized Trial (SMART) vignette survey. Participants recruited through Amazon Mechanical Turk were presented with hypothetical vignettes describing health concerns and were sequentially randomised along three factors: (1) the level of detail of explanation regarding an AI/CDS tool; (2) the AI/CDS result; and (3) the physician's level of agreement with the AI/CDS result. We compared mean ratings of comfort and confidence by level of detail of explanation using t-tests, and fitted regression models to confirm conditional effects of the detail of explanation.
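The analysis described above (t-tests comparing mean ratings by level of detail of explanation, plus regression models with interaction terms to probe conditional effects) could be sketched in Python roughly as follows. This is a minimal illustration on simulated data; the variable names, binary factor codings, sample size, rating scale and effect size are assumptions for illustration only and do not reproduce the study's data or code.

import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400  # hypothetical number of survey respondents

# Hypothetical binary codings of the three randomised factors
df = pd.DataFrame({
    "detail": rng.integers(0, 2, n),     # 0 = brief, 1 = detailed explanation
    "ai_result": rng.integers(0, 2, n),  # 0 = reassuring, 1 = concerning AI/CDS result
    "agreement": rng.integers(0, 2, n),  # 0 = physician disagrees, 1 = physician agrees
})
# Simulated 1-7 comfort rating with a small assumed positive effect of detail
df["comfort"] = np.clip(4 + 0.5 * df["detail"] + rng.normal(0, 1.5, n), 1, 7).round()

# Welch's t-test comparing mean comfort by level of detail of explanation
brief = df.loc[df["detail"] == 0, "comfort"]
detailed = df.loc[df["detail"] == 1, "comfort"]
t_stat, p_val = stats.ttest_ind(detailed, brief, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")

# Regression with interaction terms to examine whether the effect of detail
# is conditional on the AI/CDS result and on physician agreement
model = smf.ols("comfort ~ detail * ai_result * agreement", data=df).fit()
print(model.summary())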
Results: The detail of explanation provided regarding the AI/CDS tools was positively related to respondents' comfort and confidence in the use of the tools and to their perception of the physician's final decision. The effect of detail of explanation on perception of the physician's final decision varied depending on the AI/CDS result and on whether the physician agreed or disagreed with it.
Conclusions: More information provided by physicians regarding the use of AI/CDS tools may improve patient attitudes toward healthcare involving AI/CDS tools, both in general and in particular contexts defined by the AI/CDS result and the physician's agreement with it.
About the journal:
Journal of Medical Ethics is a leading international journal that reflects the whole field of medical ethics. The journal seeks to promote ethical reflection and conduct in scientific research and medical practice. It features articles on various ethical aspects of health care relevant to health care professionals, members of clinical ethics committees, medical ethics professionals, researchers and bioscientists, policy makers and patients.
Subscribers to the Journal of Medical Ethics also receive Medical Humanities journal at no extra cost.
JME is the official journal of the Institute of Medical Ethics.