{"title":"Trust in Artificial Intelligence in Wound Care: Perspectives of Healthcare Professionals and Patients in Germany.","authors":"Birgit Babitsch, Niels Hannemann, Ursula Hübner","doi":"10.3233/SHTI251514","DOIUrl":null,"url":null,"abstract":"<p><p>Artificial intelligence (AI) is increasingly integrated into healthcare, changing processes and structures, and thus the practice of healthcare professionals and potentially the role of patients and the healthcare professional-patient relationship. Beyond high-precision AI algorithms, knowledge of how to evaluate and use AI-based results in everyday healthcare is crucial for high-quality and safe care, and a prerequisite for trust. Therefore, this qualitative study aims to explore 1) the general perception of trust in AI used in healthcare and specifically in wound care, 2) the prerequisites for building trust in AI, and 3) the impact of AI on treatment and healthcare professional-patient relationship, all from the perspective of healthcare professionals and patients. Interviews were conducted in 2022/2023 with healthcare professionals specializing in wound care (N = 12) and in 2023 with patients with chronic wounds (N = 10). The interview guide included questions about digitalization in general and AI in particular, as well as trust and the healthcare professional-patient relationship. Our data revealed a limited understanding of AI principles and evaluation of AI-generated outcomes in both groups. Healthcare professionals recognized the potential of AI to provide data-driven suggestions for diagnosis and therapy, acting as a supportive \"second opinion\". Patients, on the contrary, expressed a preference for their physicians to incorporate AI-generated results into their care, thereby placing their trust in the physician's ability to apply them correctly. Neither group expected significant changes in the healthcare professional-patient relationship. Trust in AI was linked to general trust in digitalization, and healthcare professionals showed greater trust in AI results that were aligned with their existing expertise and were transparently explained. These findings suggest that AI can be a valuable tool for high-quality healthcare, but in-formed use requires meeting key prerequisites, including Explainable AI (XAI) principles and ongoing training.</p>","PeriodicalId":94357,"journal":{"name":"Studies in health technology and informatics","volume":"332 ","pages":"144-148"},"PeriodicalIF":0.0000,"publicationDate":"2025-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Studies in health technology and informatics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3233/SHTI251514","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Artificial intelligence (AI) is increasingly integrated into healthcare, changing processes and structures, and thus the practice of healthcare professionals and potentially the role of patients and the healthcare professional-patient relationship. Beyond high-precision AI algorithms, knowledge of how to evaluate and use AI-based results in everyday healthcare is crucial for high-quality and safe care, and a prerequisite for trust. Therefore, this qualitative study aims to explore 1) the general perception of trust in AI used in healthcare and specifically in wound care, 2) the prerequisites for building trust in AI, and 3) the impact of AI on treatment and on the healthcare professional-patient relationship, all from the perspective of healthcare professionals and patients. Interviews were conducted in 2022/2023 with healthcare professionals specializing in wound care (N = 12) and in 2023 with patients with chronic wounds (N = 10). The interview guide included questions about digitalization in general and AI in particular, as well as trust and the healthcare professional-patient relationship. Our data revealed a limited understanding of AI principles and of how to evaluate AI-generated outcomes in both groups. Healthcare professionals recognized the potential of AI to provide data-driven suggestions for diagnosis and therapy, acting as a supportive "second opinion". Patients, in contrast, expressed a preference for their physicians to incorporate AI-generated results into their care, thereby placing their trust in the physician's ability to apply them correctly. Neither group expected significant changes in the healthcare professional-patient relationship. Trust in AI was linked to general trust in digitalization, and healthcare professionals showed greater trust in AI results that aligned with their existing expertise and were transparently explained. These findings suggest that AI can be a valuable tool for high-quality healthcare, but informed use requires meeting key prerequisites, including Explainable AI (XAI) principles and ongoing training.