The Halo Effect: Perceptions of Information Privacy Among Healthcare Chatbot Users
Jessica R Ellis, Natalia S Dellavalle, Mika K Hamer, Marlee Akerson, Matt Andazola, Annie A Moore, Eric G Campbell, Matthew DeCamp
Journal of the American Geriatrics Society, published 2025-02-12. DOI: 10.1111/jgs.19393
Abstract
Background: Patient-facing chatbots can be used for administrative tasks, personalized care reminders, and overcoming transportation or geographic barriers in healthcare. Although some data suggest older adults see privacy as an ethical barrier to adopting digital technologies, little is known about privacy concerns regarding information shared with novel patient-facing chatbots. We sought to examine attitudes toward privacy based on age or other sociodemographic characteristics.
Methods: We conducted a sequential mixed methods study among patient users of a large healthcare system chatbot. We purposively oversampled by race and ethnicity to survey 3089 patient chatbot users online using de novo and validated items. Next, we conducted semi-structured interviews with users (n = 46) purposively sampled based on diversity or select survey responses. We used multivariable logistic regression to analyze survey data and modified grounded theory to analyze interviews. We integrated data using simultaneous visualization and triangulation.
Results: We received 617/3089 surveys (response rate, 20.0%). Overall, 370/597 (63.9%) expressed worry about the privacy of information shared with the chatbot. Logistic regression found that users aged ≥ 65 years were 26 percentage points less likely to be worried about information privacy than those aged 18-34 years (p < 0.001). We also found less worry among Black, non-Hispanic users and among those with more than a four-year college degree. By integrating our survey and interview data, we observed that older adult users experienced a halo effect: they worried less because they saw the chatbot as associated with a trusted health system and experienced lower medical mistrust.
Conclusion: Contrary to some prior research, adults aged 65 and older expressed less concern about chatbot privacy than younger adults because of their trust in health care. To maintain this trust and build it among all users, health systems using patient-facing chatbots need to take active steps to maintain and communicate patient privacy protections.
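The "26 percentage points" in the Results is an absolute difference in the probability of expressing worry between age groups, which is easy to confuse with a relative measure such as an odds ratio from the logistic regression. A minimal sketch of the distinction, using hypothetical group counts (not the study's actual group-level data, which the abstract does not report):

```python
# Hypothetical counts for illustration only -- not taken from the study.
worried_young, n_young = 72, 100   # users aged 18-34
worried_older, n_older = 46, 100   # users aged >= 65

p_young = worried_young / n_young  # proportion worried, younger group
p_older = worried_older / n_older  # proportion worried, older group

# Absolute difference, in percentage points (how the abstract reports it)
pp_diff = (p_young - p_older) * 100

# A relative measure that is easy to confuse with it
def odds(p):
    return p / (1 - p)

odds_ratio = odds(p_older) / odds(p_young)  # older vs. younger

print(f"absolute difference: {pp_diff:.1f} percentage points")
print(f"odds ratio (older vs. younger): {odds_ratio:.2f}")
```

With these made-up counts the absolute gap is 26.0 percentage points while the odds ratio is about 0.33; the two numbers answer different questions, which is why reporting the result in percentage points (as the authors do) is the more directly interpretable choice.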