Chatbot-assisted self-assessment (CASA): Co-designing an AI-powered behaviour change intervention for ethnic minorities.

PLOS Digital Health 4(2): e0000724. Published: 2025-02-13 (eCollection: 2025-02-01). DOI: 10.1371/journal.pdig.0000724
Tom Nadarzynski, Nicky Knights, Deborah Husbands, Cynthia Graham, Carrie D Llewellyn, Tom Buchanan, Ian Montgomery, Alejandra Soruco Rodriguez, Chimeremumma Ogueri, Nidhi Singh, Evan Rouse, Olabisi Oyebode, Ankit Das, Grace Paydon, Gurpreet Lall, Anathoth Bulukungu, Nur Yanyali, Alexandra Stefan, Damien Ridge
Cited by: 0

Abstract

Chatbot-assisted self-assessment (CASA): Co-designing an AI-powered behaviour change intervention for ethnic minorities.

Background: The digitalisation of healthcare has provided new ways to address disparities in sexual health outcomes that particularly affect ethnic and sexual minorities. Conversational artificial intelligence (AI) chatbots can provide personalised health education and refer users for appropriate medical consultations. We aimed to explore design principles of a chatbot-assisted culturally sensitive self-assessment intervention based on the disclosure of health-related information.

Methods: In 2022, an online survey was conducted among an ethnically diverse UK sample (N = 1,287) to identify the level and type of health-related information disclosure to sexual health chatbots, and reactions to chatbots' risk appraisal. Follow-up interviews (N = 41) further explored perceptions of chatbot-led health assessment to identify aspects related to acceptability and utilisation. Datasets were analysed using one-way ANOVAs, linear regression, and thematic analysis.
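The paper does not publish its analysis code, but the two quantitative methods it names are standard. A minimal sketch of a one-way ANOVA (comparing mean disclosure scores across groups) and a simple linear regression (disclosure on attitude score), implemented from first principles on hypothetical data, might look like:

```python
# Hypothetical illustration of the two quantitative analyses named in the
# Methods section. All data below are invented; the study's dataset is not public.

def one_way_anova(groups):
    """Return the F statistic for a one-way ANOVA over lists of scores."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: variation of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: variation of scores inside each group
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def linregress(xs, ys):
    """Return (slope, intercept) of the ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

if __name__ == "__main__":
    # Hypothetical disclosure scores for three attitude groups
    groups = [[3.1, 2.8, 3.4], [3.9, 4.2, 4.0], [4.8, 4.6, 5.0]]
    print(f"F = {one_way_anova(groups):.2f}")
    # Hypothetical attitude (x) vs. disclosure (y) pairs
    slope, intercept = linregress([1, 2, 3, 4, 5], [2.0, 2.9, 4.1, 4.8, 6.2])
    print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")
```

In practice the authors would have used a statistics package rather than hand-rolled formulas; the sketch only shows what the named tests compute.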

Results: Participants had neutral-to-positive attitudes towards chatbots and were comfortable disclosing demographic and sensitive health information. Chatbot awareness, previous experience and positive attitudes towards chatbots predicted information disclosure. Qualitatively, four main themes were identified: "Chatbot as an artificial health advisor", "Disclosing information to a chatbot", "Ways to facilitate trust and disclosure", and "Acting on self-assessment".

Conclusion: Chatbots were acceptable for health self-assessment among this sample of ethnically diverse individuals. Most users reported being comfortable disclosing sensitive and personal information, but user anonymity is key to engagement with chatbots. As this technology becomes more advanced and widely available, chatbots could potentially become supplementary tools for health education and screening eligibility assessment. Future research is needed to establish their impact on screening uptake and access to health services among minoritised communities.
