Tom Nadarzynski, Nicky Knights, Deborah Husbands, Cynthia Graham, Carrie D Llewellyn, Tom Buchanan, Ian Montgomery, Alejandra Soruco Rodriguez, Chimeremumma Ogueri, Nidhi Singh, Evan Rouse, Olabisi Oyebode, Ankit Das, Grace Paydon, Gurpreet Lall, Anathoth Bulukungu, Nur Yanyali, Alexandra Stefan, Damien Ridge
{"title":"聊天机器人辅助自我评估(CASA):为少数民族共同设计人工智能驱动的行为改变干预。","authors":"Tom Nadarzynski, Nicky Knights, Deborah Husbands, Cynthia Graham, Carrie D Llewellyn, Tom Buchanan, Ian Montgomery, Alejandra Soruco Rodriguez, Chimeremumma Ogueri, Nidhi Singh, Evan Rouse, Olabisi Oyebode, Ankit Das, Grace Paydon, Gurpreet Lall, Anathoth Bulukungu, Nur Yanyali, Alexandra Stefan, Damien Ridge","doi":"10.1371/journal.pdig.0000724","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>The digitalisation of healthcare has provided new ways to address disparities in sexual health outcomes that particularly affect ethnic and sexual minorities. Conversational artificial intelligence (AI) chatbots can provide personalised health education and refer users for appropriate medical consultations. We aimed to explore design principles of a chatbot-assisted culturally sensitive self-assessment intervention based on the disclosure of health-related information.</p><p><strong>Methods: </strong>In 2022, an online survey was conducted among an ethnically diverse UK sample (N = 1,287) to identify the level and type of health-related information disclosure to sexual health chatbots, and reactions to chatbots' risk appraisal. Follow-up interviews (N = 41) further explored perceptions of chatbot-led health assessment to identify aspects related to acceptability and utilisation. Datasets were analysed using one-way ANOVAs, linear regression, and thematic analysis.</p><p><strong>Results: </strong>Participants had neutral-to-positive attitudes towards chatbots and were comfortable disclosing demographic and sensitive health information. Chatbot awareness, previous experience and positive attitudes towards chatbots predicted information disclosure. Qualitatively, four main themes were identified: \"Chatbot as an artificial health advisor\", \"Disclosing information to a chatbot\", \"Ways to facilitate trust and disclosure\", and \"Acting on self-assessment\".</p><p><strong>Conclusion: </strong>Chatbots were acceptable for health self-assessment among this sample of ethnically diverse individuals. Most users reported being comfortable disclosing sensitive and personal information, but user anonymity is key to engagement with chatbots. As this technology becomes more advanced and widely available, chatbots could potentially become supplementary tools for health education and screening eligibility assessment. 
Future research is needed to establish their impact on screening uptake and access to health services among minoritised communities.</p>","PeriodicalId":74465,"journal":{"name":"PLOS digital health","volume":"4 2","pages":"e0000724"},"PeriodicalIF":0.0000,"publicationDate":"2025-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11824973/pdf/","citationCount":"0","resultStr":"{\"title\":\"Chatbot -assisted self-assessment (CASA): Co-designing an AI -powered behaviour change intervention for ethnic minorities.\",\"authors\":\"Tom Nadarzynski, Nicky Knights, Deborah Husbands, Cynthia Graham, Carrie D Llewellyn, Tom Buchanan, Ian Montgomery, Alejandra Soruco Rodriguez, Chimeremumma Ogueri, Nidhi Singh, Evan Rouse, Olabisi Oyebode, Ankit Das, Grace Paydon, Gurpreet Lall, Anathoth Bulukungu, Nur Yanyali, Alexandra Stefan, Damien Ridge\",\"doi\":\"10.1371/journal.pdig.0000724\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Background: </strong>The digitalisation of healthcare has provided new ways to address disparities in sexual health outcomes that particularly affect ethnic and sexual minorities. Conversational artificial intelligence (AI) chatbots can provide personalised health education and refer users for appropriate medical consultations. We aimed to explore design principles of a chatbot-assisted culturally sensitive self-assessment intervention based on the disclosure of health-related information.</p><p><strong>Methods: </strong>In 2022, an online survey was conducted among an ethnically diverse UK sample (N = 1,287) to identify the level and type of health-related information disclosure to sexual health chatbots, and reactions to chatbots' risk appraisal. Follow-up interviews (N = 41) further explored perceptions of chatbot-led health assessment to identify aspects related to acceptability and utilisation. Datasets were analysed using one-way ANOVAs, linear regression, and thematic analysis.</p><p><strong>Results: </strong>Participants had neutral-to-positive attitudes towards chatbots and were comfortable disclosing demographic and sensitive health information. Chatbot awareness, previous experience and positive attitudes towards chatbots predicted information disclosure. Qualitatively, four main themes were identified: \\\"Chatbot as an artificial health advisor\\\", \\\"Disclosing information to a chatbot\\\", \\\"Ways to facilitate trust and disclosure\\\", and \\\"Acting on self-assessment\\\".</p><p><strong>Conclusion: </strong>Chatbots were acceptable for health self-assessment among this sample of ethnically diverse individuals. Most users reported being comfortable disclosing sensitive and personal information, but user anonymity is key to engagement with chatbots. As this technology becomes more advanced and widely available, chatbots could potentially become supplementary tools for health education and screening eligibility assessment. 
Future research is needed to establish their impact on screening uptake and access to health services among minoritised communities.</p>\",\"PeriodicalId\":74465,\"journal\":{\"name\":\"PLOS digital health\",\"volume\":\"4 2\",\"pages\":\"e0000724\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-02-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11824973/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"PLOS digital health\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1371/journal.pdig.0000724\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2025/2/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"PLOS digital health","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1371/journal.pdig.0000724","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/2/1 0:00:00","PubModel":"eCollection","JCR":"","JCRName":"","Score":null,"Total":0}
Chatbot-assisted self-assessment (CASA): Co-designing an AI-powered behaviour change intervention for ethnic minorities.
Background: The digitalisation of healthcare has provided new ways to address disparities in sexual health outcomes that particularly affect ethnic and sexual minorities. Conversational artificial intelligence (AI) chatbots can provide personalised health education and refer users for appropriate medical consultations. We aimed to explore the design principles of a chatbot-assisted, culturally sensitive self-assessment intervention based on the disclosure of health-related information.
Methods: In 2022, an online survey was conducted among an ethnically diverse UK sample (N = 1,287) to identify the level and type of health-related information participants would disclose to sexual health chatbots, and their reactions to chatbots' risk appraisals. Follow-up interviews (N = 41) further explored perceptions of chatbot-led health assessment to identify aspects related to acceptability and utilisation. Datasets were analysed using one-way ANOVAs, linear regression, and thematic analysis.
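For illustration, a minimal sketch of the quantitative analyses named above (a one-way ANOVA and a linear regression on a disclosure outcome). The file name and column names (disclosure_score, ethnic_group, chatbot_awareness, prior_use, attitude_score) are hypothetical, since the abstract does not specify the variable coding:

```python
# Hypothetical re-creation of the analysis pipeline; not the authors' actual code.
import pandas as pd
import scipy.stats as stats
import statsmodels.formula.api as smf

# Load survey responses (hypothetical file of N = 1,287 records).
df = pd.read_csv("casa_survey.csv")

# One-way ANOVA: does mean disclosure differ across ethnic groups?
groups = [g["disclosure_score"].dropna() for _, g in df.groupby("ethnic_group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

# Linear regression: awareness, prior experience, and attitudes as
# predictors of information disclosure, as reported in the Results.
model = smf.ols(
    "disclosure_score ~ chatbot_awareness + prior_use + attitude_score",
    data=df,
).fit()
print(model.summary())
```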
Results: Participants had neutral-to-positive attitudes towards chatbots and were comfortable disclosing demographic and sensitive health information. Chatbot awareness, previous experience with chatbots, and positive attitudes towards them predicted information disclosure. Qualitatively, four main themes were identified: "Chatbot as an artificial health advisor", "Disclosing information to a chatbot", "Ways to facilitate trust and disclosure", and "Acting on self-assessment".
Conclusion: Chatbots were acceptable for health self-assessment among this sample of ethnically diverse individuals. Most users reported being comfortable disclosing sensitive and personal information, but user anonymity is key to engagement with chatbots. As this technology becomes more advanced and widely available, chatbots could potentially become supplementary tools for health education and screening eligibility assessment. Future research is needed to establish their impact on screening uptake and access to health services among minoritised communities.