New Frontiers in Artificial Intelligence: A Multimodal Communication Model
Ian Kwok MD, Kimberly Murdaugh MD MS, Daniel Shalev MD, Walter Boot PhD, Cary Reid MD PhD, Ronald Adelman MD
Journal of Pain and Symptom Management, Volume 69, Issue 5, Pages e454-e455. Published 2025-04-10. doi: 10.1016/j.jpainsymman.2025.02.070
Outcomes
1. Participants will be able to evaluate a new, patient-facing application of artificial intelligence to improve information-sharing in family meetings.
2. Participants will be able to apply a new model of multimodality to advance communication science in palliative care research.
Key Message
Verbal communication can be augmented by a multimodal approach. In the first use of artificial intelligence for patient-facing communication, we developed a prototype capable of generating real-time transcriptions of family meetings. Applying user-centered design, artificial intelligence can complement and advance communication science in palliative care settings.
Abstract
Patients and families face complex communication challenges in the setting of serious illness. These difficulties are compounded for individuals who are non-English-speaking, have low health literacy, have visual/auditory impairments, or possess unique learning styles. Improving accessibility for these patients and caregivers is paramount.
Objectives
We hypothesize that augmenting standard verbal communication modalities with a multimodal communication approach (integrating written, visual, and other modalities) can improve medical information-sharing.
Methods
In the first use of artificial intelligence for patient-facing communication, we developed a prototype capable of generating real-time transcribed reports of family meetings for patients, their caregivers, and other clinicians. The prototype uses an iPhone for audio recording, then applies machine learning, natural language processing, and speaker diarization modules to create a written transcript that accurately integrates medical terminology, translates between multiple languages, and distinguishes among many different speakers. This transcript can then be physically shared and digitally uploaded to the electronic medical record. To inform the development of this invention, we implemented a user-centered design process based on patient, caregiver, and clinician interviews (in progress). We also conducted an interdisciplinary literature review exploring the use of multimodality for communication across medical, educational, and social science research.
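To make the pipeline described above concrete, the sketch below shows one way the speaker-diarization step can be joined to a transcript: each transcribed segment is assigned to the speaker turn with which it overlaps most in time. This is an illustrative assumption, not the authors' implementation; the `Segment` and `SpeakerTurn` types and the overlap heuristic are hypothetical stand-ins for the outputs of real speech-recognition and diarization modules.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """A transcribed span of speech with start/end times in seconds (hypothetical)."""
    start: float
    end: float
    text: str

@dataclass
class SpeakerTurn:
    """A diarization result: who was speaking during a time window (hypothetical)."""
    start: float
    end: float
    speaker: str

def label_segments(segments, turns):
    """Attribute each transcript segment to the speaker turn with maximal time overlap."""
    labeled = []
    for seg in segments:
        best_speaker, best_overlap = "unknown", 0.0
        for turn in turns:
            # Overlap of the two time intervals; negative means no overlap.
            overlap = min(seg.end, turn.end) - max(seg.start, turn.start)
            if overlap > best_overlap:
                best_speaker, best_overlap = turn.speaker, overlap
        labeled.append((best_speaker, seg.text))
    return labeled

# Example: a two-turn exchange from a simulated family meeting.
segments = [Segment(0.0, 2.0, "How is she doing today?"),
            Segment(2.2, 5.0, "Her breathing is more comfortable.")]
turns = [SpeakerTurn(0.0, 2.1, "Caregiver"),
         SpeakerTurn(2.1, 5.0, "Clinician")]
transcript = label_segments(segments, turns)
```

In a production system the segment and turn boundaries would come from the speech-recognition and diarization models themselves, and a smarter alignment (e.g., word-level timestamps) would handle overlapping speech.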
Results
Our results will be presented in a multimodal format, including audiovisual and written components. Using a professionally filmed video, we will perform a full demonstration of the prototype (depicting a simulated family meeting) to illustrate usage recommendations and to demystify its technological components. We will also elucidate our proposed model of multimodal communication, which offers insight into future opportunities and analyses.
Conclusion
The use of artificial intelligence to actualize multimodality in family meetings opens the door to new possibilities for advancing communication science in palliative care research.
References
1. Sarmet M, Kabani A, Coelho L, et al. The use of natural language processing in palliative care research: A scoping review. Palliat Med. 2023 Feb;37(2):275-290. doi: 10.1177/02692163221141969. Epub 2022 Dec 10. PMID: 36495082.
2. Tarbi EC, Blanch-Hartigan D, van Vliet LM, et al. Toward a basic science of communication in serious illness. Patient Educ Couns. 2022 Jul;105(7):1963-1969. doi: 10.1016/j.pec.2022.03.019. https://pubmed.ncbi.nlm.nih.gov/35410737/
About the journal:
The Journal of Pain and Symptom Management is an internationally respected, peer-reviewed journal and serves an interdisciplinary audience of professionals by providing a forum for the publication of the latest clinical research and best practices related to the relief of illness burden among patients afflicted with serious or life-threatening illness.