The Risks of Artificial Intelligence in Mental Health Care

IF 2.6 · CAS Medicine Q4 · JCR Q1 (Nursing)
Ceylon Dell
{"title":"The Risks of Artificial Intelligence in Mental Health Care","authors":"Ceylon Dell","doi":"10.1111/jpm.13119","DOIUrl":null,"url":null,"abstract":"<p>Artificial intelligence (AI) has increasingly integrated into various aspects of healthcare, including diagnostics, care planning and patient management. In psychiatric healthcare specifically, conversational AI is seen as a potential solution to support psychiatric nurses in the assessment of psychiatric illnesses (Rebelo, Verboom, and Santos <span>2023</span>). However, it is important that the oversight of a psychiatric nurse is needed when integrating conversational AI into psychiatric care. This editorial explores the risk of using conversational AI for psychiatric assessments and emphasises the need for human intervention in AI-driven processes.</p><p>In community mental health settings, psychiatric nurses play a crucial role in managing a wide range of conditions. The increasing demand for care, driven by factors such as heightened public awareness and reduction of stigmatisation for psychiatric illnesses, has led to longer waiting times for services (British Medical Association <span>2024</span>). Conversational AI systems have been proposed to alleviate this pressure by streamlining the psychiatric assessment process (Rollwage et al. <span>2024</span>). One of the key benefits of AI is its potential to triage and prioritise those with the most urgent needs, thereby ensuring that critical cases are addressed promptly (Lu et al. <span>2023</span>).</p><p>Psychiatric assessments require a deep understanding of the patient's presentation, psychiatric symptoms and the context of patient behaviour. Conversational AI systems are based on language learning models (LLM) and analyse data they have been trained on previously, which is mainly in written format and often derived from patients' electronic health records initially, advises Yang et al. (<span>2022</span>). Electronic health records contain all patient notes relevant to the episode of care the patient is receiving. It can include quantitative data sets such as diagnoses, charts, patient demographics and patient-led questionnaires based on symptomology or more nuanced qualitative data sets, such as psychiatric nursing observations of patient behaviour and caregiver views (Goldstein et al. <span>2022</span>). These data points help to construct a picture of a patient's presentation as part of an assessment before a patient is spoken to. Indeed, the psychiatric nurse will often review referrals, previous history and information to gain some knowledge of the patient before assessing the needs for that episode of care. However, an AI could face significant challenges in interpreting patient information with the depth and nuance that a psychiatric nurse could, suggests Elyoseph, Levkovich, and Shinan-Altman (<span>2024</span>), potentially resulting in inaccuracies in assessment, which can lead to inaccurate severity assessment and poor treatment outcomes.</p><p>The notion of bias in AI systems is a significant issue in psychiatric care, particularly because these biases often originate from the clinicians themselves Meidert et al. (<span>2023</span>). 
As psychiatric nurses and other clinicians input information into electronic medical records, their own perceptions and potential biases towards the patient's culture and language, for example, can influence the information recorded, thus affecting training data for AI systems which are required to analyse a patient's psychiatric symptoms. In this way, the human bias impacts the AI bias. Particularly in mental health settings, where patients experience psychiatric symptoms in different ways, it is important to have accurate psychiatric assessment. For example, a conversational AI developed predominantly with patient data from Western healthcare settings could fail to interpret an observation from a psychiatric nurse and written expressions of psychiatric distress from the patient themselves in non-Western cultures (Teye et al. <span>2022</span>).</p><p>Psychiatric assessments involve more than recording symptoms and patient history. They require a range of skills such as empathetic listening, observation and effective communication, which are crucial for fostering trust and understanding. These skills help establish a connection that is vital for effective treatment (Stiger-Outshoven et al. <span>2024</span>). However, without the psychiatric nurse skill set, AI technologies may reduce these complex interactions to data points, failing to capture the patient's unique psychiatric experiences. This reductionist approach by AI may also overlook emotional, behavioural and psychological signals, often conveyed through less overt means such as changes in mood, tone of voice or pauses in conversation. These nonverbal cues are normally observed in face-to-face interactions. One limitation of AI is that it is currently reliant on textual data. Therefore, observations are not documented in the electronic medical record; it might not understand, misanalyse or simply overlook observed nonverbal communications from the psychiatric patient. This could potentially lead to inadequate care, highlighting the essential role of psychiatric nurse oversight in psychiatric assessments, whose keen observations, knowledge and understanding of these subtle cues are critical to delivering effective psychiatric care. As AI continues to support all aspects of healthcare, its application in psychiatric assessments presents opportunities and challenges. While conversational AI systems offer promising solutions, it is the specialism of psychiatric nurses that enables the interpretation of complex human behaviours and the establishment of therapeutic relationships—this cannot yet be fully replicated by AI technology but can be embraced by it, suggests Nashwan et al. (<span>2023</span>).</p><p>To truly benefit from conversational AI in psychiatric settings, AI systems should be designed and trained to enhance the efficiency and accuracy of psychiatric assessments with the guidance and expertise of skilled psychiatric nurses. 
With this method, AI systems can complement the experience of the psychiatric nurse and enhance the patient experience in psychiatric care.</p><p>The author has nothing to report.</p><p>The author declares no conflicts of interest.</p>","PeriodicalId":50076,"journal":{"name":"Journal of Psychiatric and Mental Health Nursing","volume":"32 2","pages":"382-383"},"PeriodicalIF":2.6000,"publicationDate":"2024-09-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/jpm.13119","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Psychiatric and Mental Health Nursing","FirstCategoryId":"3","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/jpm.13119","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"NURSING","Score":null,"Total":0}
引用次数: 0

Abstract

Artificial intelligence (AI) has been increasingly integrated into various aspects of healthcare, including diagnostics, care planning and patient management. In psychiatric healthcare specifically, conversational AI is seen as a potential solution to support psychiatric nurses in the assessment of psychiatric illnesses (Rebelo, Verboom, and Santos 2023). However, the oversight of a psychiatric nurse remains essential when integrating conversational AI into psychiatric care. This editorial explores the risks of using conversational AI for psychiatric assessments and emphasises the need for human intervention in AI-driven processes.

In community mental health settings, psychiatric nurses play a crucial role in managing a wide range of conditions. Increasing demand for care, driven by factors such as heightened public awareness and reduced stigmatisation of psychiatric illnesses, has led to longer waiting times for services (British Medical Association 2024). Conversational AI systems have been proposed to alleviate this pressure by streamlining the psychiatric assessment process (Rollwage et al. 2024). One of the key benefits of AI is its potential to triage and prioritise those with the most urgent needs, thereby ensuring that critical cases are addressed promptly (Lu et al. 2023).
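
To make the triage idea concrete, the sketch below shows, in Python, one way a waiting list might be ordered by blending an AI-estimated risk score with time already waited. It is a minimal illustration under stated assumptions: the Referral fields, the 0.8/0.2 weights and the 90-day cap are hypothetical, not drawn from any of the systems cited above, and any ordering it produced would still need review by a psychiatric nurse.

```python
from dataclasses import dataclass

@dataclass
class Referral:
    """A hypothetical referral record; all fields are illustrative."""
    patient_id: str
    risk_score: float   # 0.0-1.0, as estimated by a conversational AI
    days_waiting: int

def prioritise(referrals: list[Referral]) -> list[Referral]:
    """Order the waiting list so the most urgent cases are seen first.

    Urgency here is a simple weighted blend of AI-estimated risk and
    time already spent waiting; a real service would tune or replace
    this heuristic, and a psychiatric nurse would review the result.
    """
    def urgency(r: Referral) -> float:
        return 0.8 * r.risk_score + 0.2 * min(r.days_waiting / 90, 1.0)
    return sorted(referrals, key=urgency, reverse=True)

queue = prioritise([
    Referral("A", risk_score=0.35, days_waiting=60),
    Referral("B", risk_score=0.90, days_waiting=5),
])
print([r.patient_id for r in queue])  # B first: highest estimated risk
```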

Psychiatric assessments require a deep understanding of the patient's presentation, psychiatric symptoms and the context of patient behaviour. Conversational AI systems are built on large language models (LLMs), which analyse patterns in the data they were trained on; this data is mainly written text and is often derived initially from patients' electronic health records (Yang et al. 2022). Electronic health records contain all patient notes relevant to the episode of care the patient is receiving. They can include quantitative data such as diagnoses, charts, patient demographics and patient-led questionnaires based on symptomology, or more nuanced qualitative data, such as psychiatric nursing observations of patient behaviour and caregiver views (Goldstein et al. 2022). These data points help to construct a picture of a patient's presentation as part of an assessment before the patient is spoken to. Indeed, the psychiatric nurse will often review referrals, previous history and other information to gain some knowledge of the patient before assessing their needs for that episode of care. However, an AI could face significant challenges in interpreting patient information with the depth and nuance that a psychiatric nurse can, suggest Elyoseph, Levkovich, and Shinan-Altman (2024), potentially resulting in inaccurate severity assessments and poor treatment outcomes.
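
As a purely illustrative sketch of this mix of quantitative and qualitative data, the record below separates structured fields from free-text ones. The field names, the PHQ-9 example and the ICD-10 code are assumptions for illustration, not the schema of any real EHR system; the point is that the nuance an AI must interpret lives largely in the free-text fields.

```python
from dataclasses import dataclass

@dataclass
class EHREntry:
    """Hypothetical sketch of the mixed data an LLM-based
    assessment tool might ingest from an electronic health record."""
    # Quantitative, structured fields
    diagnoses: list[str]        # e.g. ICD-10 codes
    phq9_score: int | None      # patient-led symptom questionnaire
    demographics: dict[str, str]
    # Qualitative, free-text fields -- where the nuance lives
    nursing_observations: str   # e.g. affect, behaviour, engagement
    caregiver_notes: str

entry = EHREntry(
    diagnoses=["F32.1"],
    phq9_score=14,
    demographics={"age_band": "25-34"},
    nursing_observations="Flat affect; long pauses; avoids eye contact.",
    caregiver_notes="Sleeping poorly for three weeks.",
)
```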

Bias in AI systems is a significant issue in psychiatric care, particularly because these biases often originate from the clinicians themselves (Meidert et al. 2023). As psychiatric nurses and other clinicians input information into electronic medical records, their own perceptions and potential biases towards, for example, the patient's culture and language can influence the information recorded, thus affecting the training data for AI systems that are required to analyse a patient's psychiatric symptoms. In this way, human bias becomes AI bias. Accurate psychiatric assessment is particularly important in mental health settings, where patients experience psychiatric symptoms in different ways. For example, a conversational AI developed predominantly with patient data from Western healthcare settings could misinterpret a psychiatric nurse's observations, or a patient's own written expressions of psychiatric distress, when they come from non-Western cultures (Teye et al. 2022).
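
One modest safeguard is to audit assessment accuracy per patient group, so that disparities of the kind described above at least become visible. The sketch below assumes hypothetical record fields (group, predicted_severity, confirmed_severity) and computes a crude per-group error rate; it is an illustration of the auditing idea, not a validated fairness metric.

```python
from collections import defaultdict

def error_rate_by_group(records: list[dict]) -> dict[str, float]:
    """Compare AI assessment error rates across patient groups.

    Each record is assumed to hold an AI-predicted severity, the
    severity a clinician later confirmed, and a group label. A large
    gap between groups is one crude signal of training-data bias.
    """
    totals: dict[str, list[int]] = defaultdict(lambda: [0, 0])
    for r in records:
        wrong = int(r["predicted_severity"] != r["confirmed_severity"])
        totals[r["group"]][0] += wrong   # count of errors
        totals[r["group"]][1] += 1       # count of cases
    return {g: errs / n for g, (errs, n) in totals.items()}

rates = error_rate_by_group([
    {"group": "western", "predicted_severity": "high",
     "confirmed_severity": "high"},
    {"group": "non_western", "predicted_severity": "low",
     "confirmed_severity": "high"},
])
print(rates)  # {'western': 0.0, 'non_western': 1.0}
```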

Psychiatric assessments involve more than recording symptoms and patient history. They require a range of skills such as empathetic listening, observation and effective communication, which are crucial for fostering trust and understanding. These skills help establish a connection that is vital for effective treatment (Stiger-Outshoven et al. 2024). Without the psychiatric nurse's skill set, however, AI technologies may reduce these complex interactions to data points, failing to capture the patient's unique psychiatric experiences. This reductionist approach may also overlook emotional, behavioural and psychological signals that are often conveyed through less overt means such as changes in mood, tone of voice or pauses in conversation. These nonverbal cues are normally observed in face-to-face interactions, yet AI is currently reliant on textual data. If such observations are not documented in the electronic medical record, the AI may misinterpret or simply overlook the patient's nonverbal communication. This could lead to inadequate care, highlighting the essential oversight role of the psychiatric nurse, whose keen observations, knowledge and understanding of these subtle cues are critical to delivering effective psychiatric care. As AI continues to support all aspects of healthcare, its application in psychiatric assessments presents both opportunities and challenges. While conversational AI systems offer promising solutions, it is the specialism of psychiatric nurses that enables the interpretation of complex human behaviours and the establishment of therapeutic relationships; this cannot yet be fully replicated by AI technology, but it can be embraced by it, suggest Nashwan et al. (2023).

To truly benefit from conversational AI in psychiatric settings, AI systems should be designed and trained to enhance the efficiency and accuracy of psychiatric assessments under the guidance and expertise of skilled psychiatric nurses. Designed in this way, AI systems can complement the experience of the psychiatric nurse and enhance the patient experience in psychiatric care.
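
One way to make that guidance structural, sketched below under assumed names and fields (DraftAssessment, sign_off), is to treat every AI output as a draft that cannot enter the care plan until a psychiatric nurse has reviewed, corrected and signed it off. This is an illustration of the human-in-the-loop principle argued for in this editorial, not an existing system's API.

```python
from dataclasses import dataclass

@dataclass
class DraftAssessment:
    """An AI-generated assessment that remains a draft until a
    psychiatric nurse reviews and signs it off."""
    patient_id: str
    ai_summary: str
    ai_severity: str
    nurse_approved: bool = False
    nurse_notes: str = ""

def sign_off(draft: DraftAssessment, nurse_notes: str,
             corrected_severity: str | None = None) -> DraftAssessment:
    """The nurse can accept the draft or override the AI's severity;
    nothing enters the care plan without this step."""
    if corrected_severity is not None:
        draft.ai_severity = corrected_severity
    draft.nurse_notes = nurse_notes
    draft.nurse_approved = True
    return draft
```

The design choice here is deliberate: the AI never writes directly to the record, so its efficiency gains are preserved while final clinical judgement stays with the nurse.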

The author has nothing to report.

The author declares no conflicts of interest.
