Sean Lauderdale, Sarah A Griffin, Kelli R Lahman, Eno Mbaba, Shealyn Tomlinson
Personality and Mental Health, vol. 19, no. 2, e70018. Published 2025-05-01. DOI: 10.1002/pmh.70018. Journal Article; JCR Q3 (Psychiatry).
Unveiling Public Stigma for Borderline Personality Disorder: A Comparative Study of Artificial Intelligence and Mental Health Care Providers.
Generative artificial intelligence (GAI) programs can identify symptoms of mental disorders, including borderline personality disorder (BPD), and recommend treatments. Despite GAI's potential as a clinical tool, stereotypes are embedded in their algorithms and are not obvious until directly assessed. Given this concern, we assessed and compared the symptom recognition and public stigma of three GAIs (ChatGPT-3.5, ChatGPT-4, and Google Gemini) toward a woman and a man vignette character with BPD. The GAIs' responses were also compared to those of a sample of mental health care practitioners (MHCPs; n = 218). Compared to the MHCPs, the GAIs showed more empathy for the characters. The GAIs were also less likely to view the characters' mental health symptoms as developmental-stage problems and rated these symptoms as more chronic and unchangeable. The GAIs also rated the vignette characters as less trustworthy and more likely to have difficulty forming close relationships than the MHCPs did. Gender biases were found across the GAIs, with Google Gemini showing less empathy, more negative reactions, and greater public stigma, particularly toward a woman with BPD, than either ChatGPT-3.5 or ChatGPT-4. All GAIs also rated a woman with BPD as having more chronic mental health problems than a man. Overall, these results suggest that GAIs may express empathy but reflect gender bias and stereotyped beliefs about people with BPD. Greater transparency, together with the incorporation of knowledgeable MHCPs and people with lived experience into GAI training, is needed to reduce bias and improve accuracy before such tools are used in mental health applications.
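The comparison design described above (the same BPD vignette varied only by character gender, scored on stigma items by each rater) can be sketched in code. This is a minimal, hypothetical illustration: the rating values, item scale, and the `gender_gap` helper are invented for the example and do not reproduce the study's actual data or instruments.

```python
# Hypothetical sketch of the vignette-comparison design: each rater (a GAI
# model or a clinician sample) scores the same vignette, varied only by the
# character's gender, on public-stigma items; mean ratings are then compared.
from statistics import mean

# Illustrative stigma ratings (1 = low stigma, 7 = high); values are invented.
ratings = {
    ("gemini", "woman"): [5, 6, 5],
    ("gemini", "man"): [4, 4, 5],
    ("chatgpt-4", "woman"): [3, 4, 3],
    ("chatgpt-4", "man"): [3, 3, 4],
}

def mean_stigma(rater: str, gender: str) -> float:
    """Mean public-stigma rating for one rater/vignette-gender cell."""
    return mean(ratings[(rater, gender)])

def gender_gap(rater: str) -> float:
    """Positive value = more stigma toward the woman vignette."""
    return mean_stigma(rater, "woman") - mean_stigma(rater, "man")

for rater in ("gemini", "chatgpt-4"):
    print(rater, round(gender_gap(rater), 2))
```

In the actual study, the cell means would come from coding each model's free-text responses to the vignettes rather than from fixed numbers, but the gap comparison across raters is the same.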
Journal introduction:
Personality and Mental Health: Multidisciplinary Studies from Personality Dysfunction to Criminal Behaviour aims to lead and shape the international field in this rapidly expanding area, uniting three distinct literatures: DSM-IV/ICD-10 defined personality disorders, psychopathy and offending behaviour. Through its multi-disciplinary and service orientated approach, Personality and Mental Health provides a peer-reviewed, authoritative resource for researchers, practitioners and policy makers working in the areas of personality and mental health.