The need for epistemic humility in AI-assisted pain assessment.
Authors: Rachel A Katz, S Scott Graham, Daniel Z Buchman
Journal: Medicine Health Care and Philosophy (JCR Q1, Ethics; Impact Factor 2.3)
DOI: 10.1007/s11019-025-10264-9 (https://doi.org/10.1007/s11019-025-10264-9)
Published: 2025-03-15 (Journal Article)
Citations: 0
The need for epistemic humility in AI-assisted pain assessment.
It has historically been difficult for physicians, patients, and philosophers alike to quantify pain, given that pain is commonly understood as an individual and subjective experience. Measuring and diagnosing pain is often a fraught and complicated process. New developments in diagnostic technologies assisted by artificial intelligence promise more accurate and efficient diagnosis for patients, but these tools are known to reproduce and further entrench existing issues within the healthcare system, such as poor patient treatment and the replication of systemic biases. In this paper we argue that the potential implementation of these technologies in pain management settings raises several ethical-epistemic issues. We draw on the literature on self-trust and on epistemic and testimonial injustice to make these claims. We conclude by proposing that the adoption of epistemic humility on the part of both AI tool developers and clinicians can contribute to a climate of trust in and beyond the pain management context, and can lead to a more just approach to the implementation of AI in pain diagnosis and management.
Journal introduction:
Medicine, Health Care and Philosophy: A European Journal is the official journal of the European Society for Philosophy of Medicine and Health Care. It provides a forum for the international exchange of research data, theories, reports, and opinions in bioethics and the philosophy of medicine. The journal promotes interdisciplinary studies and stimulates philosophical analysis centered on a common object of reflection: health care, the human effort to deal with disease, illness, and death, as well as health, well-being, and life. Particular attention is paid to developing contributions from all European countries, and to making accessible scientific work and reports on the practice of health care ethics from all nations, cultures, and language areas in Europe.