Student Perceptions of a Custom Artificial Intelligence Clinical Case Companion.

Q2 Health Professions
Andrew P Chastain, Chris Roman, Kevin M Bogenschutz
Journal of Physician Assistant Education
DOI: 10.1097/JPA.0000000000000697
Published: 2025-09-10 (Journal Article)
Citations: 0

Abstract


Introduction: Artificial intelligence tools show promise in supplementing traditional physician assistant education, particularly in developing clinical reasoning skills. However, limited research exists on custom Generative Pretrained Transformer (GPT) applications in physician assistant (PA) education. This study evaluated student experiences and perceptions of a custom GPT-based clinical reasoning tool.

Methods: A mixed-methods study was conducted with first-year PA students (n = 72) at Butler University in April 2025. Students engaged with a custom GPT-4-Turbo Butler University PA Clinical Case Companion, designed to deliver interactive clinical cases using Socratic dialogue across the hematology and nephrology specialties. Postengagement surveys assessed students' perceptions of the tool's helpfulness, cognitive challenge, and clinical skill development across 7 clinical domains. Surveys used Likert scales and open-ended questions. Researchers applied descriptive analysis to quantitative data and thematic analysis to qualitative responses.

Results: Fifty-nine students completed surveys (81.9% response rate). Ninety percent of respondents rated the tool as moderately to extremely helpful. Students reported significant improvement in developing differential diagnoses (79.7%), ordering diagnostic studies (81.4%), and interpreting tests (78.0%). Qualitative analysis revealed 3 primary themes: appreciation for immediate feedback (36%), detailed real-time explanations (36%), and GPT-initiated prompting of students (12%). Students suggested earlier curriculum integration and expressed concerns about the accuracy of case content.

Discussion: Custom GPT-based clinical reasoning tools can serve as an adjunct to traditional PA educational methods by offering personalized, on-demand learning experiences. Students perceived substantial benefits in developing clinical reasoning skills, but noted limitations in history-taking skills. Implementation should include faculty oversight and artificial intelligence literacy training to address accuracy concerns while maximizing educational benefits.

Source journal metrics: CiteScore 1.00; self-citation rate 0.00%; 109 articles published.