Evaluating Artificial Intelligence and Traditional Learning Tools for Chest X-Ray Interpretation: A Descriptive Study

IF 1.2 Q4 MEDICINE, RESEARCH & EXPERIMENTAL
Clinical Teacher | Pub Date: 2025-07-12 | DOI: 10.1111/tct.70139
Gurtek Singh Samra, Vashisht Ramoutar, Kelley Chen, Muiz Chaudhry, Hrithika Patel, Terese Bird, Vanessa Rodwell
{"title":"评估人工智能和传统学习工具用于胸部x射线解释:一项描述性研究","authors":"Gurtek Singh Samra,&nbsp;Vashisht Ramoutar,&nbsp;Kelley Chen,&nbsp;Muiz Chaudhry,&nbsp;Hrithika Patel,&nbsp;Terese Bird,&nbsp;Vanessa Rodwell","doi":"10.1111/tct.70139","DOIUrl":null,"url":null,"abstract":"<div>\n \n \n <section>\n \n <h3> Background</h3>\n \n <p>Chest X-ray (CXR) interpretation is a fundamental yet challenging skill for medical students to master. Traditional resources like Radiopaedia offer extensive content, while newer artificial intelligence (AI) tools, such as Chester, provide pattern recognition and real-time feedback. This study aims to evaluate Radiopaedia and Chester's effectiveness as educational tools and to explore student perspectives on AI.</p>\n </section>\n \n <section>\n \n <h3> Approach</h3>\n \n <p>A teaching session on CXR interpretation fundamentals was delivered to establish a standardised baseline of knowledge among participants, followed by a live tutorial introducing students to the functionality of both Chester AI and Radiopaedia. Students engaged with both tools to answer a 25-item workbook assessing complex CXR pathologies. CXRs were deliberately selected for their complexity to examine student engagement with online learning tools amid diagnostic uncertainty, encouraging applied clinical reasoning.</p>\n </section>\n \n <section>\n \n <h3> Evaluation</h3>\n \n <p>Preclinical medical students were recruited and randomly assigned to the Chester AI (<i>n</i> = 5) or Radiopaedia group (<i>n</i> = 5). During the workbook task, participants were instructed to engage with the workbook using Radiopaedia and Chester AI. Post-session, participants took part in focus groups to share their experiences. Thematic analysis highlighted Chester's efficiency and potential as a revision tool but noted limitations with complex CXR pathologies. Radiopaedia was valued for its comprehensiveness but was less efficient for the workbook task due to its vast array of content.</p>\n </section>\n \n <section>\n \n <h3> Implications</h3>\n \n <p>AI tools such as Chester show promise as complementary resources alongside traditional learning materials. Combining Chester's efficiency and real-time feedback with Radiopaedia's in-depth content may optimise learning and improve CXR interpretation skills.</p>\n </section>\n </div>","PeriodicalId":47324,"journal":{"name":"Clinical Teacher","volume":"22 4","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2025-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/tct.70139","citationCount":"0","resultStr":"{\"title\":\"Evaluating Artificial Intelligence and Traditional Learning Tools for Chest X-Ray Interpretation: A Descriptive Study\",\"authors\":\"Gurtek Singh Samra,&nbsp;Vashisht Ramoutar,&nbsp;Kelley Chen,&nbsp;Muiz Chaudhry,&nbsp;Hrithika Patel,&nbsp;Terese Bird,&nbsp;Vanessa Rodwell\",\"doi\":\"10.1111/tct.70139\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div>\\n \\n \\n <section>\\n \\n <h3> Background</h3>\\n \\n <p>Chest X-ray (CXR) interpretation is a fundamental yet challenging skill for medical students to master. Traditional resources like Radiopaedia offer extensive content, while newer artificial intelligence (AI) tools, such as Chester, provide pattern recognition and real-time feedback. 
This study aims to evaluate Radiopaedia and Chester's effectiveness as educational tools and to explore student perspectives on AI.</p>\\n </section>\\n \\n <section>\\n \\n <h3> Approach</h3>\\n \\n <p>A teaching session on CXR interpretation fundamentals was delivered to establish a standardised baseline of knowledge among participants, followed by a live tutorial introducing students to the functionality of both Chester AI and Radiopaedia. Students engaged with both tools to answer a 25-item workbook assessing complex CXR pathologies. CXRs were deliberately selected for their complexity to examine student engagement with online learning tools amid diagnostic uncertainty, encouraging applied clinical reasoning.</p>\\n </section>\\n \\n <section>\\n \\n <h3> Evaluation</h3>\\n \\n <p>Preclinical medical students were recruited and randomly assigned to the Chester AI (<i>n</i> = 5) or Radiopaedia group (<i>n</i> = 5). During the workbook task, participants were instructed to engage with the workbook using Radiopaedia and Chester AI. Post-session, participants took part in focus groups to share their experiences. Thematic analysis highlighted Chester's efficiency and potential as a revision tool but noted limitations with complex CXR pathologies. Radiopaedia was valued for its comprehensiveness but was less efficient for the workbook task due to its vast array of content.</p>\\n </section>\\n \\n <section>\\n \\n <h3> Implications</h3>\\n \\n <p>AI tools such as Chester show promise as complementary resources alongside traditional learning materials. Combining Chester's efficiency and real-time feedback with Radiopaedia's in-depth content may optimise learning and improve CXR interpretation skills.</p>\\n </section>\\n </div>\",\"PeriodicalId\":47324,\"journal\":{\"name\":\"Clinical Teacher\",\"volume\":\"22 4\",\"pages\":\"\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2025-07-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://onlinelibrary.wiley.com/doi/epdf/10.1111/tct.70139\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Clinical Teacher\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://asmepublications.onlinelibrary.wiley.com/doi/10.1111/tct.70139\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"MEDICINE, RESEARCH & EXPERIMENTAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Clinical Teacher","FirstCategoryId":"1085","ListUrlMain":"https://asmepublications.onlinelibrary.wiley.com/doi/10.1111/tct.70139","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"MEDICINE, RESEARCH & EXPERIMENTAL","Score":null,"Total":0}
Citations: 0

Abstract



Background

Chest X-ray (CXR) interpretation is a fundamental yet challenging skill for medical students to master. Traditional resources like Radiopaedia offer extensive content, while newer artificial intelligence (AI) tools, such as Chester, provide pattern recognition and real-time feedback. This study aims to evaluate Radiopaedia and Chester's effectiveness as educational tools and to explore student perspectives on AI.

Approach

A teaching session on CXR interpretation fundamentals was delivered to establish a standardised baseline of knowledge among participants, followed by a live tutorial introducing students to the functionality of both Chester AI and Radiopaedia. Students engaged with both tools to answer a 25-item workbook assessing complex CXR pathologies. CXRs were deliberately selected for their complexity to examine student engagement with online learning tools amid diagnostic uncertainty, encouraging applied clinical reasoning.

Evaluation

Preclinical medical students were recruited and randomly assigned to the Chester AI (n = 5) or Radiopaedia group (n = 5). During the workbook task, participants were instructed to engage with the workbook using Radiopaedia and Chester AI. Post-session, participants took part in focus groups to share their experiences. Thematic analysis highlighted Chester's efficiency and potential as a revision tool but noted limitations with complex CXR pathologies. Radiopaedia was valued for its comprehensiveness but was less efficient for the workbook task due to its vast array of content.

Implications

AI tools such as Chester show promise as complementary resources alongside traditional learning materials. Combining Chester's efficiency and real-time feedback with Radiopaedia's in-depth content may optimise learning and improve CXR interpretation skills.

Source journal: Clinical Teacher (MEDICINE, RESEARCH & EXPERIMENTAL)
CiteScore: 2.90
Self-citation rate: 5.60%
Articles published: 113
Journal description: The Clinical Teacher has been designed with the active, practising clinician in mind. It aims to provide a digest of current research, practice and thinking in medical education, presented in a readable, stimulating and practical style. The journal includes sections for reviews of the literature relating to clinical teaching, bringing authoritative views on the latest thinking about modern teaching. There are also sections on specific teaching approaches, a digest of the latest research published in Medical Education and other teaching journals, reports of initiatives and advances in thinking and practical teaching from around the world, and expert community and discussion on challenging and controversial issues in today's clinical education.