Improved Activity Evaluations: An Iterative Process Using the Rasch Model.

IF 1.7 | Tier 4 (Education) | Q2 EDUCATION, SCIENTIFIC DISCIPLINES
Anthony Gage, Sarah A Nisly
{"title":"改进的活动评估:使用Rasch模型的迭代过程。","authors":"Anthony Gage, Sarah A Nisly","doi":"10.1097/CEH.0000000000000620","DOIUrl":null,"url":null,"abstract":"<p><strong>Introduction: </strong>Traditional evaluation models, often linear and outcome-focused, are increasingly inadequate for the complexities of modern medical education, which demands more comprehensive and nuanced assessment approaches.</p><p><strong>Methods: </strong>A standardized continuing professional development activity evaluation instrument was developed and implemented. An iterative process was performed, using a repeat Rasch analysis, to improve reliability of the evaluation instrument. Category Probability Curves and Test Information Function were generated by the Rasch analysis to refine the construction of the assessment. All educational activities completed between 2022 and 2024 were eligible for inclusion. The study incorporated a diverse range of educational activities and included multiple health care professions.</p><p><strong>Results: </strong>The pilot analysis included 250 educational activities with 26,554 individual learners completing evaluations for analysis. Initial Rasch findings demonstrated a need to remove redundancies and change from a five to four-point rating scale. The final instrument validation included 21 activities and 529 learners. Improvement was seen in reliability after modifications, with an increase in Cronbach alpha from 0.72 to 0.80.</p><p><strong>Discussion: </strong>Use of psychometrics to improve assessments can yield a more reliable and less redundant evaluation instrument. This research demonstrates a psychometrically informed, flexible evaluation tool that can inform future educational efforts and serve as a data driven metric to enhance the quality of continuing professional development programs.</p>","PeriodicalId":50218,"journal":{"name":"Journal of Continuing Education in the Health Professions","volume":" ","pages":""},"PeriodicalIF":1.7000,"publicationDate":"2025-10-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Improved Activity Evaluations: An Iterative Process Using the Rasch Model.\",\"authors\":\"Anthony Gage, Sarah A Nisly\",\"doi\":\"10.1097/CEH.0000000000000620\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Introduction: </strong>Traditional evaluation models, often linear and outcome-focused, are increasingly inadequate for the complexities of modern medical education, which demands more comprehensive and nuanced assessment approaches.</p><p><strong>Methods: </strong>A standardized continuing professional development activity evaluation instrument was developed and implemented. An iterative process was performed, using a repeat Rasch analysis, to improve reliability of the evaluation instrument. Category Probability Curves and Test Information Function were generated by the Rasch analysis to refine the construction of the assessment. All educational activities completed between 2022 and 2024 were eligible for inclusion. The study incorporated a diverse range of educational activities and included multiple health care professions.</p><p><strong>Results: </strong>The pilot analysis included 250 educational activities with 26,554 individual learners completing evaluations for analysis. Initial Rasch findings demonstrated a need to remove redundancies and change from a five to four-point rating scale. 
The final instrument validation included 21 activities and 529 learners. Improvement was seen in reliability after modifications, with an increase in Cronbach alpha from 0.72 to 0.80.</p><p><strong>Discussion: </strong>Use of psychometrics to improve assessments can yield a more reliable and less redundant evaluation instrument. This research demonstrates a psychometrically informed, flexible evaluation tool that can inform future educational efforts and serve as a data driven metric to enhance the quality of continuing professional development programs.</p>\",\"PeriodicalId\":50218,\"journal\":{\"name\":\"Journal of Continuing Education in the Health Professions\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":1.7000,\"publicationDate\":\"2025-10-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Continuing Education in the Health Professions\",\"FirstCategoryId\":\"95\",\"ListUrlMain\":\"https://doi.org/10.1097/CEH.0000000000000620\",\"RegionNum\":4,\"RegionCategory\":\"教育学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"EDUCATION, SCIENTIFIC DISCIPLINES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Continuing Education in the Health Professions","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1097/CEH.0000000000000620","RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"EDUCATION, SCIENTIFIC DISCIPLINES","Score":null,"Total":0}
Citations: 0

Abstract


Introduction: Traditional evaluation models, often linear and outcome-focused, are increasingly inadequate for the complexities of modern medical education, which demands more comprehensive and nuanced assessment approaches.

Methods: A standardized continuing professional development activity evaluation instrument was developed and implemented. An iterative process, using repeated Rasch analyses, was performed to improve the reliability of the evaluation instrument. Category probability curves and the test information function were generated from the Rasch analysis to refine the construction of the assessment. All educational activities completed between 2022 and 2024 were eligible for inclusion. The study incorporated a diverse range of educational activities and included multiple health care professions.
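The abstract does not include the analysis itself, but the two Rasch outputs it names are straightforward to reproduce. The sketch below assumes an Andrich rating scale model with made-up item difficulties and thresholds (the study presumably used dedicated Rasch software) and shows how category probability curves and a test information function can be computed.

```python
import numpy as np

def category_probabilities(theta, delta, tau):
    """Andrich rating scale model: P(X = k | theta) for categories k = 0..m.

    theta : person ability, scalar or 1-D array
    delta : item difficulty
    tau   : step thresholds tau_1..tau_m (tau_0 is fixed at 0)
    """
    theta = np.atleast_1d(theta).astype(float)
    steps = np.concatenate(([0.0], np.asarray(tau, dtype=float)))  # prepend tau_0 = 0
    # cumulative sum over steps gives the numerator exponent for each category
    kernel = np.cumsum(theta[:, None] - delta - steps[None, :], axis=1)
    kernel -= kernel.max(axis=1, keepdims=True)                    # numerical stability
    probs = np.exp(kernel)
    return probs / probs.sum(axis=1, keepdims=True)                # shape: (len(theta), m + 1)

def item_information(theta, delta, tau):
    """For Rasch-family models the item information equals Var(X | theta)."""
    p = category_probabilities(theta, delta, tau)
    k = np.arange(p.shape[1])
    expected = (p * k).sum(axis=1)
    return (p * k ** 2).sum(axis=1) - expected ** 2

# Hypothetical four-category (0-3) items, i.e. a four-point rating scale
thetas = np.linspace(-4, 4, 201)
items = [(-1.0, [-1.2, 0.1, 1.1]),   # (difficulty, thresholds) -- illustrative values only
         ( 0.0, [-1.0, 0.0, 1.0]),
         ( 0.8, [-0.9, 0.2, 1.3])]

curves = category_probabilities(thetas, *items[0])   # category probability curves for item 1
test_information = sum(item_information(thetas, d, t) for d, t in items)
print("Peak test information %.2f at theta = %.2f"
      % (test_information.max(), thetas[test_information.argmax()]))
```

Plotting `curves` against `thetas` reproduces the familiar category probability curve panels, and the summed item information gives the test information function used to judge where on the trait continuum the instrument measures most precisely.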

Results: The pilot analysis included 250 educational activities, with 26,554 individual learners completing evaluations for analysis. Initial Rasch findings demonstrated a need to remove redundant items and to change from a five-point to a four-point rating scale. The final instrument validation included 21 activities and 529 learners. Reliability improved after the modifications, with Cronbach's alpha increasing from 0.72 to 0.80.
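The reliability figures quoted above are Cronbach's alpha values, which follow directly from the learner-by-item rating matrix via the standard formula. A minimal sketch with simulated ratings (not the study's data) is shown below.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for a (learners x items) matrix:
    alpha = k / (k - 1) * (1 - sum of item variances / variance of total scores)."""
    ratings = np.asarray(ratings, dtype=float)
    n_items = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)
    total_var = ratings.sum(axis=1).var(ddof=1)
    return n_items / (n_items - 1) * (1 - item_vars.sum() / total_var)

# Simulated example: 500 learners rating 6 items on a four-point scale (1-4),
# driven by a shared latent trait so the items correlate -- illustrative only.
rng = np.random.default_rng(42)
trait = rng.normal(size=(500, 1))
raw = 2.5 + trait + rng.normal(scale=0.9, size=(500, 6))
ratings = np.clip(np.rint(raw), 1, 4)
print(f"Cronbach's alpha: {cronbach_alpha(ratings):.2f}")
```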

Discussion: Using psychometrics to improve assessments can yield a more reliable and less redundant evaluation instrument. This research demonstrates a psychometrically informed, flexible evaluation tool that can inform future educational efforts and serve as a data-driven metric to enhance the quality of continuing professional development programs.

Source journal: Journal of Continuing Education in the Health Professions
CiteScore: 3.00
Self-citation rate: 16.70%
Articles published: 85
Review time: >12 weeks
About the journal: The Journal of Continuing Education in the Health Professions is a quarterly journal publishing articles relevant to theory, practice, and policy development for continuing education in the health sciences. The journal presents original research and essays on subjects involving the lifelong learning of professionals, with a focus on continuous quality improvement, competency assessment, and knowledge translation. It provides thoughtful advice to those who develop, conduct, and evaluate continuing education programs.