Authors: Anthony Gage, Sarah A Nisly
Journal: Journal of Continuing Education in the Health Professions
DOI: 10.1097/CEH.0000000000000620 (published 2025-10-14)
Improved Activity Evaluations: An Iterative Process Using the Rasch Model.
Introduction: Traditional evaluation models, often linear and outcome-focused, are increasingly inadequate for the complexities of modern medical education, which demands more comprehensive and nuanced assessment approaches.
Methods: A standardized continuing professional development activity evaluation instrument was developed and implemented. An iterative process using repeated Rasch analyses was performed to improve the reliability of the evaluation instrument. Category probability curves and the test information function generated by the Rasch analysis were used to refine the instrument's construction. All educational activities completed between 2022 and 2024 were eligible for inclusion. The study incorporated a diverse range of educational activities and included multiple health care professions.
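The category probability curves mentioned above come from a polytomous Rasch model fit to the rating data. As a hedged illustration (not the authors' code), the sketch below computes category probabilities under Andrich's rating scale model, a common Rasch formulation for Likert-type evaluation items; the difficulty and threshold values are purely illustrative.

```python
import numpy as np

def category_probabilities(theta, delta, taus):
    """Andrich rating-scale model: probability of each rating category
    for a respondent at ability level theta, given item difficulty delta
    and step thresholds taus (tau_1..tau_m). Illustrative values only."""
    # Cumulative sums of (theta - delta - tau_j); category 0 contributes 0.
    steps = np.concatenate(([0.0], np.cumsum(theta - delta - np.asarray(taus))))
    probs = np.exp(steps - steps.max())  # subtract max for numerical stability
    return probs / probs.sum()

# A four-category scale has three thresholds (hypothetical values).
p = category_probabilities(theta=0.5, delta=0.0, taus=[-1.0, 0.0, 1.0])
```

Plotting these probabilities across a range of theta values reproduces the category probability curves used to judge whether adjacent rating categories are distinguishable, which is the diagnostic that motivated collapsing the scale.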
Results: The pilot analysis included 250 educational activities, with 26,554 individual learners completing evaluations for analysis. Initial Rasch findings demonstrated a need to remove redundant items and change from a five-point to a four-point rating scale. The final instrument validation included 21 activities and 529 learners. Reliability improved after the modifications, with Cronbach's alpha increasing from 0.72 to 0.80.
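The reliability gain reported above is quantified with Cronbach's alpha. As a minimal sketch of the standard formula (not the authors' analysis code), alpha for a learners-by-items matrix of ratings can be computed as:

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for a (learners x items) matrix of ratings:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores),
    where k is the number of items."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]
    item_var_sum = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)
```

When items are perfectly consistent (every learner gives each item the same relative rating), alpha reaches 1.0; values around 0.8, as in the final instrument, are conventionally taken as good internal consistency for program evaluation.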
Discussion: Using psychometric methods to improve assessments can yield a more reliable and less redundant evaluation instrument. This research demonstrates a psychometrically informed, flexible evaluation tool that can inform future educational efforts and serve as a data-driven metric to enhance the quality of continuing professional development programs.
Journal introduction:
The Journal of Continuing Education is a quarterly journal publishing articles relevant to theory, practice, and policy development for continuing education in the health sciences. The journal presents original research and essays on subjects involving the lifelong learning of professionals, with a focus on continuous quality improvement, competency assessment, and knowledge translation. It provides thoughtful advice to those who develop, conduct, and evaluate continuing education programs.