Effective use of Item Analysis to improve the Reliability and Validity of Undergraduate Medical Examinations: Evaluating the same exam over many years: a different approach.

Impact Factor 1.2 · CAS Zone 4 (Medicine) · JCR Q2 (Medicine, General & Internal)
Nadeem Alam Zubairi, Turki Saad AlAhmadi, Mohamed Hesham Ibrahim, Moustafa Abdelaal Hegazi, Fahad Ussif Gadi
Pakistan Journal of Medical Sciences, 41(3): 810-815. Published 2025-03-01.
DOI: 10.12669/pjms.41.3.10693
PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11911747/pdf/
Citations: 0

Abstract

Objective: MCQ exams are part of end-of-module assessments in undergraduate medical institutions. Item Analysis (IA) is the best tool to check their reliability and validity. It provides the reliability coefficient KR20, the Difficulty Index (DI), the Discrimination Index (DISC), and Distractor Efficiency (DE). Almost all research papers on IA are based on single-exam analysis. We examined the IA of multiple exams of the same module, taken over four years, aiming to explore the consistency required over the years and the effectiveness of IA-based post-exam measures.
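The indices named above are standard psychometric quantities. A minimal sketch (not from the paper; the upper/lower 27% grouping convention for DISC is an assumption) of how they are conventionally computed from a 0/1-scored response matrix:

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson 20 reliability for a 0/1-scored item matrix
    (rows = examinees, columns = items)."""
    k = responses.shape[1]
    p = responses.mean(axis=0)            # proportion correct per item
    q = 1 - p
    total = responses.sum(axis=1)         # total score per examinee
    var_total = total.var(ddof=1)         # sample variance of total scores
    return (k / (k - 1)) * (1 - (p * q).sum() / var_total)

def difficulty_index(responses):
    """DI: proportion of examinees answering each item correctly."""
    return responses.mean(axis=0)

def discrimination_index(responses, frac=0.27):
    """DISC: difference in item difficulty between the top and bottom
    `frac` (conventionally 27%) of examinees ranked by total score."""
    total = responses.sum(axis=1)
    order = np.argsort(total, kind="stable")
    n = max(1, int(round(frac * len(total))))
    low, high = responses[order[:n]], responses[order[-n:]]
    return high.mean(axis=0) - low.mean(axis=0)
```

A DISC near zero (or negative, as in 28 of the 800 MCQs reported below) means low scorers answered the item correctly as often as, or more often than, high scorers.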

Methodology: Item analyses of eight final MCQ exams of the Pediatric module, from 2020-21 to 2023-24, at the Faculty of Medicine in Rabigh, King Abdulaziz University, Saudi Arabia, were included in the study.

Results: All exams had a KR20 of 0.90 or above, indicating excellent reliability. Difficulty levels were consistent except in a single year. Discriminative ability was maintained over the years: only 28 of 800 MCQs had a negative DISC. All exams maintained good DE; only 15 MCQs over the four years had a DE of zero. The practice of reviewing all non-functional distractors yielded a gradual improvement in exam quality.
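The DE and non-functional distractor (NFD) review mentioned above follows a common convention: a distractor is functional if at least about 5% of examinees select it. A minimal sketch (the 5% threshold and the function name are assumptions, not from the paper):

```python
def distractor_efficiency(choice_counts, correct_option, threshold=0.05):
    """DE for one MCQ, given a dict mapping each option to the number
    of examinees who chose it.  A distractor is functional when chosen
    by at least `threshold` (conventionally 5%) of examinees; DE is the
    share of distractors that are functional.  Returns (DE, NFD list)."""
    total = sum(choice_counts.values())
    distractors = [opt for opt in choice_counts if opt != correct_option]
    nfds = [opt for opt in distractors
            if choice_counts[opt] / total < threshold]
    de = 1 - len(nfds) / len(distractors)
    return de, nfds
```

For example, on an item where 60 of 100 examinees chose the key 'A', 20 chose 'B', 18 chose 'C', and only 2 chose 'D', option 'D' is an NFD and DE is 2/3; a DE of zero would mean no distractor attracted even 5% of examinees.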

Conclusion: Besides the IA of individual exams, it is recommended that the IA of the same exam be evaluated over 4-5 years to assess consistency and trends toward improvement. This helps improve reliability and validity by addressing deficiencies and deviations from the recommended standards.

Source journal: Pakistan Journal of Medical Sciences (Medicine, General & Internal)
CiteScore: 4.10
Self-citation rate: 9.10%
Articles per year: 363
Review time: 3-6 weeks
Journal description: Pakistan Journal of Medical Sciences is a peer-reviewed medical journal published regularly since 1984. It was previously known as the quarterly "SPECIALIST" until December 31st, 1999. It publishes original research articles, review articles, current practices, short communications, and case reports. It attracts manuscripts not only from within Pakistan but also from over fifty countries abroad. Copies of PJMS are sent to all the important medical libraries across Pakistan and overseas, particularly in South East Asia and the Asia Pacific, besides WHO EMRO Region countries. Eminent members of the medical profession at home and abroad regularly contribute their manuscripts to our publications. We pursue an independent editorial policy, which allows healthcare professionals an opportunity to express their views without fear or favour. That is why many opinion makers among the medical and pharmaceutical professions use this publication to communicate their viewpoint.