Advances in Medical Education and Practice, vol. 16, pp. 1381-1397 (eCollection 2025; published 2025-08-06). DOI: 10.2147/AMEP.S525828. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12335845/pdf/
Authors: Omer Eladil Abdalla Hamid Mohammed, Suresh Kumar Srinivasamurthy, Raghavendra Bhat, Yasir Ahmed Mohamed Alhassan Eltahir, Hesham Amin Hamdy Elshamly, Fatima Mohammed, Bashir Hamad
Postgraduate Clinical Residency: The Impact of Multiple-Choice Question Quality on Exam Success Rates.
Objective: To examine the impact of quality parameters in the construction of multiple-choice questions (MCQs), together with their associated psychometric characteristics, for a selected Specialty X (SpX) in the Qualifying Residency Entry Exam (QRE) at a Postgraduate Medical Institute.
Methods: A post-validation cross-sectional analytical study was conducted using a non-probability purposive judgmental sampling technique. The SpX was chosen as the clinical speciality with the lowest exam success rate among the 52 specialities in the 2020-2023 QRE cycles. MCQs were evaluated using standard item analysis parameters: questions were considered within the acceptable range if they had a difficulty index (DIF) between 0.30 and 0.70, a discrimination index ≥0.2, and at least two functioning distractors.
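The screening criteria above can be sketched in code. The following Python snippet is purely illustrative and not from the study: the data layout, the function name, and the conventional 5% threshold for counting a distractor as "functioning" are all assumptions.

```python
def item_quality(responses, key, n_options=4):
    """Classify one MCQ item from a list of candidate answers.

    responses: option index (0..n_options-1) chosen by each candidate
    key: index of the correct option
    Returns (difficulty, functioning_distractors, acceptable_difficulty).
    """
    n = len(responses)
    # Difficulty index (DIF): proportion of candidates answering correctly
    dif = sum(1 for r in responses if r == key) / n
    # A distractor "functions" if chosen by at least 5% of candidates
    # (a common convention; the study may have used a different cutoff)
    functioning = sum(
        1 for opt in range(n_options)
        if opt != key and responses.count(opt) / n >= 0.05
    )
    # Acceptable difficulty per the criteria stated in the Methods
    return dif, functioning, 0.30 <= dif <= 0.70
```

For example, an item answered correctly by 50 of 100 candidates, with the three distractors drawing 30, 15, and 5 responses, has DIF = 0.50, three functioning distractors, and falls in the acceptable range.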
Results: Of the 175 candidates who sat the QRE, only 19 (10.86%) passed. The exam comprised 120 A-type MCQs, of which just 7 (5.8%) were flaw-free Items/Questions. Most questions (98.3%) lacked clinical vignettes, and only 10% used a proper lead-in format. Two-thirds failed the "cover-the-options" test, and 40% showed constructional flaws related to testwiseness cues or irrelevant difficulty. Psychometric analysis showed a mean difficulty index of 45.9%, with 86.7% of Items/Questions falling within the acceptable range. However, 15% had extremely poor discrimination (mean PBS = 0.17), and the mean distractor efficiency was 66%. A statistically significant relationship (p < 0.05) was observed between constructional flaws and DIF, DisI/PBS, the Horst Index, and Bloom's taxonomy levels. Furthermore, no significant relationship was identified between the exam success rate and the type of MBBS curriculum.
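The discrimination (PBS) and distractor-efficiency figures reported above are standard item statistics. A minimal Python sketch, using hypothetical data and standard textbook formulas rather than the study's own computation, would be:

```python
from statistics import mean, pstdev

def point_biserial(item_correct, total_scores):
    """Point-biserial (PBS) discrimination: correlation between getting
    this item right (0/1) and the candidate's total exam score.
    Assumes at least one correct and one incorrect response."""
    n = len(item_correct)
    p = sum(item_correct) / n  # proportion answering the item correctly
    q = 1 - p
    # Mean total score of the correct and incorrect groups
    mp = mean(s for s, c in zip(total_scores, item_correct) if c)
    mq = mean(s for s, c in zip(total_scores, item_correct) if not c)
    return (mp - mq) / pstdev(total_scores) * (p * q) ** 0.5

def distractor_efficiency(functioning, total_distractors):
    """Distractor efficiency: share of an item's distractors that
    function (i.e., attract a non-trivial share of responses)."""
    return functioning / total_distractors
```

An item with PBS below 0.2 would fail the discrimination criterion stated in the Methods; a four-option item with only two of its three distractors functioning has a distractor efficiency of about 67%, close to the study's reported mean of 66%.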
Conclusion: The quality of Items/Questions significantly impacted success in the postgraduate residency QRE. Other potentially influential factors require further multivariate analytical research. These findings highlight the need for strategic educational initiatives to enhance Exam Bank development, strengthen capacity building, and improve faculty assessment skills.