Quality of multiple choice question items: item analysis

Ayenew Takele Alemu, Hiwot Tesfa, Addisu Mulugeta, E. Fenta, Mahider Awoke Belay
Journal: International Journal of Scientific Reports
DOI: 10.18203/issn.2454-2156.intjscirep20241316
Published: 2024-05-23 (Journal Article)
Citation count: 0

Abstract

Background: There are different exam formats for educational assessment. Multiple choice questions (MCQs) are among the most frequently used assessment tools in health education, so attention to reliability and validity when developing MCQ items is vital. Educators often struggle to write credible distractors for MCQ items. Poorly constructed items make an exam either too easy or too difficult for students to answer correctly relative to the intended learning objectives. Checking the quality of MCQ items is often overlooked, and too little is known about it. Therefore, this study aimed to assess the quality of MCQ items using the item response theory model.

Methods: A descriptive cross-sectional study was conducted on the MCQ items of public health courses administered to second-year nursing students at Injibara University. A total of 50 MCQ items and their 200 alternatives were evaluated by statistical item analysis. Item quality was assessed with the difficulty index (DIF), discrimination index (DI), and distractor efficiency (DE) computed from students' exam responses. Microsoft Excel and SPSS version 26 were used for data management and analysis.

Results: Post-exam item analysis showed that 11 MCQs (22%) fell in the "too difficult" range of the difficulty index, and 22 (44%) showed poor discriminating power. The overall DE was 71.3%, and 40 distractors (20%) were non-functional. Only 8 MCQs (16%) fulfilled the recommended criteria for all three parameters (DIF, DI, and DE).

Conclusions: Only a few items satisfied the desirable criteria for the quality parameters of MCQ items, which implies a need for quality improvement. Continuous training is required to improve instructors' skills in constructing quality educational assessment tools.
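The three indices named in the Methods can be illustrated with a short sketch. This uses the conventional classical-test-theory definitions — DIF as the proportion answering correctly, DI as the difference in correct proportions between the top and bottom 27% of scorers, and DE based on a 5% functionality cutoff for distractors. These are standard textbook formulas and cutoffs, not necessarily the exact computations the study performed:

```python
def item_analysis(responses, key, n_options=4):
    """Classical item analysis for one MCQ.

    responses: list of (total_exam_score, chosen_option) per examinee.
    key: the correct option letter, e.g. 'A'.
    Returns (difficulty index, discrimination index, distractor efficiency).
    """
    n = len(responses)

    # Difficulty index (DIF): proportion of examinees answering correctly.
    dif = sum(1 for _, ans in responses if ans == key) / n

    # Discrimination index (DI): correct proportion in the top 27% of
    # scorers minus the correct proportion in the bottom 27%.
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    k = max(1, round(0.27 * n))
    upper = sum(1 for _, ans in ranked[:k] if ans == key) / k
    lower = sum(1 for _, ans in ranked[-k:] if ans == key) / k
    di = upper - lower

    # Distractor efficiency (DE): share of distractors that are
    # "functional", i.e. chosen by more than 5% of examinees.
    distractors = [opt for opt in "ABCD"[:n_options] if opt != key]
    functional = sum(
        1 for opt in distractors
        if sum(1 for _, ans in responses if ans == opt) / n > 0.05
    )
    de = functional / len(distractors)
    return dif, di, de
```

Under these definitions, an item answered correctly by half the class, with high scorers far outperforming low scorers and one rarely chosen distractor, would show a moderate DIF, a high DI, and a DE below 100%.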