Solving Not Answering. Validation of Guidance for Writing Higher-Order Multiple-Choice Questions in Medical Science Education.

Medical Science Educator | IF 1.9 | Q2 Education, Scientific Disciplines
Pub Date: 2024-08-20 | eCollection Date: 2024-12-01 | DOI: 10.1007/s40670-024-02140-7
Maria Xiromeriti, Philip M Newton
Citations: 0

Abstract

Problem-solving and higher-order learning are goals of higher education. It has been repeatedly suggested that multiple-choice questions (MCQs) can be used to test higher-order learning, although objective empirical evidence is lacking and MCQs are often criticised for assessing only lower-order, factual, or 'rote' learning. These challenges are compounded by a lack of agreement on what constitutes higher-order learning: it is normally defined subjectively using heavily criticised frameworks such as Bloom's taxonomy. There is also a lack of agreement on how to write MCQs which assess higher-order learning. Here we tested guidance for the creation of MCQs to assess higher-order learning by evaluating the performance of students who were subject matter novices versus experts. We found that questions written using the guidance were much harder to answer when students had no prior subject knowledge, whereas lower-order questions could be answered by simply searching online. These findings suggest that questions written using the guidance do indeed test higher-order learning, and such MCQs may be a valid alternative to other written assessment formats designed to test higher-order learning, such as essays, where reliability and cheating are a major concern.

Supplementary information: The online version contains supplementary material available at 10.1007/s40670-024-02140-7.

Source journal: Medical Science Educator (Social Sciences: Education)
CiteScore: 2.90
Self-citation rate: 11.80%
Articles published: 202
Journal description: Medical Science Educator is the successor of the journal JIAMSE. It is the peer-reviewed publication of the International Association of Medical Science Educators (IAMSE). The Journal offers all who teach in healthcare the most current information to succeed in their task by publishing scholarly activities, opinions, and resources in medical science education. Published articles focus on teaching the sciences fundamental to modern medicine and health, and include basic science education, clinical teaching, and the use of modern education technologies. The Journal provides the readership a better understanding of teaching and learning techniques in order to advance medical science education.