Examining Group Differences in Mathematics Achievement: Explanatory Item Response Model Application

Zeynep Zelal KIZILKAYA
{"title":"Examining Group Differences in Mathematics Achievement: Explanatory Item Response Model Application","authors":"Zeynep Zelal KIZILKAYA","doi":"10.26466/opusjsr.1372994","DOIUrl":null,"url":null,"abstract":"Students take many different exams throughout their educational lives. In these exams, various individual and item characteristics can affect the responses of individuals to the items. In this study, it was aimed to examine the effects of person and item predictors on the mathematics common exam results of 365 9th grade students with explanatory item response models. Gender and school type as person variables and cognitive domain, content domain and booklet type as item variables were added to the models due to their widespread inclusion in the literature. When the predicted item parameters were examined, it was seen that the smallest parameter values were obtained for all items with the Rasch model. When the model data fit values of four different models were examined, it was concluded that the latent regression and latent regression linear logistic test models showed better fit than the Rasch model. By adding person and item predictors to the model, the parameters obtained for each variable group were compared, and differences were observed between the groups for school type, cognitive domain, and content domain variables. It was concluded that the item parameters did not differ for the variables of gender and booklet type. It is thought that it would be beneficial to use these models more widely in studies to be conducted in the field of education and psychology, since they provide more detailed information about the reasons for the differences in the estimated parameters.","PeriodicalId":477188,"journal":{"name":"OPUS Journal of Society Research","volume":"117 2","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"OPUS Journal of Society Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.26466/opusjsr.1372994","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Students take many different exams throughout their educational lives, and in these exams various person and item characteristics can affect how individuals respond to the items. This study examined the effects of person and item predictors on the mathematics common exam results of 365 ninth-grade students using explanatory item response models. Gender and school type were included as person predictors, and cognitive domain, content domain, and booklet type as item predictors, because these variables are widely used in the literature. When the estimated item parameters were examined, the Rasch model yielded the smallest parameter values for all items. A comparison of the model-data fit of the four models showed that the latent regression model and the latent regression linear logistic test model fit better than the Rasch model. When person and item predictors were added to the model, the parameters estimated for the groups of each variable were compared; differences between groups were observed for the school type, cognitive domain, and content domain variables, whereas the item parameters did not differ by gender or booklet type. Because these models provide more detailed information about why estimated parameters differ, their wider use in educational and psychological research is recommended.
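For readers unfamiliar with the framework, the following is a minimal sketch of the model formulations typically compared in explanatory item response modeling (in the spirit of the De Boeck and Wilson framework). It assumes the four models in the study are the Rasch model, the latent regression Rasch model, the linear logistic test model (LLTM), and the latent regression LLTM; the notation is illustrative and not taken from the article itself.

Rasch model: $\operatorname{logit} P(Y_{pi}=1) = \theta_p - \beta_i$, where $\theta_p \sim N(0, \sigma^2_\theta)$ is the ability of person $p$ and $\beta_i$ is the difficulty of item $i$.

Latent regression Rasch model (person explanatory): $\theta_p = \sum_j \gamma_j Z_{pj} + \varepsilon_p$, with person predictors $Z_{pj}$ such as gender and school type and $\varepsilon_p \sim N(0, \sigma^2_\varepsilon)$.

LLTM (item explanatory): $\beta_i = \sum_k \eta_k Q_{ik}$, with item predictors $Q_{ik}$ such as cognitive domain, content domain, and booklet type.

Latent regression LLTM (doubly explanatory): both decompositions apply at once, so $\operatorname{logit} P(Y_{pi}=1) = \sum_j \gamma_j Z_{pj} + \varepsilon_p - \sum_k \eta_k Q_{ik}$.

Under this formulation, a predictor-augmented model fitting better than the Rasch model (as reported above for the latent regression and latent regression LLTM models) indicates that the added person or item predictors explain part of the variation in abilities or difficulties.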