Authors: Qi Helen Huang, Daniel M. Bolt
DOI: 10.1177/01466216221084207
Journal: Applied Psychological Measurement (Q4, Psychology, Mathematical)
Publication date: 2022-06-01 (Epub 2022-04-15)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9118932/pdf/10.1177_01466216221084207.pdf
Citations: 0
The Potential for Interpretational Confounding in Cognitive Diagnosis Models

Abstract

Binary mastery/nonmastery classifications of examinees in cognitive diagnosis models are often an approximation to proficiencies that are better regarded as continuous. Such misspecification can lead to inconsistencies in the operational definition of "mastery" when binary skills models are assumed. In this paper, we demonstrate the potential for interpretational confounding of the latent skills when truly continuous skills are treated as binary. Using the DINA model as an example, we show how such confounding can be observed through changes in item and/or examinee parameters when (1) different collections of items (e.g., representing different test forms), previously calibrated separately, are subsequently calibrated together; and (2) structural restrictions are placed on the relationships among skill attributes (such as the assumption of strictly nonnegative growth over time), among other possibilities. We examine these occurrences in both simulation and real-data studies. We suggest that researchers regularly attend to the potential for interpretational confounding by studying differences in attribute mastery proportions and/or changes in item parameter estimates (e.g., slip and guess) attributable to skill continuity when the same samples of examinees are administered different test forms, or when the same test forms are involved in different calibrations.
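For readers unfamiliar with the slip and guess parameters the abstract mentions, the following is a minimal sketch of the standard DINA item response function. It illustrates the general model only, not the specific calibrations or analyses reported in the paper; the function name and variable names are illustrative.

```python
def dina_prob(alpha, q_row, slip, guess):
    """P(correct response) for one item under the DINA model.

    alpha : list of 0/1 mastery indicators for one examinee's skills
    q_row : list of 0/1 skill requirements for the item (a Q-matrix row)
    slip  : probability a "master" of all required skills answers incorrectly (s_j)
    guess : probability a non-master answers correctly (g_j)
    """
    # eta = 1 iff every skill the item requires is mastered (conjunctive rule)
    eta = all(a >= r for a, r in zip(alpha, q_row))
    # P(X_j = 1 | alpha) = (1 - s_j)^eta * g_j^(1 - eta)
    return 1.0 - slip if eta else guess

# Examinee who has mastered skill 1 but not skill 2:
p1 = dina_prob([1, 0], [1, 0], slip=0.1, guess=0.25)  # item needs skill 1 only
p2 = dina_prob([1, 0], [1, 1], slip=0.1, guess=0.25)  # item needs both skills
```

In the first case the examinee masters every required skill, so the response probability is 1 - slip = 0.9; in the second a required skill is missing, so it drops to guess = 0.25. The paper's point is that when the underlying skills are actually continuous, the estimated slip and guess values (and hence what "mastery" operationally means) can shift across calibrations.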
Journal description:
Applied Psychological Measurement publishes empirical research on the application of techniques of psychological measurement to substantive problems in all areas of psychology and related disciplines.