{"title":"Data-driven Q-matrix validation using a residual-based statistic in cognitive diagnostic assessment","authors":"Xiaofeng Yu, Ying Cheng","doi":"10.1111/bmsp.12191","DOIUrl":null,"url":null,"abstract":"<p>In a cognitive diagnostic assessment (CDA), attributes refer to fine-grained knowledge points or skills. The <b>Q</b>-matrix is a central component of CDA, which specifies the relationship between items and attributes. Oftentimes, attributes and <b>Q</b>-matrix are defined by subject-matter experts, and assumed to be appropriate without any misspecifications. However, this assumption does not always hold in real applications. To address this concern, this paper proposes a residual-based statistic for validating the <b>Q</b>-matrix. Its performance is evaluated in a simulation study and compared against that of an existing method proposed in Liu, Xu and Ying (2012, <i>Applied Psychological Measurement</i>, 36, 548). Simulation results indicate that the proposed method leads to a higher recovery rate of the <b>Q</b>-matrix and is computationally more efficient. The advantage in computational efficiency is particularly pronounced when the number of attributes measured by the test reaches five or more. Results also suggest that the two methods have different tendencies in estimating the attribute vector for each item. In cases where the methods fail to recover the correct <b>Q</b>-matrix, the method in Liu et al. (2012, Applied Psychological Measurement, 36, 548) tends to overestimate the number of attributes measured by the items, whereas our method does not show that bias.</p>","PeriodicalId":55322,"journal":{"name":"British Journal of Mathematical & Statistical Psychology","volume":"73 S1","pages":"145-179"},"PeriodicalIF":1.8000,"publicationDate":"2019-11-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1111/bmsp.12191","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"British Journal of Mathematical & Statistical Psychology","FirstCategoryId":"102","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/bmsp.12191","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
引用次数: 16
Abstract
In a cognitive diagnostic assessment (CDA), attributes refer to fine-grained knowledge points or skills. The Q-matrix is a central component of CDA: it specifies the relationship between items and attributes. Oftentimes, the attributes and the Q-matrix are defined by subject-matter experts and assumed to be free of misspecification. However, this assumption does not always hold in real applications. To address this concern, this paper proposes a residual-based statistic for validating the Q-matrix. Its performance is evaluated in a simulation study and compared against that of an existing method proposed by Liu, Xu and Ying (2012, Applied Psychological Measurement, 36, 548). Simulation results indicate that the proposed method leads to a higher recovery rate of the Q-matrix and is computationally more efficient. The advantage in computational efficiency is particularly pronounced when the number of attributes measured by the test reaches five or more. Results also suggest that the two methods have different tendencies when estimating the attribute vector for each item: in cases where the methods fail to recover the correct Q-matrix, the method of Liu et al. (2012) tends to overestimate the number of attributes measured by the items, whereas the proposed method does not show that bias.
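To make the setting concrete, the sketch below illustrates the general idea of residual-based q-vector validation. It is only a minimal illustration, not the statistic proposed in the paper or the exact procedure of Liu, Xu and Ying (2012): it assumes a DINA model, treats the examinees' attribute profiles and the item's guessing and slipping parameters as known, and all variable names and parameter values are hypothetical.

```python
# Minimal illustrative sketch (not the authors' statistic): given simulated
# DINA-model responses and known attribute profiles, score every candidate
# q-vector for one item by the sum of squared residuals between observed
# responses and DINA-predicted response probabilities, and keep the best one.
import itertools
import numpy as np

rng = np.random.default_rng(0)
K = 3                                   # number of attributes (hypothetical)
n_examinees = 500

# Attribute profiles: each row indicates which attributes an examinee has mastered.
alpha = rng.integers(0, 2, size=(n_examinees, K))

# True q-vector for the item under scrutiny: it measures attributes 1 and 2.
q_true = np.array([1, 1, 0])
guess, slip = 0.2, 0.2                  # DINA guessing and slipping parameters

def dina_prob(q, alpha, guess, slip):
    """P(correct response) under the DINA model for a given q-vector."""
    eta = np.all(alpha >= q, axis=1)    # 1 if the examinee masters all required attributes
    return np.where(eta, 1 - slip, guess)

# Simulate item responses from the true q-vector.
responses = rng.binomial(1, dina_prob(q_true, alpha, guess, slip))

# Residual-based search: try every non-zero candidate q-vector and keep the
# one with the smallest sum of squared residuals.
candidates = [np.array(q) for q in itertools.product([0, 1], repeat=K) if any(q)]
rss = [np.sum((responses - dina_prob(q, alpha, guess, slip)) ** 2) for q in candidates]
best = candidates[int(np.argmin(rss))]
print("recovered q-vector:", best)      # typically matches q_true
```

Because the true q-vector yields the correct conditional response probability for every examinee, it minimizes the expected squared residual among the candidates, which is why a residual-based criterion can recover the correct row of the Q-matrix; the full candidate space grows as 2^K - 1 per item, which is one reason computational efficiency matters as K increases.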
Journal introduction:
The British Journal of Mathematical and Statistical Psychology publishes articles relating to areas of psychology which have a greater mathematical or statistical aspect to their argument than is usually acceptable in other journals, including:
• mathematical psychology
• statistics
• psychometrics
• decision making
• psychophysics
• classification
• relevant areas of mathematics, computing and computer software
These include articles that address substantive psychological issues or that develop and extend techniques useful to psychologists. New models for psychological processes, new approaches to existing data, critiques of existing models, and improved algorithms for estimating the parameters of a model are examples of articles that may be favoured.