Applied Psychological Measurement: Latest Articles

glca: An R Package for Multiple-Group Latent Class Analysis.
IF 1.0 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-07-01 · Epub Date: 2022-05-11 · DOI: 10.1177/01466216221084197
Authors: Youngsun Kim, Saebom Jeon, Chi Chang, Hwan Chung
Abstract: Group similarities and differences may manifest themselves in a variety of ways in multiple-group latent class analysis (LCA). Sometimes the measurement models are identical across groups; in other situations they differ, suggesting that the latent structure itself varies between groups. Tests of measurement invariance shed light on this distinction. We created the R package glca, which implements procedures for exploring differences in latent class structure between populations while taking the multilevel data structure into account. The package handles both fixed-effect LCA and nonparametric random-effect LCA: the former applies when populations are segmented by the observed group variable itself, whereas the latter, which identifies a group-level latent variable, is useful when the group variable has too many levels for meaningful group-by-group comparisons. The package provides functions for statistical tests of group differences in various LCA models that account for multilevel data structure.
Volume 46(5), pp. 439-441 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9265491/pdf/10.1177_01466216221084197.pdf
Citations: 0
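The fixed-effect multiple-group LCA described in this abstract can be summarized by a likelihood in which class prevalences vary by group while, under measurement invariance, the item-response probabilities are shared. The Python sketch below evaluates that likelihood on toy data; it illustrates the model structure only and is not the glca package's API. All function names and parameter values are made up for illustration.

```python
import numpy as np

def mglca_loglik(Y, group, prevalence, rho):
    """Log-likelihood of a fixed-effect multiple-group LCA with
    measurement invariance: class prevalences differ by group,
    item-response probabilities rho are shared across groups.

    Y          : (N, J) binary item responses
    group      : (N,) integer group labels 0..G-1
    prevalence : (G, C) class prevalences per group (rows sum to 1)
    rho        : (C, J) P(Y_j = 1 | class c)
    """
    # log P(Y_i | class c) for every person and class
    logp_item = Y @ np.log(rho).T + (1 - Y) @ np.log(1 - rho).T   # (N, C)
    log_prior = np.log(prevalence[group])                          # (N, C)
    # marginalize over the latent class, then sum over persons
    return np.logaddexp.reduce(log_prior + logp_item, axis=1).sum()

# toy example: 2 groups, 2 classes, 4 items
rng = np.random.default_rng(1)
rho = np.array([[.9, .8, .85, .9],    # class 0
                [.2, .3, .25, .2]])   # class 1
prev = np.array([[.6, .4],            # group 0 prevalences
                 [.3, .7]])           # group 1 prevalences
group = rng.integers(0, 2, size=500)
cls = np.array([rng.choice(2, p=prev[g]) for g in group])
Y = (rng.random((500, 4)) < rho[cls]).astype(int)
print(mglca_loglik(Y, group, prev, rho))
```

In practice one would maximize this likelihood (e.g., via EM), and an invariance test compares fits with shared versus group-specific rho.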
Bridging Models of Biometric and Psychometric Assessment: A Three-Way Joint Modeling Approach of Item Responses, Response Times, and Gaze Fixation Counts.
IF 1.2 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-07-01 · DOI: 10.1177/01466216221089344
Authors: Kaiwen Man, Jeffrey R Harring, Peida Zhan
Abstract: Recently, joint models of item response data and response times have been proposed to better assess and understand test takers' learning processes. This article demonstrates how biometric information, such as gaze fixation counts obtained from an eye tracker, can be integrated into the measurement model. The proposed joint modeling framework accommodates the relations among a test taker's latent ability, working speed, and test engagement level via a person-side variance-covariance structure, while simultaneously permitting the modeling of item difficulty, time intensity, and engagement intensity through an item-side variance-covariance structure. A Bayesian estimation scheme is used to fit the proposed model to data. Posterior predictive model checking based on three discrepancy measures, each corresponding to a different model component, is introduced to assess model-data fit. Findings from a Monte Carlo simulation and results from analyzing experimental data demonstrate the utility of the model.
Volume 46(5), pp. 361-381 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9265489/pdf/10.1177_01466216221089344.pdf
Citations: 3
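To make the three-way structure concrete, the sketch below generates data from one plausible version of such a joint model: correlated person parameters (ability, speed, engagement), correlated item parameters (difficulty, time intensity, engagement intensity), 2PL-type responses, lognormal response times, and Poisson fixation counts. The distributional choices, unit discriminations, and covariance values are assumptions for illustration, not the parameterization of the published model.

```python
import numpy as np

rng = np.random.default_rng(7)
N, J = 300, 20

# person side: ability theta, speed tau, engagement eta share a covariance
person_cov = np.array([[1.0, 0.4, 0.3],
                       [0.4, 1.0, 0.2],
                       [0.3, 0.2, 1.0]])
theta, tau, eta = rng.multivariate_normal(np.zeros(3), person_cov, size=N).T

# item side: difficulty b, time intensity beta, engagement intensity gamma
item_cov = 0.3 * np.eye(3)
b, beta, gamma = rng.multivariate_normal([0.0, 3.0, 1.5], item_cov, size=J).T

# responses: 2PL-type with unit discrimination (assumed for simplicity)
p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
X = (rng.random((N, J)) < p).astype(int)

# response times: lognormal; faster persons (larger tau) take less time
T = rng.lognormal(mean=beta[None, :] - tau[:, None], sigma=0.3)

# fixation counts: Poisson, rate driven by item intensity minus engagement
lam = np.exp(gamma[None, :] - 0.5 * eta[:, None])
F = rng.poisson(lam)

print(X.shape, T.shape, F.shape)  # (300, 20) each
```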
Bayesian Item Response Theory Models With Flexible Generalized Logit Links.
IF 1.0 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-07-01 · Epub Date: 2022-05-20 · DOI: 10.1177/01466216221089343
Authors: Jiwei Zhang, Ying-Ying Zhang, Jian Tao, Ming-Hui Chen
Abstract: In educational and psychological research, the logit and probit links are often used to fit binary item response data. The appropriateness and importance of the choice of link within the item response theory (IRT) framework have not yet been investigated. In this paper, we present a family of IRT models with generalized logit links, which includes the traditional logistic and normal ogive models as special cases. This family of models is flexible enough not only to adjust the tail probabilities of the item characteristic curve through two shape parameters but also to allow the same link or different links to be fit to different items within the IRT framework. The proposed models are implemented in Stan to sample from the posterior distributions. Using readily available Stan outputs, four Bayesian model selection criteria are computed to guide the choice of link within the IRT framework. Extensive simulation studies examine the empirical performance of the proposed models and their fit in terms of "in-sample" and "out-of-sample" predictions based on the deviance. Finally, a detailed analysis of real reading assessment data illustrates the proposed methodology.
Volume 46(5), pp. 382-405 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9265488/pdf/10.1177_01466216221089343.pdf
Citations: 0
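The paper's exact two-shape-parameter link is not reproduced here, but the idea of letting shape parameters bend the tails of an item characteristic curve can be illustrated with one simple flexible link: composing the logistic CDF with a Beta(a, b) CDF, which collapses to the ordinary logit when a = b = 1. The function name and all parameter values below are hypothetical, and this specific construction is only an assumed stand-in for the family in the paper.

```python
import numpy as np
from scipy.special import expit
from scipy.stats import beta

def generalized_logit_icc(theta, a_disc, b_diff, shape1, shape2):
    """Item characteristic curve with a flexible asymmetric link:
    P(correct) = BetaCDF(logistic(a * (theta - b)); shape1, shape2).
    The composition of two CDFs is itself a valid, monotone link;
    shape1 = shape2 = 1 recovers the ordinary 2PL logistic curve."""
    return beta.cdf(expit(a_disc * (theta - b_diff)), shape1, shape2)

theta = np.linspace(-4, 4, 9)
print(np.round(generalized_logit_icc(theta, 1.2, 0.0, 1.0, 1.0), 3))  # logistic
print(np.round(generalized_logit_icc(theta, 1.2, 0.0, 0.5, 2.0), 3))  # asymmetric variant
```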
Dual-Objective Item Selection Methods in Computerized Adaptive Test Using the Higher-Order Cognitive Diagnostic Models.
IF 1.2 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-07-01 · DOI: 10.1177/01466216221089342
Authors: Chongqin Xi, Dongbo Tu, Yan Cai
Abstract: To efficiently obtain information about both the general abilities and the detailed cognitive profiles of examinees from a single model with a single calibration, higher-order cognitive diagnostic computerized adaptive testing (CD-CAT), which employs higher-order cognitive diagnostic models, has been developed. However, existing item selection methods for higher-order CD-CAT select items based only on attribute profiles, which can yield low precision for general abilities; this study therefore proposes item selection methods suited to this CAT system. Under higher-order models, responses are driven by attribute profiles, which in turn are governed by general abilities, so it is reasonable to treat item responses as affected by a combination of general abilities and attribute profiles. Based on the logic of Shannon entropy and the generalized deterministic inputs, noisy "and" gate (G-DINA) model discrimination index (GDI), two new item selection methods that incorporate this combination are proposed for higher-order CD-CAT. Simulation results show that the new methods estimate both general abilities and cognitive profiles more accurately than existing methods while maintaining distinct advantages in item pool usage.
Volume 46(5), pp. 422-438 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9265487/pdf/10.1177_01466216221089342.pdf
Citations: 0
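One ingredient named in the abstract, Shannon-entropy-based selection over the attribute-profile posterior, can be sketched on its own under a plain DINA model; the published methods additionally use the GDI and the higher-order structure, which are not shown here. The slip/guess values, Q-vectors, and function names below are toy assumptions.

```python
import numpy as np
from itertools import product

def dina_prob(alpha, q, slip, guess):
    """P(correct | attribute pattern alpha) under the DINA model."""
    mastered_all = bool(np.all(alpha >= q))
    return (1 - slip) if mastered_all else guess

def expected_entropy(posterior, patterns, q, slip, guess):
    """Expected posterior Shannon entropy after administering one item."""
    p_correct = np.array([dina_prob(a, q, slip, guess) for a in patterns])
    exp_H = 0.0
    for px in (p_correct, 1 - p_correct):       # correct / incorrect outcome
        marg = float(np.sum(posterior * px))
        if marg <= 0:
            continue
        post = posterior * px / marg
        nz = post[post > 0]
        exp_H += marg * -np.sum(nz * np.log(nz))
    return exp_H

K = 3
patterns = np.array(list(product([0, 1], repeat=K)))
posterior = np.full(len(patterns), 1 / len(patterns))    # flat prior over profiles

# candidate items: (Q-vector, slip, guess)
items = [(np.array([1, 0, 0]), .10, .20),
         (np.array([1, 1, 0]), .15, .10),
         (np.array([0, 1, 1]), .10, .10)]
scores = [expected_entropy(posterior, patterns, *it) for it in items]
print("select item", int(np.argmin(scores)))   # smallest expected entropy wins
```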
Evaluation of the Linear Composite Conjecture for Unidimensional IRT Scale for Multidimensional Responses.
IF 1.2 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-07-01 · DOI: 10.1177/01466216221084218
Authors: Tyler Strachan, Uk Hyun Cho, Terry Ackerman, Shyh-Huei Chen, Jimmy de la Torre, Edward H Ip
Abstract: The linear composite direction represents, theoretically, where the unidimensional scale would lie within a multidimensional latent space. Using compensatory multidimensional IRT, the linear composite can be derived from the structure of the items and the latent distribution. The purpose of this study was to evaluate the validity of the linear composite conjecture and to examine how well a fitted unidimensional IRT model approximates the linear composite direction in a multidimensional latent space. Simulation results show that the fitted unidimensional IRT model approximates the linear composite direction well when the correlation between the bivariate latent variables is positive; when that correlation is negative, the approximation becomes unstable. A real data experiment was also conducted using 20 items from a multiple-choice mathematics test from American College Testing.
Volume 46(5), pp. 347-360 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9265490/pdf/10.1177_01466216221084218.pdf
Citations: 0
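A common operationalization of such a composite direction (often called the reference composite) is the principal eigenvector of A'A, where A stacks the items' discrimination vectors; formulations that also weight in the latent covariance exist and may be closer to what the paper uses, so the sketch below is only a generic illustration with made-up item parameters.

```python
import numpy as np

def reference_composite(A):
    """Principal eigenvector of A'A, with A the (items x dimensions)
    discrimination matrix; one common stand-in for the direction a
    fitted unidimensional scale points to in the latent space."""
    evals, evecs = np.linalg.eigh(A.T @ A)
    w = evecs[:, np.argmax(evals)]
    return w if w.sum() >= 0 else -w       # fix the sign for readability

# 20 two-dimensional items with varying angular composition
rng = np.random.default_rng(3)
angles = rng.uniform(0, np.pi / 2, size=20)
mdisc = rng.uniform(0.8, 2.0, size=20)
A = np.column_stack([mdisc * np.cos(angles), mdisc * np.sin(angles)])

w = reference_composite(A)
print("composite direction:", np.round(w, 3),
      "angle:", np.round(np.degrees(np.arctan2(w[1], w[0])), 1), "deg")
```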
Computerized Adaptive Testing for Ipsative Tests with Multidimensional Pairwise-Comparison Items: Algorithm Development and Applications.
IF 1.2 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-06-01 · DOI: 10.1177/01466216221084209
Authors: Xue-Lan Qiu, Jimmy de la Torre, Sage Ro, Wen-Chung Wang
Abstract: Computerized adaptive testing (CAT) solutions for tests with multidimensional pairwise-comparison (MPC) items, which aim to measure career interests, values, and personality, are rare. This paper proposes new item selection and exposure control methods for CAT with dichotomous and polytomous MPC items and presents simulation study results. The results show that the procedures are effective in selecting items and controlling within-person statement exposure with no loss of efficiency. Implications are discussed for two applications of the proposed CAT procedures: a work attitude test with dichotomous MPC items and a career interest assessment with polytomous MPC items.
Volume 46(4), pp. 255-272 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9118927/pdf/10.1177_01466216221084209.pdf
Citations: 0
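For readers unfamiliar with MPC items, the sketch below shows one generic dominance-style response function for a dichotomous pairwise comparison: the probability of preferring statement s over statement t is the logistic of the difference in statement utilities, with each utility linear in the respondent's multidimensional trait vector. This is an assumed stand-in formulation, not necessarily the model in the paper, and all values are illustrative.

```python
import numpy as np

def mpc_prob(theta, a_s, d_s, a_t, d_t):
    """P(prefer statement s over statement t) for a dichotomous MPC item:
    logistic of the utility difference, with each statement's utility
    linear in theta (an assumed dominance-type formulation)."""
    u_s = a_s @ theta + d_s
    u_t = a_t @ theta + d_t
    return 1 / (1 + np.exp(-(u_s - u_t)))

theta = np.array([0.8, -0.4, 1.2])      # e.g., three career-interest traits
a_s = np.array([1.1, 0.0, 0.0])         # statement s taps trait 1
a_t = np.array([0.0, 0.0, 0.9])         # statement t taps trait 3
print(round(float(mpc_prob(theta, a_s, -0.2, a_t, 0.1)), 3))
```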
Combining Cognitive Diagnostic Computerized Adaptive Testing With Multidimensional Item Response Theory.
IF 1.2 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-06-01 · DOI: 10.1177/01466216221084214
Authors: Hao Luo, Daxun Wang, Zhiming Guo, Yan Cai, Dongbo Tu
Abstract: The new generation of tests focuses not only on general ability but also on finer-grained skills. Guided by this idea, researchers have developed a dual-purpose CD-CAT (Dual-CAT). In existing Dual-CATs, overall ability is estimated with unidimensional IRT models, which do not apply to multidimensional tests. This article develops a multidimensional Dual-CAT to improve applicability: it first proposes item selection methods for the multidimensional Dual-CAT and then verifies their estimation accuracy and item exposure rates through both a simulation study and a real item bank study. The results show that the proposed multidimensional Dual-CAT is effective and that the new methods outperform traditional methods. Finally, future directions for the Dual-CAT are discussed.
Volume 46(4), pp. 288-302 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9118931/pdf/10.1177_01466216221084214.pdf
Citations: 0
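Item selection in a multidimensional Dual-CAT has to serve two targets at once: the multidimensional ability vector and the attribute profile. The sketch below shows one naive way to combine them, a weighted sum of a MIRT information summary (trace of a compensatory M2PL item's Fisher information) and a cognitive-diagnostic gain supplied separately. The combination rule, the weight, and all parameter values are assumptions, not the paper's methods.

```python
import numpy as np

def m2pl_prob(theta, a, d):
    """Compensatory M2PL item: P(X = 1) = logistic(a' theta + d)."""
    return 1 / (1 + np.exp(-(a @ theta + d)))

def m2pl_fisher_info(theta, a, d):
    """Fisher information matrix of one M2PL item at theta."""
    p = m2pl_prob(theta, a, d)
    return p * (1 - p) * np.outer(a, a)

def dual_objective_score(theta, a, d, cdm_gain, weight=0.5):
    """Toy dual-objective criterion: weighted sum of a MIRT information
    summary and a cognitive-diagnostic gain (e.g., an expected entropy
    reduction or GDI value computed on the CDM side)."""
    return weight * np.trace(m2pl_fisher_info(theta, a, d)) + (1 - weight) * cdm_gain

theta_hat = np.array([0.5, -0.3])
candidates = [(np.array([1.2, 0.3]), -0.2, 0.35),   # (a, d, cdm_gain)
              (np.array([0.4, 1.5]),  0.1, 0.20)]
scores = [dual_objective_score(theta_hat, a, d, g) for a, d, g in candidates]
print("select item", int(np.argmax(scores)))
```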
Detecting Examinees With Item Preknowledge on Real Data.
IF 1.2 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-06-01 · DOI: 10.1177/01466216221084202
Authors: Dmitry I Belov, Sarah L Toton
Abstract: Recently, Belov & Wollack (2021) developed a method for detecting groups of colluding examinees as cliques in a graph. The objective of this article is to study how the performance of their method on real data with item preknowledge (IP) depends on the mechanism of edge formation governed by a response similarity index (RSI). The study resulted in three new RSIs and demonstrated a remarkable advantage of combining responses and response times for detecting examinees with IP. Possible extensions of the study and recommendations for practitioners are formulated.
Volume 46(4), pp. 273-287 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9118928/pdf/10.1177_01466216221084202.pdf
Citations: 2
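The graph-and-clique pipeline can be illustrated end to end with a deliberately simple RSI: score each examinee pair by the number of identical wrong answers, connect pairs above a threshold, and list maximal cliques with networkx. The published RSIs, and the new ones developed in the article, are more sophisticated and also exploit response times; the index, threshold, and planted colluding trio below are purely illustrative.

```python
import numpy as np
import networkx as nx

def simple_rsi(resp_i, resp_j, key):
    """Toy response-similarity index: the number of items on which two
    examinees give the same *wrong* answer (a stand-in for published RSIs)."""
    both_wrong = (resp_i != key) & (resp_j != key)
    return int(np.sum(both_wrong & (resp_i == resp_j)))

rng = np.random.default_rng(11)
n_exam, n_items = 30, 40
key = rng.integers(0, 4, size=n_items)                 # 40 four-option items

# honest examinees: correct with prob .75, otherwise a random wrong option
correct = rng.random((n_exam, n_items)) < 0.75
wrong = (key + rng.integers(1, 4, size=(n_exam, n_items))) % 4
R = np.where(correct, key, wrong)

# plant a colluding trio sharing identical wrong answers on the first 10 items
R[5, :10] = R[6, :10] = R[7, :10] = (key[:10] + 1) % 4

G = nx.Graph()
G.add_nodes_from(range(n_exam))
threshold = 8
for i in range(n_exam):
    for j in range(i + 1, n_exam):
        if simple_rsi(R[i], R[j], key) >= threshold:
            G.add_edge(i, j)

suspect_cliques = [c for c in nx.find_cliques(G) if len(c) >= 3]
print(suspect_cliques)   # the planted trio 5, 6, 7 should be flagged
```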
The Potential for Interpretational Confounding in Cognitive Diagnosis Models.
IF 1.0 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-06-01 · Epub Date: 2022-04-15 · DOI: 10.1177/01466216221084207
Authors: Qi Helen Huang, Daniel M Bolt
Abstract: Binary examinee mastery/nonmastery classifications in cognitive diagnosis models may often be approximations to proficiencies that are better regarded as continuous. Such misspecification can lead to inconsistencies in the operational definition of "mastery" when binary skills models are assumed. In this paper we demonstrate the potential for interpretational confounding of the latent skills when truly continuous skills are treated as binary. Using the DINA model as an example, we show how such confounding can be observed through item and/or examinee parameter change when (1) different collections of items (such as different test forms) previously calibrated separately are subsequently calibrated together, and (2) structural restrictions are placed on the relationships among skill attributes (such as the assumption of strictly nonnegative growth over time), among other possibilities. We examine these occurrences in both simulation and real data studies. Researchers should regularly attend to the potential for interpretational confounding by studying differences in attribute mastery proportions and/or changes in item parameter (e.g., slip and guess) estimates attributable to skill continuity when the same samples of examinees are administered different test forms, or when the same test forms are involved in different calibrations.
Volume 46(4), pp. 303-320 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9118932/pdf/10.1177_01466216221084207.pdf
Citations: 0
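The core mechanism is easy to see numerically: if the underlying skill is continuous and an item follows a smooth response curve, then the DINA-style slip and guess rates a binary-skill model must absorb depend on where the implicit mastery cutoff falls. The sketch below dichotomizes a simulated continuous skill at three cutoffs and reads off the implied slip and guess rates; the response curve and cutoffs are toy assumptions, not the paper's design.

```python
import numpy as np
from scipy.special import expit

rng = np.random.default_rng(5)
N = 200_000
theta = rng.normal(size=N)                   # truly continuous skill
p_correct = expit(1.7 * theta)               # smooth (2PL-like) item response
X = rng.random(N) < p_correct

def implied_slip_guess(theta, X, cut):
    """Treat theta >= cut as 'mastery' and read off the DINA-style slip
    and guess rates a binary-skill model would have to absorb."""
    master = theta >= cut
    slip = 1 - X[master].mean()              # masters answering incorrectly
    guess = X[~master].mean()                # non-masters answering correctly
    return slip, guess

for cut in (-0.5, 0.0, 0.5):                 # different implicit mastery cutoffs
    s, g = implied_slip_guess(theta, X, cut)
    print(f"cutoff {cut:+.1f}:  slip ~ {s:.2f}   guess ~ {g:.2f}")
```

Because the implied slip and guess estimates move with the cutoff, separate calibrations of different item collections can yield different operational meanings of "mastery" for the same examinees.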
A Comparison of Modern and Popular Approaches to Calculating Reliability for Dichotomously Scored Items.
IF 1.0 · Q4 · Psychology
Applied Psychological Measurement · Pub Date: 2022-06-01 · Epub Date: 2022-04-14 · DOI: 10.1177/01466216221084210
Authors: Sébastien Béland, Carl F Falk
Abstract: Recent work on reliability coefficients has largely focused on continuous items, including critiques of Cronbach's alpha. Although two model-based reliability coefficients have been proposed for dichotomous items (Dimitrov, 2003a,b; Green & Yang, 2009a), these approaches have yet to be compared to each other or to other popular estimates of reliability such as omega, alpha, and the greatest lower bound. We seek computational improvements to one of these model-based reliability coefficients and, in addition, conduct initial Monte Carlo simulations to compare coefficients using dichotomous data. Our results suggest that such improvements to the model-based approach are warranted, and the model-based approaches were generally superior.
Volume 46(1), pp. 321-337 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9118929/pdf/
Citations: 0
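As a point of reference for this comparison, Cronbach's alpha (which equals KR-20 for 0/1 items) can be computed directly from the observed score matrix, as sketched below on simulated one-factor dichotomous data. The model-based coefficients the article examines (Dimitrov's approach and Green & Yang's categorical omega) require fitted item parameters and are not reproduced here; all simulation settings are arbitrary.

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha from an (N persons x J items) score matrix.
    For 0/1 items this coincides with KR-20."""
    X = np.asarray(X, dtype=float)
    J = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return J / (J - 1) * (1 - item_var / total_var)

# toy dichotomous data generated from a one-factor model
rng = np.random.default_rng(2)
N, J = 1000, 10
theta = rng.normal(size=N)
loadings = rng.uniform(0.8, 1.5, size=J)
difficulty = rng.normal(scale=0.5, size=J)
p = 1 / (1 + np.exp(-(np.outer(theta, loadings) - difficulty)))
X = (rng.random((N, J)) < p).astype(int)

print(round(cronbach_alpha(X), 3))
```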