Educational and Psychological Measurement: Latest Publications

Latent Variable Forests for Latent Variable Score Estimation
IF 2.7, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-04-01 DOI: 10.1177/00131644241237502
Franz Classe, Christoph Kern
{"title":"Latent Variable Forests for Latent Variable Score Estimation","authors":"Franz Classe, Christoph Kern","doi":"10.1177/00131644241237502","DOIUrl":"https://doi.org/10.1177/00131644241237502","url":null,"abstract":"We develop a latent variable forest (LV Forest) algorithm for the estimation of latent variable scores with one or more latent variables. LV Forest estimates unbiased latent variable scores based on confirmatory factor analysis (CFA) models with ordinal and/or numerical response variables. Through parametric model restrictions paired with a nonparametric tree-based machine learning approach, LV Forest estimates latent variable scores using models that are unbiased with respect to relevant subgroups in the population. This way, estimated latent variable scores are interpretable with respect to systematic influences of covariates without being biased by these variables. By building a tree ensemble, LV Forest takes parameter heterogeneity in latent variable modeling into account to capture subgroups with both good model fit and stable parameter estimates. We apply LV Forest to simulated data with heterogeneous model parameters as well as to real large-scale survey data. We show that LV Forest improves the accuracy of score estimation if parameter heterogeneity is present.","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.7,"publicationDate":"2024-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140581666","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
The Accuracy of Bayesian Model Fit Indices in Selecting Among Multidimensional Item Response Theory Models
IF 2.1, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-04-01 Epub Date: 2023-05-25 DOI: 10.1177/00131644231165520
Ken A Fujimoto, Carl F Falk
{"title":"The Accuracy of Bayesian Model Fit Indices in Selecting Among Multidimensional Item Response Theory Models.","authors":"Ken A Fujimoto, Carl F Falk","doi":"10.1177/00131644231165520","DOIUrl":"10.1177/00131644231165520","url":null,"abstract":"<p><p>Item response theory (IRT) models are often compared with respect to predictive performance to determine the dimensionality of rating scale data. However, such model comparisons could be biased toward nested-dimensionality IRT models (e.g., the bifactor model) when comparing those models with non-nested-dimensionality IRT models (e.g., a unidimensional or a between-item-dimensionality model). The reason is that, compared with non-nested-dimensionality models, nested-dimensionality models could have a greater propensity to fit data that do not represent a specific dimensional structure. However, it is unclear as to what degree model comparison results are biased toward nested-dimensionality IRT models when the data represent specific dimensional structures and when Bayesian estimation and model comparison indices are used. We conducted a simulation study to add clarity to this issue. We examined the accuracy of four Bayesian predictive performance indices at differentiating among non-nested- and nested-dimensionality IRT models. The deviance information criterion (DIC), a commonly used index to compare Bayesian models, was extremely biased toward nested-dimensionality IRT models, favoring them even when non-nested-dimensionality models were the correct models. The Pareto-smoothed importance sampling approximation of the leave-one-out cross-validation was the least biased, with the Watanabe information criterion and the log-predicted marginal likelihood closely following. The findings demonstrate that nested-dimensionality IRT models are not automatically favored when the data represent specific dimensional structures as long as an appropriate predictive performance index is used.</p>","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.1,"publicationDate":"2024-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11185105/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48119563","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
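The abstract above contrasts several Bayesian predictive performance indices. As a hedged illustration only (not the authors' code), the sketch below shows how PSIS-LOO and WAIC can be computed and compared for two fitted Bayesian models with the ArviZ library in Python; the model names and InferenceData objects are placeholders.

```python
# Sketch: comparing Bayesian IRT models by PSIS-LOO and WAIC with ArviZ.
# Assumes `idata_unidim` and `idata_bifactor` are arviz.InferenceData objects
# that contain pointwise log-likelihood values (hypothetical names).
import arviz as az

# PSIS-LOO (Pareto-smoothed importance sampling leave-one-out) per model
loo_unidim = az.loo(idata_unidim)        # reports elpd_loo, p_loo, Pareto-k diagnostics
loo_bifactor = az.loo(idata_bifactor)

# WAIC per model
waic_unidim = az.waic(idata_unidim)
waic_bifactor = az.waic(idata_bifactor)

# Rank the candidate models on a common index (here PSIS-LOO)
comparison = az.compare(
    {"unidimensional": idata_unidim, "bifactor": idata_bifactor},
    ic="loo",
)
print(comparison)  # higher elpd_loo indicates better expected predictive performance
```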
Dominance Analysis for Latent Variable Models: A Comparison of Methods With Categorical Indicators and Misspecified Models
IF 2.1, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-04-01 Epub Date: 2023-04-28 DOI: 10.1177/00131644231171751
W Holmes Finch
{"title":"Dominance Analysis for Latent Variable Models: A Comparison of Methods With Categorical Indicators and Misspecified Models.","authors":"W Holmes Finch","doi":"10.1177/00131644231171751","DOIUrl":"10.1177/00131644231171751","url":null,"abstract":"<p><p>Dominance analysis (DA) is a very useful tool for ordering independent variables in a regression model based on their relative importance in explaining variance in the dependent variable. This approach, which was originally described by Budescu, has recently been extended to use with structural equation models examining relationships among latent variables. Research demonstrated that this approach yields accurate results for latent variable models involving normally distributed indicator variables and correctly specified models. The purpose of the current simulation study was to compare the use of this DA approach to a method based on observed regression DA and DA when the latent variable model is estimated using two-stage least squares for latent variable models with categorical indicators and/or model misspecification. Results indicated that the DA approach for latent variable models can provide accurate ordering of the variables and correct hypothesis selection when indicators are categorical and models are misspecified. A discussion of implications from this study is provided.</p>","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.1,"publicationDate":"2024-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11185102/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44118004","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
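For readers unfamiliar with dominance analysis, the sketch below implements Budescu's classical general dominance weights for an observed-variable regression: each predictor's incremental R-squared is averaged over all subsets of the other predictors, within and then across subset sizes. This is only the observed-score baseline mentioned in the abstract, not the latent-variable or two-stage least squares variants the article studies, and the simulated data are made up for illustration.

```python
# Sketch: general dominance weights for an observed-variable regression
# (classical DA; feasible only for a modest number of predictors, since all
# 2^p subset models are fit).
from itertools import combinations
import numpy as np
from sklearn.linear_model import LinearRegression

def r_squared(X, y, cols):
    """R^2 of y regressed on the predictor columns in `cols` (empty set -> 0)."""
    if not cols:
        return 0.0
    Xs = X[:, list(cols)]
    return LinearRegression().fit(Xs, y).score(Xs, y)

def general_dominance(X, y):
    """Average incremental R^2 of each predictor across all subset sizes."""
    p = X.shape[1]
    weights = np.zeros(p)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        by_size = []
        for size in range(p):  # size of the subset that excludes predictor j
            incs = [
                r_squared(X, y, set(s) | {j}) - r_squared(X, y, set(s))
                for s in combinations(others, size)
            ]
            by_size.append(np.mean(incs))   # average within subset size
        weights[j] = np.mean(by_size)       # then average across sizes
    return weights                           # the weights sum to the full-model R^2

# Hypothetical usage with simulated predictors
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(size=500)
print(general_dominance(X, y))
```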
The Trade-Off Between Factor Score Determinacy and the Preservation of Inter-Factor Correlations
IF 2.1, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-04-01 Epub Date: 2023-04-29 DOI: 10.1177/00131644231171137
André Beauducel, Norbert Hilger, Tobias Kuhl
{"title":"The Trade-Off Between Factor Score Determinacy and the Preservation of Inter-Factor Correlations.","authors":"André Beauducel, Norbert Hilger, Tobias Kuhl","doi":"10.1177/00131644231171137","DOIUrl":"10.1177/00131644231171137","url":null,"abstract":"<p><p>Regression factor score predictors have the maximum factor score determinacy, that is, the maximum correlation with the corresponding factor, but they do not have the same inter-correlations as the factors. As it might be useful to compute factor score predictors that have the same inter-correlations as the factors, correlation-preserving factor score predictors have been proposed. However, correlation-preserving factor score predictors have smaller correlations with the corresponding factors (factor score determinacy) than regression factor score predictors. Thus, higher factor score determinacy goes along with bias of the inter-correlations and unbiased inter-correlations go along with lower factor score determinacy. The aim of the present study was therefore to investigate the size and conditions of the trade-off between factor score determinacy and bias of inter-correlations by means of algebraic considerations and a simulation study. It turns out that under several conditions very small gains of factor score determinacy of the regression factor score predictor go along with a large bias of inter-correlations. Instead of using the regression factor score predictor by default, it is proposed to check whether substantial bias of inter-correlations can be avoided without substantial loss of factor score determinacy using a correlation-preserving factor score predictor. A syntax that allows to compute correlation-preserving factor score predictors from regression factor score predictors, and to compare factor score determinacy and inter-correlations of the factor score predictors is given in the Appendix.</p>","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.1,"publicationDate":"2024-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11185104/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48879571","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
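The two quantities in this trade-off can be computed directly from standard factor-analytic algebra. The sketch below (illustrative loadings and factor correlation, not values from the article) builds the regression (Thurstone) factor score weights, the factor score determinacy coefficients, and the inter-correlation of the score predictors, which can then be compared with the factor correlation.

```python
# Sketch: regression factor score predictor, factor score determinacy, and the
# inter-correlation of the score predictors (hypothetical two-factor model).
import numpy as np

Lambda = np.array([[0.7, 0.0],
                   [0.6, 0.0],
                   [0.8, 0.0],
                   [0.0, 0.7],
                   [0.0, 0.6],
                   [0.0, 0.8]])                       # loadings
Phi = np.array([[1.0, 0.5],
                [0.5, 1.0]])                          # inter-factor correlations
Psi = np.diag(1 - np.diag(Lambda @ Phi @ Lambda.T))   # unique variances (unit-variance items)
Sigma = Lambda @ Phi @ Lambda.T + Psi                 # model-implied correlation matrix

# Regression (Thurstone) factor score weights: W = Sigma^{-1} Lambda Phi
W = np.linalg.solve(Sigma, Lambda @ Phi)

# Factor score determinacy: correlation of each score predictor with its factor
C = Phi @ Lambda.T @ np.linalg.solve(Sigma, Lambda @ Phi)
determinacy = np.sqrt(np.diag(C))
print("determinacy:", determinacy)

# Inter-correlation of the regression factor score predictors vs. the factor correlation
cov_scores = W.T @ Sigma @ W
corr_scores = cov_scores / np.sqrt(np.outer(np.diag(cov_scores), np.diag(cov_scores)))
print("predictor inter-correlation:", corr_scores[0, 1], "factor correlation:", Phi[0, 1])
```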
Fused SDT/IRT Models for Mixed-Format Exams
IF 2.7, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-03-28 DOI: 10.1177/00131644241235333
Lawrence T. DeCarlo
{"title":"Fused SDT/IRT Models for Mixed-Format Exams","authors":"Lawrence T. DeCarlo","doi":"10.1177/00131644241235333","DOIUrl":"https://doi.org/10.1177/00131644241235333","url":null,"abstract":"A psychological framework for different types of items commonly used with mixed-format exams is proposed. A choice model based on signal detection theory (SDT) is used for multiple-choice (MC) items, whereas an item response theory (IRT) model is used for open-ended (OE) items. The SDT and IRT models are shown to share a common conceptualization in terms of latent states of “know/don’t know” at the examinee level. This in turn suggests a way to join or “fuse” the models—through the probability of knowing. A general model that fuses the SDT choice model, for MC items, with a generalized sequential logit model, for OE items, is introduced. Fitting SDT and IRT models simultaneously allows one to examine possible differences in psychological processes across the different types of items, to examine the effects of covariates in both models simultaneously, to allow for relations among the model parameters, and likely offers potential estimation benefits. The utility of the approach is illustrated with MC and OE items from large-scale international exams.","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.7,"publicationDate":"2024-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140322190","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
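As a rough illustration of the "know/don't know" fusion idea described above (not the article's exact specification), the probability of a correct multiple-choice response can be written as a mixture over knowledge states; here a constant g_i stands in for the full SDT choice process over the response options, and a two-parameter logistic form is assumed for the probability of knowing.

```latex
% Illustrative fused formulation (assumed, not the article's model):
% \pi_i(\theta) = probability of knowing item i; g_i = probability of a correct
% response when the item is not known (from the choice process among options).
P(X_i = 1 \mid \theta) \;=\; \pi_i(\theta) + \bigl(1 - \pi_i(\theta)\bigr)\, g_i,
\qquad
\pi_i(\theta) \;=\; \frac{\exp\{a_i(\theta - b_i)\}}{1 + \exp\{a_i(\theta - b_i)\}}.
```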
Examining the Dynamic of Clustering Effects in Multilevel Designs: A Latent Variable Method Application
IF 2.7, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-02-21 DOI: 10.1177/00131644241228602
Tenko Raykov, Ahmed Haddadi, Christine DiStefano, Mohammed Alqabbaa
{"title":"Examining the Dynamic of Clustering Effects in Multilevel Designs: A Latent Variable Method Application","authors":"Tenko Raykov, Ahmed Haddadi, Christine DiStefano, Mohammed Alqabbaa","doi":"10.1177/00131644241228602","DOIUrl":"https://doi.org/10.1177/00131644241228602","url":null,"abstract":"This note is concerned with the study of temporal development in several indices reflecting clustering effects in multilevel designs that are frequently utilized in educational and behavioral research. A latent variable method-based approach is outlined, which can be used to point and interval estimate the growth or decline in important functions of level-specific variances in two-level and three-level settings. The procedure may also be employed for the purpose of examining stability over time in clustering effects. The method can be utilized with widely circulated latent variable modeling software, and is illustrated using empirical examples.","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.7,"publicationDate":"2024-02-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139954004","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
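A familiar index of clustering effects in a two-level design is the intraclass correlation, the share of total variance located at the cluster level. The sketch below estimates it from a random-intercept model with statsmodels on simulated data; this is only a conventional single-occasion illustration, not the latent variable procedure the note develops for tracking change over time.

```python
# Sketch: intraclass correlation (ICC) for a two-level design from a
# random-intercept model (simulated data; values are hypothetical).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_clusters, n_per = 100, 20
cluster = np.repeat(np.arange(n_clusters), n_per)
u = rng.normal(0, np.sqrt(0.25), n_clusters)                       # between-cluster variance 0.25
y = u[cluster] + rng.normal(0, np.sqrt(0.75), n_clusters * n_per)  # within-cluster variance 0.75
data = pd.DataFrame({"y": y, "cluster": cluster})

# Random-intercept model: y_ij = beta_0 + u_j + e_ij
fit = smf.mixedlm("y ~ 1", data, groups=data["cluster"]).fit()
tau2 = float(fit.cov_re.iloc[0, 0])   # estimated between-cluster variance
sigma2 = fit.scale                    # estimated within-cluster (residual) variance
icc = tau2 / (tau2 + sigma2)
print(f"ICC = {icc:.3f}")             # should be near 0.25 / (0.25 + 0.75) = 0.25
```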
Evaluating Close Fit in Ordinal Factor Analysis Models With Multiply Imputed Data
IF 2.7, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-02-01 Epub Date: 2023-03-27 DOI: 10.1177/00131644231158854
Dexin Shi, Bo Zhang, Ren Liu, Zhehan Jiang
{"title":"Evaluating Close Fit in Ordinal Factor Analysis Models With Multiply Imputed Data.","authors":"Dexin Shi, Bo Zhang, Ren Liu, Zhehan Jiang","doi":"10.1177/00131644231158854","DOIUrl":"10.1177/00131644231158854","url":null,"abstract":"<p><p>Multiple imputation (MI) is one of the recommended techniques for handling missing data in ordinal factor analysis models. However, methods for computing MI-based fit indices under ordinal factor analysis models have yet to be developed. In this short note, we introduced the methods of using the standardized root mean squared residual (SRMR) and the root mean square error of approximation (RMSEA) to assess the fit of ordinal factor analysis models with multiply imputed data. Specifically, we described the procedure for computing the MI-based sample estimates and constructing the confidence intervals. Simulation results showed that the proposed methods could yield sufficiently accurate point and interval estimates for both SRMR and RMSEA, especially in conditions with larger sample sizes, less missing data, more response categories, and higher degrees of misfit. Based on the findings, implications and recommendations were discussed.</p>","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.7,"publicationDate":"2024-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10795567/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48895403","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
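To make the index concrete, the sketch below computes a common variant of the SRMR from a sample and a model-implied correlation matrix and then naively averages it over multiply imputed data sets. The pooling rule, the matrices, and the variant of the residual summary used here are assumptions for illustration; the note's actual MI-based estimators and confidence intervals may differ.

```python
# Sketch: SRMR from sample vs. model-implied correlation matrices, naively pooled
# over M imputed data sets (one common variant; some definitions also count the
# diagonal elements in the denominator).
import numpy as np

def srmr(sample_corr, implied_corr):
    """Root mean square of the residual correlations (strict lower triangle)."""
    idx = np.tril_indices(sample_corr.shape[0], k=-1)
    resid = sample_corr[idx] - implied_corr[idx]
    return np.sqrt(np.mean(resid ** 2))

def pooled_srmr(imputed_corrs, implied_corr):
    """Average the SRMR estimate over the imputed data sets."""
    return np.mean([srmr(S, implied_corr) for S in imputed_corrs])

# Hypothetical usage: a made-up implied matrix and M = 3 perturbed sample matrices
implied = np.array([[1.00, 0.49, 0.42],
                    [0.49, 1.00, 0.42],
                    [0.42, 0.42, 1.00]])
rng = np.random.default_rng(0)
imputed = []
for _ in range(3):
    noise = rng.normal(0, 0.02, size=(3, 3))
    noise = (noise + noise.T) / 2
    np.fill_diagonal(noise, 0.0)
    imputed.append(implied + noise)

print(f"pooled SRMR = {pooled_srmr(imputed, implied):.4f}")
```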
Are the Steps on Likert Scales Equidistant? Responses on Visual Analog Scales Allow Estimating Their Distances
IF 2.7, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-02-01 Epub Date: 2023-04-04 DOI: 10.1177/00131644231164316
Miguel A García-Pérez
{"title":"Are the Steps on Likert Scales Equidistant? Responses on Visual Analog Scales Allow Estimating Their Distances.","authors":"Miguel A García-Pérez","doi":"10.1177/00131644231164316","DOIUrl":"10.1177/00131644231164316","url":null,"abstract":"<p><p>A recurring question regarding Likert items is whether the discrete steps that this response format allows represent constant increments along the underlying continuum. This question appears unsolvable because Likert responses carry no direct information to this effect. Yet, any item administered in Likert format can identically be administered with a continuous response format such as a visual analog scale (VAS) in which respondents mark a position along a continuous line. Then, the operating characteristics of the item would manifest under both VAS and Likert formats, although perhaps differently as captured by the continuous response model (CRM) and the graded response model (GRM) in item response theory. This article shows that CRM and GRM item parameters hold a formal relation that is mediated by the form in which the continuous dimension is partitioned into intervals to render the discrete Likert responses. Then, CRM and GRM characterizations of the items in a test administered with VAS and Likert formats allow estimating the boundaries of the partition that renders Likert responses for each item and, thus, the distance between consecutive steps. The validity of this approach is first documented via simulation studies. Subsequently, the same approach is used on public data from three personality scales with 12, eight, and six items, respectively. The results indicate the expected correspondence between VAS and Likert responses and reveal unequal distances between successive pairs of Likert steps that also vary greatly across items. Implications for the scoring of Likert items are discussed.</p>","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.7,"publicationDate":"2024-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10795572/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48541417","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
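The core idea, that paired VAS and Likert administrations let one recover an unequal partition of the response continuum, can be illustrated with a toy simulation. The sketch below uses a crude empirical boundary estimate (category transition points) on simulated paired responses; the article instead derives the boundaries from the formal CRM/GRM parameter relation, so this is only a conceptual illustration under assumed values.

```python
# Sketch: unequal Likert step widths illustrated with simulated paired VAS/Likert
# responses (crude transition-point estimator; not the article's CRM/GRM method).
import numpy as np

rng = np.random.default_rng(42)
n = 5000
vas = rng.uniform(0, 100, n)                 # VAS responses on a 0-100 line
boundaries = np.array([15, 30, 70, 90])      # true (unequal) category cuts, assumed
likert = np.digitize(vas, boundaries) + 1    # resulting 5-point Likert responses

# Boundary estimate: midpoint between the largest VAS response in category c
# and the smallest VAS response in category c + 1.
est = []
for c in range(1, 5):
    upper_c = vas[likert == c].max()
    lower_next = vas[likert == c + 1].min()
    est.append((upper_c + lower_next) / 2)

print("estimated boundaries:", np.round(est, 1))
print("estimated step widths:", np.round(np.diff([0, *est, 100]), 1))  # clearly unequal
```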
Equating Oral Reading Fluency Scores: A Model-Based Approach
IF 2.7, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-02-01 Epub Date: 2023-01-05 DOI: 10.1177/00131644221148122
Yusuf Kara, Akihito Kamata, Xin Qiao, Cornelis J Potgieter, Joseph F T Nese
{"title":"Equating Oral Reading Fluency Scores: A Model-Based Approach.","authors":"Yusuf Kara, Akihito Kamata, Xin Qiao, Cornelis J Potgieter, Joseph F T Nese","doi":"10.1177/00131644221148122","DOIUrl":"10.1177/00131644221148122","url":null,"abstract":"<p><p>Words read correctly per minute (WCPM) is the reporting score metric in oral reading fluency (ORF) assessments, which is popularly utilized as part of curriculum-based measurements to screen at-risk readers and to monitor progress of students who receive interventions. Just like other types of assessments with multiple forms, equating would be necessary when WCPM scores are obtained from multiple ORF passages to be compared both between and within students. This article proposes a model-based approach for equating WCPM scores. A simulation study was conducted to evaluate the performance of the model-based equating approach along with some observed-score equating methods with external anchor test design.</p>","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.7,"publicationDate":"2024-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10795571/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41418799","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
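As a point of reference for the observed-score comparison methods mentioned in the abstract, the sketch below performs simple mean-sigma linear equating of WCPM scores from one passage onto the scale of another, under a random-groups assumption and with simulated scores. The article's model-based approach and its external anchor design are not reproduced here.

```python
# Sketch: classical linear (mean-sigma) observed-score equating of WCPM scores
# from passage X to the scale of passage Y (simulated data, random-groups assumption).
import numpy as np

rng = np.random.default_rng(3)
wcpm_x = rng.normal(110, 25, 400)   # scores on the harder passage X
wcpm_y = rng.normal(125, 30, 400)   # scores on the easier passage Y

def linear_equate(x, ref_x, ref_y):
    """Map score x from the X scale to the Y scale by matching means and SDs."""
    return ref_y.mean() + ref_y.std(ddof=1) / ref_x.std(ddof=1) * (x - ref_x.mean())

print(linear_equate(100, wcpm_x, wcpm_y))  # a WCPM of 100 on X expressed on the Y scale
```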
Artificial Neural Networks for Short-Form Development of Psychometric Tests: A Study on Synthetic Populations Using Autoencoders
IF 2.7, CAS Tier 3, Psychology
Educational and Psychological Measurement Pub Date: 2024-02-01 Epub Date: 2023-04-15 DOI: 10.1177/00131644231164363
Monica Casella, Pasquale Dolce, Michela Ponticorvo, Nicola Milano, Davide Marocco
{"title":"Artificial Neural Networks for Short-Form Development of Psychometric Tests: A Study on Synthetic Populations Using Autoencoders.","authors":"Monica Casella, Pasquale Dolce, Michela Ponticorvo, Nicola Milano, Davide Marocco","doi":"10.1177/00131644231164363","DOIUrl":"10.1177/00131644231164363","url":null,"abstract":"<p><p>Short-form development is an important topic in psychometric research, which requires researchers to face methodological choices at different steps. The statistical techniques traditionally used for shortening tests, which belong to the so-called exploratory model, make assumptions not always verified in psychological data. This article proposes a machine learning-based autonomous procedure for short-form development that combines explanatory and predictive techniques in an integrative approach. The study investigates the item-selection performance of two autoencoders: a particular type of artificial neural network that is comparable to principal component analysis. The procedure is tested on artificial data simulated from a factor-based population and is compared with existent computational approaches to develop short forms. Autoencoders require mild assumptions on data characteristics and provide a method to predict long-form items' responses from the short form. Indeed, results show that they can help the researcher to develop a short form by automatically selecting a subset of items that better reconstruct the original item's responses and that preserve the internal structure of the long-form.</p>","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":null,"pages":null},"PeriodicalIF":2.7,"publicationDate":"2024-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10795568/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41603099","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
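To show what an autoencoder on item-response data looks like in practice, the sketch below trains a small linear autoencoder (the kind the abstract describes as comparable to principal component analysis) on simulated factor-model responses. The architecture, bottleneck size, training settings, and the item-selection comment at the end are illustrative assumptions, not the article's procedure.

```python
# Sketch: a linear autoencoder on simulated item responses (PyTorch).
import torch
from torch import nn

n_items, n_factors = 24, 3          # hypothetical long form with 24 items

class ItemAutoencoder(nn.Module):
    def __init__(self, n_items: int, n_latent: int):
        super().__init__()
        self.encoder = nn.Linear(n_items, n_latent)   # linear AE spans a PCA-like subspace
        self.decoder = nn.Linear(n_latent, n_items)

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Simulated responses from a simple linear factor model (stand-in for real data)
torch.manual_seed(0)
theta = torch.randn(2000, n_factors)
loadings = torch.rand(n_factors, n_items) * 0.6 + 0.2
responses = theta @ loadings + 0.5 * torch.randn(2000, n_items)

model = ItemAutoencoder(n_items, n_factors)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(responses), responses)
    loss.backward()
    optimizer.step()

print(f"reconstruction MSE after training: {loss.item():.3f}")
# A short form could then be built by searching for the item subset whose responses
# best reconstruct the full item set through the trained network (one possible criterion).
```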