A Systematic Evaluation of Wording Effects Modeling Under the Exploratory Structural Equation Modeling Framework

Luis Eduardo Garrido, Alexander P. Christensen, Hudson Golino, Agustín Martínez-Molina, Víctor B. Arias, Kiero Guerra-Peña, María Dolores Nieto-Cañaveras, Flávio Azevedo, Francisco J. Abad

Multivariate Behavioral Research, published online 2025-09-08, pp. 1-30. DOI: 10.1080/00273171.2025.2545362
Abstract
Wording effects, the systematic method variance arising from the inconsistent responding to positively and negatively worded items of the same construct, are pervasive in the behavioral and health sciences. Although several factor modeling strategies have been proposed to mitigate their adverse effects, there is limited systematic research assessing their performance with exploratory structural equation models (ESEM). The present study evaluated the impact of different types of response bias related to wording effects (random and straight-line carelessness, acquiescence, item difficulty, and mixed) on ESEM models incorporating two popular method modeling strategies, the correlated traits-correlated methods minus one (CTC[M-1]) model and random intercept item factor analysis (RIIFA), as well as the "do nothing" approach. Five variables were manipulated using Monte Carlo methods: the type and magnitude of response bias, factor loadings, factor correlations, and sample size. Overall, the results showed that ignoring wording effects leads to poor model fit and serious distortions of the ESEM estimates. The RIIFA approach generally performed best at countering these adverse impacts and recovering unbiased factor structures, whereas the CTC(M-1) models struggled when biases affected both positively and negatively worded items. Our findings also indicated that method factors can sometimes reflect or absorb substantive variance, which may blur their associations with external variables and complicate their interpretation when embedded in broader structural models. A straightforward guide is offered to applied researchers who wish to use ESEM with mixed-worded scales.
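As a rough orientation to the two method modeling strategies named in the abstract, the sketch below gives the measurement equation typically associated with a random intercept item factor analysis (RIIFA) specification; it is a minimal illustration under common assumptions (unit loadings on the random intercept, zero covariance with the content factors), not necessarily the exact parameterization used in the article.

\[
y_{ij} = \nu_j + \sum_{k=1}^{K} \lambda_{jk}\,\eta_{ik} + \theta_i + \varepsilon_{ij},
\qquad
\operatorname{Var}(\theta_i) = \sigma^2_{\theta},
\qquad
\operatorname{Cov}(\theta_i, \eta_{ik}) = 0,
\]

where \(y_{ij}\) is person \(i\)'s response to item \(j\), the \(\eta_{ik}\) are the content (trait) factors with loadings \(\lambda_{jk}\), and the random intercept \(\theta_i\) enters every item with a loading fixed to 1, so it absorbs uniform response styles such as acquiescence. In the CTC(M-1) specification, by contrast, the method factor is estimated with free loadings on only one keying direction (commonly the negatively worded items), with the other keying direction serving as the reference method.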
Journal description:
Multivariate Behavioral Research (MBR) publishes a variety of substantive, methodological, and theoretical articles in all areas of the social and behavioral sciences. Most MBR articles fall into one of two categories. Substantive articles report on applications of sophisticated multivariate research methods to study topics of substantive interest in personality, health, intelligence, industrial/organizational, and other behavioral science areas. Methodological articles present and/or evaluate new developments in multivariate methods, or address methodological issues in current research. We also encourage submission of integrative articles related to pedagogy involving multivariate research methods, and to historical treatments of interest and relevance to multivariate research methods.