Contributing to the reproducibility crisis in Psychology: The role of statistical software choice on factor analysis

IF 3.8 · Zone 1 (Psychology) · Q1 PSYCHOLOGY, SOCIAL
Stefan C. Dombrowski
{"title":"促成心理学的可再现性危机:统计软件选择在因素分析中的作用","authors":"Stefan C. Dombrowski","doi":"10.1016/j.jsp.2025.101462","DOIUrl":null,"url":null,"abstract":"<div><div>A potentially overlooked contributor to the reproducibility crisis in psychology is the choice of statistical application software used for factor analysis. Although the open science movement promotes transparency by advocating for open access to data and statistical methods, this approach alone is insufficient to address the reproducibility crisis. It is commonly assumed that different statistical software applications produce equivalent results when conducting the same statistical analysis. However, this is not necessarily the case. Statistical programs often yield disparate outcomes, even when using identical data and factor analytic procedures, which can lead to inconsistent interpretation of results. This study examines this phenomenon by conducting exploratory factor analyses on two tests of cognitive ability—the WISC-V and the MEZURE—using four different statistical programs/applications. Factor analysis plays a critical role in determining the underlying theory of cognitive ability instruments, and guides how those instruments should be scored and interpreted. However, psychology is grappling with a reproducibility crisis in this area, as independent researchers and test publishers frequently report divergent factor analytic results. The outcome of this study revealed significant variations in structural outcomes among the statistical software programs/applications. These findings highlight the importance of using multiple statistical programs, ensuring transparency with analysis code, and recognizing the potential for varied outcomes when interpreting results from factor analytic procedures. Addressing these issues is important for advancing scientific integrity and mitigating the reproducibility crisis in psychology particularly in relation to cognitive ability structural validity.</div></div>","PeriodicalId":48232,"journal":{"name":"Journal of School Psychology","volume":"110 ","pages":"Article 101462"},"PeriodicalIF":3.8000,"publicationDate":"2025-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Contributing to the reproducibility crisis in Psychology: The role of statistical software choice on factor analysis\",\"authors\":\"Stefan C. Dombrowski\",\"doi\":\"10.1016/j.jsp.2025.101462\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>A potentially overlooked contributor to the reproducibility crisis in psychology is the choice of statistical application software used for factor analysis. Although the open science movement promotes transparency by advocating for open access to data and statistical methods, this approach alone is insufficient to address the reproducibility crisis. It is commonly assumed that different statistical software applications produce equivalent results when conducting the same statistical analysis. However, this is not necessarily the case. Statistical programs often yield disparate outcomes, even when using identical data and factor analytic procedures, which can lead to inconsistent interpretation of results. This study examines this phenomenon by conducting exploratory factor analyses on two tests of cognitive ability—the WISC-V and the MEZURE—using four different statistical programs/applications. 
Factor analysis plays a critical role in determining the underlying theory of cognitive ability instruments, and guides how those instruments should be scored and interpreted. However, psychology is grappling with a reproducibility crisis in this area, as independent researchers and test publishers frequently report divergent factor analytic results. The outcome of this study revealed significant variations in structural outcomes among the statistical software programs/applications. These findings highlight the importance of using multiple statistical programs, ensuring transparency with analysis code, and recognizing the potential for varied outcomes when interpreting results from factor analytic procedures. Addressing these issues is important for advancing scientific integrity and mitigating the reproducibility crisis in psychology particularly in relation to cognitive ability structural validity.</div></div>\",\"PeriodicalId\":48232,\"journal\":{\"name\":\"Journal of School Psychology\",\"volume\":\"110 \",\"pages\":\"Article 101462\"},\"PeriodicalIF\":3.8000,\"publicationDate\":\"2025-04-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of School Psychology\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0022440525000354\",\"RegionNum\":1,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, SOCIAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of School Psychology","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0022440525000354","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, SOCIAL","Score":null,"Total":0}
Citations: 0

Abstract

A potentially overlooked contributor to the reproducibility crisis in psychology is the choice of statistical application software used for factor analysis. Although the open science movement promotes transparency by advocating for open access to data and statistical methods, this approach alone is insufficient to address the reproducibility crisis. It is commonly assumed that different statistical software applications produce equivalent results when conducting the same statistical analysis. However, this is not necessarily the case. Statistical programs often yield disparate outcomes, even when using identical data and factor analytic procedures, which can lead to inconsistent interpretation of results. This study examines this phenomenon by conducting exploratory factor analyses on two tests of cognitive ability, the WISC-V and the MEZURE, using four different statistical programs/applications. Factor analysis plays a critical role in determining the underlying theory of cognitive ability instruments and guides how those instruments should be scored and interpreted. However, psychology is grappling with a reproducibility crisis in this area, as independent researchers and test publishers frequently report divergent factor analytic results. The outcome of this study revealed significant variations in structural outcomes among the statistical software programs/applications. These findings highlight the importance of using multiple statistical programs, ensuring transparency with analysis code, and recognizing the potential for varied outcomes when interpreting results from factor analytic procedures. Addressing these issues is important for advancing scientific integrity and mitigating the reproducibility crisis in psychology, particularly in relation to cognitive ability structural validity.
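To make the core point concrete, here is a minimal sketch (not the author's code, and not necessarily one of the four programs compared in the study) showing how analysis settings that differ across software defaults, such as the extraction method and rotation, can produce different loading patterns from identical data. It uses the Python factor_analyzer package on simulated scores; all variable names and values are hypothetical.

```python
# Minimal illustrative sketch (assumed setup, not the study's actual analysis):
# the same simulated data run through two EFA configurations that mirror
# common software defaults: minres extraction with varimax rotation versus
# principal-axis extraction with promax rotation.
import numpy as np
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

rng = np.random.default_rng(0)

# Simulate 10 "subtest" scores driven by two correlated latent factors.
latent = rng.multivariate_normal(mean=[0, 0], cov=[[1, 0.4], [0.4, 1]], size=500)
true_loadings = np.array([[0.7, 0.0]] * 5 + [[0.0, 0.7]] * 5)  # shape (10, 2)
scores = latent @ true_loadings.T + rng.normal(scale=0.5, size=(500, 10))

for method, rotation in [("minres", "varimax"), ("principal", "promax")]:
    fa = FactorAnalyzer(n_factors=2, method=method, rotation=rotation)
    fa.fit(scores)
    print(f"{method} + {rotation} loadings:")
    print(np.round(fa.loadings_, 2))  # loading patterns differ across configurations
```

Differences of this kind, along with convergence criteria, communality starting values, and factor-retention defaults, are plausible sources of the cross-program variation the abstract describes.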
Source journal
Journal of School Psychology (PSYCHOLOGY, EDUCATIONAL)
CiteScore: 6.70
Self-citation rate: 8.00%
Articles published per year: 71
Journal introduction: The Journal of School Psychology publishes original empirical articles and critical reviews of the literature on research and practices relevant to psychological and behavioral processes in school settings. JSP presents research on intervention mechanisms and approaches; schooling effects on the development of social, cognitive, mental-health, and achievement-related outcomes; assessment; and consultation. Submissions from a variety of disciplines are encouraged. All manuscripts are read by the Editor and one or more editorial consultants with the intent of providing appropriate and constructive written reviews.