{"title":"促成心理学的可再现性危机:统计软件选择在因素分析中的作用","authors":"Stefan C. Dombrowski","doi":"10.1016/j.jsp.2025.101462","DOIUrl":null,"url":null,"abstract":"<div><div>A potentially overlooked contributor to the reproducibility crisis in psychology is the choice of statistical application software used for factor analysis. Although the open science movement promotes transparency by advocating for open access to data and statistical methods, this approach alone is insufficient to address the reproducibility crisis. It is commonly assumed that different statistical software applications produce equivalent results when conducting the same statistical analysis. However, this is not necessarily the case. Statistical programs often yield disparate outcomes, even when using identical data and factor analytic procedures, which can lead to inconsistent interpretation of results. This study examines this phenomenon by conducting exploratory factor analyses on two tests of cognitive ability—the WISC-V and the MEZURE—using four different statistical programs/applications. Factor analysis plays a critical role in determining the underlying theory of cognitive ability instruments, and guides how those instruments should be scored and interpreted. However, psychology is grappling with a reproducibility crisis in this area, as independent researchers and test publishers frequently report divergent factor analytic results. The outcome of this study revealed significant variations in structural outcomes among the statistical software programs/applications. These findings highlight the importance of using multiple statistical programs, ensuring transparency with analysis code, and recognizing the potential for varied outcomes when interpreting results from factor analytic procedures. Addressing these issues is important for advancing scientific integrity and mitigating the reproducibility crisis in psychology particularly in relation to cognitive ability structural validity.</div></div>","PeriodicalId":48232,"journal":{"name":"Journal of School Psychology","volume":"110 ","pages":"Article 101462"},"PeriodicalIF":3.8000,"publicationDate":"2025-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Contributing to the reproducibility crisis in Psychology: The role of statistical software choice on factor analysis\",\"authors\":\"Stefan C. Dombrowski\",\"doi\":\"10.1016/j.jsp.2025.101462\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>A potentially overlooked contributor to the reproducibility crisis in psychology is the choice of statistical application software used for factor analysis. Although the open science movement promotes transparency by advocating for open access to data and statistical methods, this approach alone is insufficient to address the reproducibility crisis. It is commonly assumed that different statistical software applications produce equivalent results when conducting the same statistical analysis. However, this is not necessarily the case. Statistical programs often yield disparate outcomes, even when using identical data and factor analytic procedures, which can lead to inconsistent interpretation of results. This study examines this phenomenon by conducting exploratory factor analyses on two tests of cognitive ability—the WISC-V and the MEZURE—using four different statistical programs/applications. 
Factor analysis plays a critical role in determining the underlying theory of cognitive ability instruments, and guides how those instruments should be scored and interpreted. However, psychology is grappling with a reproducibility crisis in this area, as independent researchers and test publishers frequently report divergent factor analytic results. The outcome of this study revealed significant variations in structural outcomes among the statistical software programs/applications. These findings highlight the importance of using multiple statistical programs, ensuring transparency with analysis code, and recognizing the potential for varied outcomes when interpreting results from factor analytic procedures. Addressing these issues is important for advancing scientific integrity and mitigating the reproducibility crisis in psychology particularly in relation to cognitive ability structural validity.</div></div>\",\"PeriodicalId\":48232,\"journal\":{\"name\":\"Journal of School Psychology\",\"volume\":\"110 \",\"pages\":\"Article 101462\"},\"PeriodicalIF\":3.8000,\"publicationDate\":\"2025-04-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of School Psychology\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0022440525000354\",\"RegionNum\":1,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, SOCIAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of School Psychology","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0022440525000354","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, SOCIAL","Score":null,"Total":0}
Contributing to the reproducibility crisis in psychology: The role of statistical software choice on factor analysis
A potentially overlooked contributor to the reproducibility crisis in psychology is the choice of statistical software used for factor analysis. Although the open science movement promotes transparency by advocating open access to data and statistical methods, this alone is insufficient to address the reproducibility crisis. It is commonly assumed that different statistical software applications produce equivalent results when conducting the same statistical analysis, but this is not necessarily the case. Statistical programs often yield disparate outcomes even when identical data and factor analytic procedures are used, which can lead to inconsistent interpretation of results. This study examines this phenomenon by conducting exploratory factor analyses of two tests of cognitive ability, the WISC-V and the MEZURE, using four different statistical programs. Factor analysis plays a critical role in determining the underlying theory of cognitive ability instruments and guides how those instruments should be scored and interpreted. However, psychology is grappling with a reproducibility crisis in this area, as independent researchers and test publishers frequently report divergent factor analytic results. The results of this study revealed significant variation in structural outcomes across the statistical programs. These findings highlight the importance of using multiple statistical programs, ensuring transparency by sharing analysis code, and recognizing the potential for varied outcomes when interpreting results from factor analytic procedures. Addressing these issues is important for advancing scientific integrity and mitigating the reproducibility crisis in psychology, particularly in relation to the structural validity of cognitive ability instruments.
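To make the comparison strategy concrete, below is a minimal sketch, not the study's code (the abstract does not reproduce it), that fits the same exploratory factor analysis with two independent Python implementations and measures how far the rotated loading matrices diverge. The synthetic data, the choice of scikit-learn's FactorAnalysis and the factor_analyzer package, and the maximum-likelihood/varimax settings are illustrative assumptions only.

import numpy as np
from sklearn.decomposition import FactorAnalysis
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(0)

# Simulate 500 examinees on 10 subtests driven by 2 latent factors
# (a hypothetical stand-in for WISC-V/MEZURE data, which are not public here).
n_obs, n_vars, n_factors = 500, 10, 2
true_loadings = rng.uniform(0.4, 0.8, size=(n_vars, n_factors))
X = rng.normal(size=(n_obs, n_factors)) @ true_loadings.T \
    + rng.normal(scale=0.6, size=(n_obs, n_vars))

# Standardize so both packages effectively analyze the correlation matrix.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)

# Implementation 1: scikit-learn, maximum likelihood extraction, varimax rotation.
load_skl = FactorAnalysis(n_components=n_factors, rotation="varimax",
                          random_state=0).fit(Xz).components_.T

# Implementation 2: factor_analyzer, maximum likelihood extraction, varimax rotation.
load_fa = FactorAnalyzer(n_factors=n_factors, method="ml",
                         rotation="varimax").fit(Xz).loadings_

# Factor order and sign are arbitrary across implementations, so align
# each factor_analyzer column with its best-matching scikit-learn column.
aligned = np.empty_like(load_fa)
for j in range(n_factors):
    r = [np.corrcoef(load_skl[:, j], load_fa[:, k])[0, 1]
         for k in range(n_factors)]
    k = int(np.argmax(np.abs(r)))
    aligned[:, j] = np.sign(r[k]) * load_fa[:, k]

# Nominally identical analyses can still disagree; report the largest gap.
print("max |loading difference|:", np.max(np.abs(load_skl - aligned)))

Even with nominally matched settings, packages differ in defaults such as starting values, convergence tolerances, and rotation algorithms, so a subtest's loading can fall on different sides of a salience threshold (e.g., .30) depending on the software used. Divergence of that kind across implementations is what the study documents.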
Journal overview:
The Journal of School Psychology publishes original empirical articles and critical reviews of the literature on research and practices relevant to psychological and behavioral processes in school settings. JSP presents research on intervention mechanisms and approaches; schooling effects on the development of social, cognitive, mental-health, and achievement-related outcomes; assessment; and consultation. Submissions from a variety of disciplines are encouraged. All manuscripts are read by the Editor and one or more editorial consultants with the intent of providing appropriate and constructive written reviews.