Multiverse simulation to explore the impact of analytical choices on type I and type II errors in a reaction time study

Miklos Bognar, Marton A Varga, Don van Ravenzwaaij, Zoltan Kekecs, James A Grange, Mate Gyurkovics, Balazs Aczel

Behavior Research Methods, 57(10), 291. Published 2025-09-18 (Journal Article).
DOI: 10.3758/s13428-025-02807-y
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12446153/pdf/
Impact factor: 3.9; JCR Q1 (Psychology, Experimental)
Citations: 0
Abstract
Researcher degrees of freedom in data analysis present significant challenges in social sciences, where different analytical decisions can lead to varying conclusions. In this work, we propose an example of an exploratory multiverse simulation to empirically compare various decision pathways to identify an effect's sensitivity to different analytical choices. The approach is demonstrated on the congruency sequence effect (CSE), a well-studied phenomenon in cognitive control research. We reviewed existing literature to identify common non-theory-specific analytical decisions, such as outlier exclusion criteria and hypothesis testing methods, and incorporated these into our simulation framework. Using 20,000 simulated datasets, we compared the true positive rates (TPR) and false positive rates (FPR) across 50 different decision pathways, resulting in a total of 1 million analyses. Our results indicate substantial differences in power and type I error rates across the analytical pathways, with some posing a significant risk of producing high false positives. The findings underscore the importance of running extensive simulations to investigate different data handling and hypothesis testing approaches in certain research fields. This case study serves as an example for conducting similar simulation procedures in research fields characterized by high variability in analytical decisions when investigating an otherwise widely accepted effect.
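The procedure the abstract describes can be illustrated with a minimal sketch: simulate many datasets with and without a true effect, push each through several analytical pathways (here, two hypothetical outlier-exclusion rules combined with a one-sample t-test), and tally how often each pathway declares significance. All concrete numbers and rules below (30 subjects, a 15 ms effect, a 2.5 SD cut) are illustrative assumptions, not the paper's actual simulation parameters or decision pathways.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def simulate_dataset(n_subjects=30, effect=0.0):
    # Per-subject CSE estimate: mean RT difference (ms) between
    # conditions, with assumed between-subject noise of 30 ms.
    return rng.normal(loc=effect, scale=30.0, size=n_subjects)

# Two hypothetical exclusion pathways: keep everything, or drop
# subjects beyond +/- 2.5 SD from the sample mean.
def no_exclusion(x):
    return x

def sd_exclusion(x, k=2.5):
    m, s = x.mean(), x.std(ddof=1)
    return x[np.abs(x - m) <= k * s]

pathways = {"none": no_exclusion, "2.5sd": sd_exclusion}

def run(n_sims=2000, effect=0.0, alpha=0.05):
    # Fraction of significant one-sample t-tests per pathway.
    hits = {name: 0 for name in pathways}
    for _ in range(n_sims):
        data = simulate_dataset(effect=effect)
        for name, rule in pathways.items():
            p = stats.ttest_1samp(rule(data), 0.0).pvalue
            hits[name] += p < alpha
    return {name: h / n_sims for name, h in hits.items()}

fpr = run(effect=0.0)    # null datasets: false positive rate
tpr = run(effect=15.0)   # true effect present: true positive rate
print("FPR per pathway:", fpr)
print("TPR per pathway:", tpr)
```

Scaling this pattern to the study's 50 pathways and 20,000 datasets is a matter of enlarging the `pathways` dictionary (crossing exclusion criteria with hypothesis-testing methods) and the simulation loop; the comparison of interest stays the same: FPR against the nominal alpha, and TPR across pathways.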
Journal description:
Behavior Research Methods publishes articles concerned with the methods, techniques, and instrumentation of research in experimental psychology. The journal focuses particularly on the use of computer technology in psychological research. An annual special issue is devoted to this field.