Examining the replicability of online experiments selected by a decision market

Impact Factor: 21.4 · CAS Tier 1 (Psychology) · JCR Q1, Multidisciplinary Sciences
Felix Holzmeister, Magnus Johannesson, Colin F. Camerer, Yiling Chen, Teck-Hua Ho, Suzanne Hoogeveen, Juergen Huber, Noriko Imai, Taisuke Imai, Lawrence Jin, Michael Kirchler, Alexander Ly, Benjamin Mandl, Dylan Manfredi, Gideon Nave, Brian A. Nosek, Thomas Pfeiffer, Alexandra Sarafoglou, Rene Schwaiger, Eric-Jan Wagenmakers, Viking Waldén, Anna Dreber
DOI: 10.1038/s41562-024-02062-9
Journal: Nature Human Behaviour · Published: 2024-11-19 · Journal Article
Citations: 0

Abstract

Here we test the feasibility of using decision markets to select studies for replication and provide evidence about the replicability of online experiments. Social scientists (n = 162) traded on the outcome of close replications of 41 systematically selected MTurk social science experiments published in PNAS 2015–2018, knowing that the 12 studies with the lowest and the 12 with the highest final market prices would be selected for replication, along with 2 randomly selected studies. The replication rate, based on the statistical significance indicator, was 83% for the top-12 and 33% for the bottom-12 group. Overall, 54% of the studies were successfully replicated, with replication effect size estimates averaging 45% of the original effect size estimates. The replication rate varied between 54% and 62% for alternative replication indicators. The observed replicability of MTurk experiments is comparable to that of previous systematic replication projects involving laboratory experiments.
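The reported rates can be cross-checked against the study counts. A minimal sketch, assuming the underlying counts implied by the rounded percentages (10 of 12 top-ranked, 4 of 12 bottom-ranked, and 14 of all 26 selected studies, i.e. 12 + 12 + 2); these counts are inferred from the abstract's figures, not stated explicitly in the source:

```python
# Implied replication counts behind the rates reported in the abstract.
# NOTE: the success counts (10, 4, 14) are assumptions reverse-engineered
# from the rounded percentages, not values given in the paper's abstract.

def rate(successes: int, total: int) -> float:
    """Replication rate as a percentage of studies attempted."""
    return 100 * successes / total

top12 = rate(10, 12)      # top-12 market-price group
bottom12 = rate(4, 12)    # bottom-12 market-price group
overall = rate(14, 26)    # all 26 selected studies (12 + 12 + 2 random)

print(round(top12), round(bottom12), round(overall))
```

Rounding each rate to the nearest percent reproduces the 83%, 33%, and 54% figures in the abstract, so the assumed counts are at least consistent with the reported numbers.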


Source journal: Nature Human Behaviour (Psychology — Social Psychology)
CiteScore: 36.80 · Self-citation rate: 1.00% · Articles per year: 227
Journal description: Nature Human Behaviour publishes research of outstanding significance on any aspect of human behaviour, spanning its psychological, biological, and social bases, as well as the origins, development, and disorders of human behaviour. The journal aims to increase the visibility of research in the field and to enhance its societal reach and impact.