A Guide for Calculating Study-Level Statistical Power for Meta-Analyses

IF 15.6 · Region 1 (Psychology) · JCR Q1 (Psychology)
Daniel S. Quintana
{"title":"元分析研究水平统计能力计算指南","authors":"Daniel S. Quintana","doi":"10.1177/25152459221147260","DOIUrl":null,"url":null,"abstract":"Meta-analysis is a popular approach in the psychological sciences for synthesizing data across studies. However, the credibility of meta-analysis outcomes depends on the evidential value of studies included in the body of evidence used for data synthesis. One important consideration for determining a study’s evidential value is the statistical power of the study’s design/statistical test combination for detecting hypothetical effect sizes of interest. Studies with a design/test combination that cannot reliably detect a wide range of effect sizes are more susceptible to questionable research practices and exaggerated effect sizes. Therefore, determining the statistical power for design/test combinations for studies included in meta-analyses can help researchers make decisions regarding confidence in the body of evidence. Because the one true population effect size is unknown when hypothesis testing, an alternative approach is to determine statistical power for a range of hypothetical effect sizes. This tutorial introduces the metameta R package and web app, which facilitates the straightforward calculation and visualization of study-level statistical power in meta-analyses for a range of hypothetical effect sizes. Readers will be shown how to reanalyze data using information typically presented in meta-analysis forest plots or tables and how to integrate the metameta package when reporting novel meta-analyses. A step-by-step companion screencast video tutorial is also provided to assist readers using the R package.","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":" ","pages":""},"PeriodicalIF":15.6000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"A Guide for Calculating Study-Level Statistical Power for Meta-Analyses\",\"authors\":\"Daniel S. Quintana\",\"doi\":\"10.1177/25152459221147260\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Meta-analysis is a popular approach in the psychological sciences for synthesizing data across studies. However, the credibility of meta-analysis outcomes depends on the evidential value of studies included in the body of evidence used for data synthesis. One important consideration for determining a study’s evidential value is the statistical power of the study’s design/statistical test combination for detecting hypothetical effect sizes of interest. Studies with a design/test combination that cannot reliably detect a wide range of effect sizes are more susceptible to questionable research practices and exaggerated effect sizes. Therefore, determining the statistical power for design/test combinations for studies included in meta-analyses can help researchers make decisions regarding confidence in the body of evidence. Because the one true population effect size is unknown when hypothesis testing, an alternative approach is to determine statistical power for a range of hypothetical effect sizes. This tutorial introduces the metameta R package and web app, which facilitates the straightforward calculation and visualization of study-level statistical power in meta-analyses for a range of hypothetical effect sizes. 
Readers will be shown how to reanalyze data using information typically presented in meta-analysis forest plots or tables and how to integrate the metameta package when reporting novel meta-analyses. A step-by-step companion screencast video tutorial is also provided to assist readers using the R package.\",\"PeriodicalId\":55645,\"journal\":{\"name\":\"Advances in Methods and Practices in Psychological Science\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":15.6000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Advances in Methods and Practices in Psychological Science\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1177/25152459221147260\",\"RegionNum\":1,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Methods and Practices in Psychological Science","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1177/25152459221147260","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY","Score":null,"Total":0}
Citations: 8

Abstract

Meta-analysis is a popular approach in the psychological sciences for synthesizing data across studies. However, the credibility of meta-analysis outcomes depends on the evidential value of studies included in the body of evidence used for data synthesis. One important consideration for determining a study’s evidential value is the statistical power of the study’s design/statistical test combination for detecting hypothetical effect sizes of interest. Studies with a design/test combination that cannot reliably detect a wide range of effect sizes are more susceptible to questionable research practices and exaggerated effect sizes. Therefore, determining the statistical power for design/test combinations for studies included in meta-analyses can help researchers make decisions regarding confidence in the body of evidence. Because the one true population effect size is unknown when hypothesis testing, an alternative approach is to determine statistical power for a range of hypothetical effect sizes. This tutorial introduces the metameta R package and web app, which facilitates the straightforward calculation and visualization of study-level statistical power in meta-analyses for a range of hypothetical effect sizes. Readers will be shown how to reanalyze data using information typically presented in meta-analysis forest plots or tables and how to integrate the metameta package when reporting novel meta-analyses. A step-by-step companion screencast video tutorial is also provided to assist readers using the R package.
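The study-level power described in the abstract can be approximated directly from the quantities shown in a forest plot. The following R sketch illustrates that calculation under a normal-approximation assumption; it is not the metameta package's own API, and the function names (se_from_ci, study_power) and example numbers are hypothetical.

```r
# Illustrative sketch (not the metameta API): approximate two-tailed power of a
# single study across a range of hypothetical effect sizes, using only the
# effect estimate and 95% CI typically reported in a meta-analysis forest plot.

# Recover the standard error from a 95% confidence interval (normal approximation)
se_from_ci <- function(lower, upper) (upper - lower) / (2 * qnorm(0.975))

# Power of a two-tailed z-test at level alpha for a study with standard error `se`,
# assuming a hypothetical true effect size `effect_size`
study_power <- function(se, effect_size, alpha = 0.05) {
  z_crit <- qnorm(1 - alpha / 2)
  ncp <- effect_size / se                     # hypothetical effect in SE units
  pnorm(ncp - z_crit) + pnorm(-ncp - z_crit)  # P(|z| > z_crit) under the alternative
}

# Hypothetical example: a study reporting d = 0.30, 95% CI [-0.10, 0.70]
se <- se_from_ci(-0.10, 0.70)
hypothetical_es <- seq(0.1, 0.7, by = 0.2)    # range of hypothetical effect sizes
round(sapply(hypothetical_es, function(es) study_power(se, es)), 2)
```

Repeating this calculation for every study in the meta-analysis, across a grid of hypothetical effect sizes, is essentially what a study-level power table or visualization summarizes.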
Source journal
CiteScore: 21.20
Self-citation rate: 0.70%
Articles published: 16
Journal description: In 2021, Advances in Methods and Practices in Psychological Science will undergo a transition to become an open access journal. This journal focuses on publishing innovative developments in research methods, practices, and conduct within the field of psychological science. It embraces a wide range of areas and topics and encourages the integration of methodological and analytical questions. The aim of AMPPS is to bring the latest methodological advances to researchers from various disciplines, even those who are not methodological experts. Therefore, the journal seeks submissions that are accessible to readers with different research interests and that represent the diverse research trends within the field of psychological science. The types of content that AMPPS welcomes include articles that communicate advancements in methods, practices, and metascience, as well as empirical scientific best practices. Additionally, tutorials, commentaries, and simulation studies on new techniques and research tools are encouraged. The journal also aims to publish papers that bring advances from specialized subfields to a broader audience. Lastly, AMPPS accepts Registered Replication Reports, which focus on replicating important findings from previously published studies. Overall, the transition of Advances in Methods and Practices in Psychological Science to an open access journal aims to increase accessibility and promote the dissemination of new developments in research methods and practices within the field of psychological science.