Assessing the process reproducibility of meta-analyses published in the top 20 pathology journals: A cross-sectional study.

Impact Factor 1.9 · CAS Tier 4 (Medicine) · JCR Q2 (Pathology)
Griffin Hughes, Cameron Barton, Matt Vassar
{"title":"Assessing the process reproducibility of meta-analyses published in the top 20 pathology journals: A cross-sectional study.","authors":"Griffin Hughes, Cameron Barton, Matt Vassar","doi":"10.1093/ajcp/aqaf103","DOIUrl":null,"url":null,"abstract":"<p><strong>Objective: </strong>The objective of this study is to investigate the rigor of reporting and the potential for process reproducibility of meta-analyses published within top pathology journals.</p><p><strong>Methods: </strong>This cross-sectional, meta-research study assessed eligible systematic reviews with meta-analysis indexed in MEDLINE through PubMed. We included those studies that were published within the top 20 pathology journals (h-5 index) from inception to March 21, 2024. We extracted proper reporting variables across 4 key quantitative synthesis domains: (1) primary study eligibility, (2) search strategy, (3) screening and extraction methods, and (4) quantitative synthesis approach.</p><p><strong>Results: </strong>We found 282 studies eligible for masked duplicate data extraction. Less than half of studies (40.8% ± 2.9%) reported whether unpublished literature was eligible for inclusion, while less than 20% reported the date of their database search (18.8% ± 2.3%). Similarly, less than 20% reported a full, reproducible search strategy (19.1% ± 2.3%). Not all studies reported primary study effects (92.9% ± 1.5%). The reported use or mention of a relevant synthesis reporting guideline was associated with significant improvement in reporting of search factors (P < .001) and screening factors (P < .001). Nine meta-analyses (9 of 282; 3.2%) were deemed potentially process-reproducible.</p><p><strong>Conclusions: </strong>Fewer than 10 meta-analyses from top pathology journals were potentially process-reproducible without reasonable effort. Most individual summary estimates were reproducible due to the presence of forest plots. Nevertheless, reproducibility factors related to search strategies are the single largest hindrance to reproducible meta-analyses published within our sample.</p>","PeriodicalId":7506,"journal":{"name":"American journal of clinical pathology","volume":" ","pages":""},"PeriodicalIF":1.9000,"publicationDate":"2025-10-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"American journal of clinical pathology","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1093/ajcp/aqaf103","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"PATHOLOGY","Score":null,"Total":0}
Citations: 0

Abstract

Objective: To investigate the rigor of reporting and the potential for process reproducibility of meta-analyses published in top pathology journals.

Methods: This cross-sectional, meta-research study assessed eligible systematic reviews with meta-analysis indexed in MEDLINE through PubMed. We included studies published in the top 20 pathology journals (by h5-index) from inception to March 21, 2024. We extracted reporting variables across 4 key quantitative synthesis domains: (1) primary study eligibility, (2) search strategy, (3) screening and extraction methods, and (4) quantitative synthesis approach.
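The abstract does not report the authors' actual PubMed query, so the following is only a minimal sketch of how a dated, journal-restricted search for meta-analyses could be run against the public NCBI E-utilities API; the journal name, date range, and publication-type filter are illustrative assumptions.

```python
# Minimal sketch of a reproducible PubMed search via NCBI E-utilities.
# The journal name, date range, and publication-type filter are illustrative
# assumptions; the abstract does not report the authors' actual query.
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def search_meta_analyses(journal: str, max_date: str = "2024/03/21") -> list[str]:
    """Return PubMed IDs for meta-analyses in one journal up to a cutoff date."""
    params = {
        "db": "pubmed",
        "term": f'"{journal}"[Journal] AND meta-analysis[Publication Type]',
        "datetype": "pdat",          # filter on publication date
        "mindate": "1900/01/01",
        "maxdate": max_date,
        "retmax": 1000,
        "retmode": "json",
    }
    resp = requests.get(ESEARCH, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]

if __name__ == "__main__":
    ids = search_meta_analyses("American journal of clinical pathology")
    print(f"{len(ids)} candidate records")
```

Recording the full query string and the date it was run, alongside the retrieved IDs, is what would make the search step reproducible in the sense assessed by this study.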

Results: We found 282 studies eligible for masked duplicate data extraction. Less than half of studies (40.8% ± 2.9%) reported whether unpublished literature was eligible for inclusion, and less than 20% reported the date of their database search (18.8% ± 2.3%). Similarly, less than 20% reported a full, reproducible search strategy (19.1% ± 2.3%). Most, but not all, studies reported primary study effects (92.9% ± 1.5%). Reported use or mention of a relevant synthesis reporting guideline was associated with significantly better reporting of search factors (P < .001) and screening factors (P < .001). Nine meta-analyses (9 of 282; 3.2%) were deemed potentially process-reproducible.
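The ± margins attached to each percentage are consistent with the standard error of a binomial proportion over the 282 included studies, SE = sqrt(p(1 - p)/n). Interpreting the margins this way is an assumption, since the abstract does not name the uncertainty measure; a short check:

```python
# Check whether the reported +/- margins match the standard error of a
# binomial proportion, SE = sqrt(p * (1 - p) / n), with n = 282 studies.
# Interpreting the margins as standard errors is an assumption.
from math import sqrt

n = 282
reported = {
    "unpublished literature eligibility": (0.408, 0.029),
    "database search date":               (0.188, 0.023),
    "full search strategy":               (0.191, 0.023),
    "primary study effects":              (0.929, 0.015),
}

for item, (p, margin) in reported.items():
    se = sqrt(p * (1 - p) / n)
    print(f"{item}: reported ±{margin:.1%}, SE of proportion {se:.1%}")
```

All four margins match to the reported precision, which suggests the ± values are one standard error of the proportion rather than, say, 95% confidence half-widths.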

Conclusions: Fewer than 10 meta-analyses from top pathology journals were potentially process-reproducible with reasonable effort. Most individual summary estimates were reproducible because forest plots were provided. Nevertheless, reproducibility factors related to search strategies were the single largest hindrance to reproducibility among the meta-analyses in our sample.

Source journal metrics:
CiteScore: 7.70
Self-citation rate: 2.90%
Annual articles: 367
Review turnaround: 3-6 weeks
Journal description: The American Journal of Clinical Pathology (AJCP) is the official journal of the American Society for Clinical Pathology and the Academy of Clinical Laboratory Physicians and Scientists. It is a leading international journal for publication of articles concerning novel anatomic pathology and laboratory medicine observations on human disease. AJCP emphasizes articles that focus on the application of evolving technologies for the diagnosis and characterization of diseases and conditions, as well as those that have a direct link to improving patient care.