Kylie E Hunter, Mason Aberoumand, Sol Libesman, James X Sotiropoulos, Jonathan G Williams, Jannik Aagerup, Rui Wang, Ben W Mol, Wentao Li, Angie Barba, Nipun Shrestha, Angela C Webster, Anna Lene Seidler
DOI: 10.1002/jrsm.1738 (https://doi.org/10.1002/jrsm.1738)
Journal: Research Synthesis Methods (Impact Factor 5.0, JCR Q1, Mathematical & Computational Biology)
Published: 2024-11-01 (Epub 2024-08-13), Journal Article
The Individual Participant Data Integrity Tool for assessing the integrity of randomised trials.
Increasing concerns about the trustworthiness of research have prompted calls to scrutinise studies' Individual Participant Data (IPD), but guidance on how to do this was lacking. To address this, we developed the IPD Integrity Tool to screen randomised controlled trials (RCTs) for integrity issues. Development of the tool involved a literature review, consultation with an expert advisory group, piloting on two IPD meta-analyses (including 73 trials with IPD), preliminary validation on 13 datasets with and without known integrity issues, and evaluation to inform iterative refinements. The IPD Integrity Tool comprises 31 items (13 study-level, 18 IPD-specific). IPD-specific items are automated where possible, and are grouped into eight domains, including unusual data patterns, baseline characteristics, correlations, date violations, patterns of allocation, internal and external inconsistencies, and plausibility of data. Users rate each item as having either no issues, some/minor issue(s), or many/major issue(s) according to decision rules, and justification for each rating is recorded. Overall, the tool guides decision-making by determining whether a trial has no concerns, some concerns requiring further information, or major concerns warranting exclusion from evidence synthesis or publication. In our preliminary validation checks, the tool accurately identified all five studies with known integrity issues. The IPD Integrity Tool enables users to assess the integrity of RCTs via examination of IPD. The tool may be applied by evidence synthesists, editors and others to determine whether an RCT should be considered sufficiently trustworthy to contribute to the evidence base that informs policy and practice.
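The abstract above describes automated IPD-specific checks (e.g., the "date violations" domain) and a three-level rating per item. The sketch below is purely illustrative: the tool's actual checks, record fields, and decision rules are not detailed in this abstract, so the field names and thresholds here are hypothetical assumptions.

```python
from datetime import date

def check_date_violations(records):
    """Flag participants whose follow-up date precedes their randomisation
    date -- one plausible check within the 'date violations' domain.
    The record fields ('id', 'randomised', 'follow_up') are assumptions."""
    return [r["id"] for r in records if r["follow_up"] < r["randomised"]]

def rate_item(n_flags, major_threshold=3):
    """Map a count of flagged records to the tool's three-level rating.
    The thresholds are invented for illustration; the tool's real decision
    rules are documented with the tool itself, not in this abstract."""
    if n_flags == 0:
        return "no issues"
    if n_flags < major_threshold:
        return "some/minor issue(s)"
    return "many/major issue(s)"

# Hypothetical IPD extract: P02 has an impossible date sequence.
records = [
    {"id": "P01", "randomised": date(2021, 3, 1), "follow_up": date(2021, 6, 1)},
    {"id": "P02", "randomised": date(2021, 4, 5), "follow_up": date(2021, 2, 9)},
]

flags = check_date_violations(records)
print(flags, "->", rate_item(len(flags)))
```

In the real tool, a rating at each item is recorded with its justification, and the item-level ratings feed the overall judgement (no concerns, some concerns, or major concerns).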
Journal Introduction:
Research Synthesis Methods is a reputable, peer-reviewed journal that focuses on the development and dissemination of methods for conducting systematic research synthesis. Our aim is to advance the knowledge and application of research synthesis methods across various disciplines.
Our journal provides a platform for the exchange of ideas and knowledge related to designing, conducting, analyzing, interpreting, reporting, and applying research synthesis. While research synthesis is commonly practiced in the health and social sciences, our journal also welcomes contributions from other fields to enrich the methodologies employed in research synthesis across scientific disciplines.
By bridging different disciplines, we aim to foster collaboration and cross-fertilization of ideas, ultimately enhancing the quality and effectiveness of research synthesis methods. Whether you are a researcher, practitioner, or stakeholder involved in research synthesis, our journal strives to offer valuable insights and practical guidance for your work.