{"title":"Assessing database accuracy for article retractions: A preliminary study comparing Retraction Watch Database, PubMed, and Web of Science.","authors":"Paul Sebo, Melissa Sebo","doi":"10.1080/08989621.2025.2465621","DOIUrl":null,"url":null,"abstract":"<p><strong>Objective: </strong>This study aimed to compare the accuracy of metadata for retracted articles in Retraction Watch Database (RWD), PubMed, and Web of Science (WoS).</p><p><strong>Methods: </strong>Twenty general internal medicine journals with an impact factor > 2 were randomly selected. RWD, PubMed, and WoS were used to retrieve all retracted articles published in these journals. Eight metadata variables were examined: journal, title, type of article, author(s), country/countries of affiliation, year of publication, year of retraction, and reason(s) for retraction (assessed only for RWD, as this information was unavailable in PubMed and WoS). Descriptive analyses were conducted to document errors across databases.</p><p><strong>Results: </strong>Thirty-five retractions were identified, and 280 metadata entries (35 × 8) were analyzed. RWD contained the most metadata errors, affecting 16 articles and 20 metadata entries, including seven errors in year of publication, six in article type, six in author names (five misspellings, one missing two authors), and one in country of affiliation. WoS had one error (a missing author), and PubMed had none.</p><p><strong>Conclusion: </strong>The relatively high error rate in RWD suggests that researchers should cross-check metadata across multiple databases. Given the preliminary nature of this study, larger-scale research is needed to confirm these findings and improve metadata reliability in retraction databases.</p>","PeriodicalId":50927,"journal":{"name":"Accountability in Research-Policies and Quality Assurance","volume":" ","pages":"1-18"},"PeriodicalIF":2.8000,"publicationDate":"2025-02-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Accountability in Research-Policies and Quality Assurance","FirstCategoryId":"98","ListUrlMain":"https://doi.org/10.1080/08989621.2025.2465621","RegionNum":1,"RegionCategory":"哲学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MEDICAL ETHICS","Score":null,"Total":0}
Citations: 0
Abstract
Objective: This study aimed to compare the accuracy of metadata for retracted articles in Retraction Watch Database (RWD), PubMed, and Web of Science (WoS).
Methods: Twenty general internal medicine journals with an impact factor > 2 were randomly selected. RWD, PubMed, and WoS were used to retrieve all retracted articles published in these journals. Eight metadata variables were examined: journal, title, type of article, author(s), country/countries of affiliation, year of publication, year of retraction, and reason(s) for retraction (assessed only for RWD, as this information was unavailable in PubMed and WoS). Descriptive analyses were conducted to document errors across databases.
Results: Thirty-five retractions were identified, and 280 metadata entries (35 × 8) were analyzed. RWD contained the most metadata errors, affecting 16 articles and 20 metadata entries: seven errors in year of publication, six in article type, six in author names (five misspellings and one entry missing two authors), and one in country of affiliation. WoS had one error (a missing author), and PubMed had none.
Conclusion: The relatively high error rate in RWD suggests that researchers should cross-check metadata across multiple databases. Given the preliminary nature of this study, larger-scale research is needed to confirm these findings and improve metadata reliability in retraction databases.
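The cross-checking the conclusion recommends can be partly automated. The sketch below is not the authors' pipeline; it is a minimal illustration, assuming a local CSV export of the Retraction Watch Database (the column names "Title", "Author", "OriginalPaperDate", and "OriginalPaperPubMedID", the ";" author delimiter, and the filename are assumptions about that export) and comparing each record against PubMed via the NCBI E-utilities ESummary endpoint.

```python
"""Minimal sketch of cross-checking retraction metadata between a
Retraction Watch CSV export and PubMed. Column names and the CSV
filename are assumptions; adjust them to the file you download."""
import csv

import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"


def pubmed_summary(pmid: str) -> dict:
    """Fetch the PubMed ESummary record for one PMID as JSON."""
    resp = requests.get(EUTILS, params={"db": "pubmed", "id": pmid, "retmode": "json"})
    resp.raise_for_status()
    return resp.json()["result"][pmid]


def compare_record(rwd_row: dict, pmid: str) -> list[str]:
    """Return human-readable discrepancies between the RWD row and PubMed."""
    pm = pubmed_summary(pmid)
    issues = []

    # Title: compare case-insensitively; PubMed titles often end with a period.
    if rwd_row["Title"].strip().lower() != pm["title"].strip().rstrip(".").lower():
        issues.append(f"title mismatch: RWD={rwd_row['Title']!r} PubMed={pm['title']!r}")

    # Publication year: PubMed "pubdate" begins with the year, e.g. "2021 Mar".
    pm_year = pm["pubdate"][:4]
    if pm_year not in rwd_row["OriginalPaperDate"]:
        issues.append(f"publication year mismatch: RWD={rwd_row['OriginalPaperDate']} PubMed={pm_year}")

    # Author count: RWD is assumed to list authors separated by ";".
    rwd_authors = [a.strip() for a in rwd_row["Author"].split(";") if a.strip()]
    if len(rwd_authors) != len(pm["authors"]):
        issues.append(f"author count mismatch: RWD={len(rwd_authors)} PubMed={len(pm['authors'])}")

    return issues


if __name__ == "__main__":
    # Hypothetical local export of the Retraction Watch Database.
    with open("retraction_watch_export.csv", newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            pmid = row.get("OriginalPaperPubMedID", "").strip()
            if not pmid or pmid == "0":
                continue  # no PubMed record to compare against
            for issue in compare_record(row, pmid):
                print(f"PMID {pmid}: {issue}")
```

A check like this only flags candidate discrepancies (spelling variants, date formats, and author-name conventions differ across databases), so each flagged record still needs manual review before concluding which source is in error.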
About the journal:
Accountability in Research: Policies and Quality Assurance is devoted to the examination and critical analysis of systems for maximizing integrity in the conduct of research. It provides an interdisciplinary, international forum for the development of ethics, procedures, standards, policies, and concepts to encourage the ethical conduct of research and to enhance the validity of research results.
The journal welcomes views on advancing the integrity of research in the fields of general and multidisciplinary sciences, medicine, law, economics, statistics, management studies, public policy, politics, sociology, history, psychology, philosophy, ethics, and information science.
All submitted manuscripts are subject to initial appraisal by the Editor, and if found suitable for further consideration, to peer review by independent, anonymous expert referees.