{"title":"Content-based quality evaluation of scientific papers using coarse feature and knowledge entity network","authors":"Zhongyi Wang , Haoxuan Zhang , Haihua Chen , Yunhe Feng , Junhua Ding","doi":"10.1016/j.jksuci.2024.102119","DOIUrl":null,"url":null,"abstract":"<div><p>Pre-evaluating scientific paper quality aids in alleviating peer review pressure and fostering scientific advancement. Although prior studies have identified numerous quality-related features, their effectiveness and representativeness of paper content remain to be comprehensively investigated. Addressing this issue, we propose a content-based interpretable method for pre-evaluating the quality of scientific papers. Firstly, we define quality attributes of computer science (CS) papers as <em>integrity</em>, <em>clarity</em>, <em>novelty</em>, and <em>significance</em>, based on peer review criteria from 11 top-tier CS conferences. We formulate the problem as two classification tasks: <em>Accepted/Disputed/Rejected</em> (ADR) and <em>Accepted/Rejected</em> (AR). Subsequently, we construct fine-grained features from metadata and knowledge entity networks, including text structure, readability, references, citations, semantic novelty, and network structure. We empirically evaluate our method using the ICLR paper dataset, achieving optimal performance with the Random Forest model, yielding F1 scores of 0.715 and 0.762 for the two tasks, respectively. Through feature analysis and case studies employing SHAP interpretable methods, we demonstrate that the proposed features enhance the performance of machine learning models in scientific paper quality evaluation, offering interpretable evidence for model decisions.</p></div>","PeriodicalId":48547,"journal":{"name":"Journal of King Saud University-Computer and Information Sciences","volume":null,"pages":null},"PeriodicalIF":5.2000,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1319157824002088/pdfft?md5=8465c32ce03880aedb7757f4009be933&pid=1-s2.0-S1319157824002088-main.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of King Saud University-Computer and Information Sciences","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1319157824002088","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0
Abstract
Pre-evaluating scientific paper quality helps alleviate peer review pressure and foster scientific advancement. Although prior studies have identified numerous quality-related features, their effectiveness and how well they represent paper content remain to be comprehensively investigated. To address this issue, we propose a content-based, interpretable method for pre-evaluating the quality of scientific papers. First, we define the quality attributes of computer science (CS) papers as integrity, clarity, novelty, and significance, based on peer review criteria from 11 top-tier CS conferences. We formulate the problem as two classification tasks: Accepted/Disputed/Rejected (ADR) and Accepted/Rejected (AR). We then construct fine-grained features from metadata and knowledge entity networks, including text structure, readability, references, citations, semantic novelty, and network structure. We empirically evaluate our method on the ICLR paper dataset, achieving the best performance with a Random Forest model, which yields F1 scores of 0.715 and 0.762 on the two tasks, respectively. Through feature analysis and case studies using SHAP interpretability methods, we demonstrate that the proposed features enhance the performance of machine learning models in scientific paper quality evaluation and offer interpretable evidence for model decisions.
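To make the described pipeline concrete, the sketch below shows one plausible way to train a Random Forest on content-based paper features for the Accepted/Rejected (AR) task and explain its predictions with SHAP, as the abstract describes. The feature names, synthetic data, and hyperparameters are illustrative assumptions, not the authors' actual dataset or code.

```python
# Hypothetical sketch: Random Forest on content-based paper features + SHAP explanations.
# Feature columns loosely mirror the feature groups named in the abstract
# (text structure, readability, references, citations, semantic novelty, network structure).
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_papers = 500

# Synthetic stand-in for the real feature matrix (one row per paper).
X = pd.DataFrame({
    "num_sections": rng.integers(5, 12, n_papers),
    "readability_score": rng.normal(45, 10, n_papers),
    "num_references": rng.integers(10, 80, n_papers),
    "citation_count": rng.poisson(3, n_papers),
    "semantic_novelty": rng.random(n_papers),
    "entity_net_density": rng.random(n_papers),
})
y = rng.integers(0, 2, n_papers)  # 0 = Rejected, 1 = Accepted (AR task)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

model = RandomForestClassifier(n_estimators=300, random_state=42)
model.fit(X_train, y_train)
print("Macro F1:", f1_score(y_test, model.predict(X_test), average="macro"))

# SHAP attributes each prediction to the input features, providing the kind of
# interpretable evidence the paper reports for model decisions.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test, show=False)
```

With real data, the same pattern extends to the three-class ADR task by replacing the binary labels, and the SHAP summary plot highlights which feature groups drive acceptance decisions.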
About the Journal:
In 2022, the Journal of King Saud University - Computer and Information Sciences will become an author-paid open access journal. Authors who submit their manuscripts after October 31st, 2021 will be asked to pay an Article Processing Charge (APC) after acceptance of their paper to make their work immediately, permanently, and freely accessible to all. The Journal of King Saud University - Computer and Information Sciences is a refereed, international journal that covers all aspects of both the foundations of computing and its practical applications.