CauseTerML: Causal Learning via Term Mining for Assessing Review Discrepancies

Wenjie Sun; Chengke Wu; Qinge Xiao; Junjie Jiang; Yuanjun Guo; Ying Bi; Xinyu Wu; Zhile Yang

IEEE Transactions on Artificial Intelligence, vol. 6, no. 5, pp. 1156-1170. Published 2024-12-09. DOI: 10.1109/TAI.2024.3512500. https://ieeexplore.ieee.org/document/10782990/
Abstract
Innovation is a key driver of modern economic and technological development. Correct and equitable identification of innovation is essential for promoting market competitiveness and ensuring the optimal allocation of resources. Existing research on innovation evaluation focuses mainly on qualitative or quantitative evaluation of outcomes, while ignoring potential biases in the application process. This work investigates an unexplored issue in the field of innovation evaluation: does the technicality of an application's title affect the degree of attention it receives during the review process? The key lies in two aspects: how to evaluate the technicality of a title and how to quantify this effect. To achieve this goal, we combine term extraction schemes and causal inference techniques by modelling the fairness detection task in a causal diagram, and propose a novel framework called CauseTerML. The framework can be applied to fairness detection in a variety of application scenarios. Extensive experiments on a real-world patent dataset validate the effectiveness of CauseTerML.
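The abstract describes two ingredients: a term-mining score for title technicality and a causal estimate of that score's effect on review attention. The sketch below is not the authors' implementation; it is a minimal illustration, assuming a mined technical-term lexicon, a binary "highly technical title" treatment, an attention proxy as the outcome, and observed confounders adjusted by inverse-propensity weighting. All variable names and the toy data are hypothetical.

```python
# Minimal sketch of a term-mining + causal-inference pipeline of the kind the
# abstract describes. NOT the CauseTerML implementation; names and data are
# illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression


def technicality_score(title: str, technical_terms: set) -> float:
    """Fraction of title tokens found in a mined technical-term lexicon."""
    tokens = title.lower().split()
    if not tokens:
        return 0.0
    return sum(t in technical_terms for t in tokens) / len(tokens)


def ipw_effect(treatment: np.ndarray, outcome: np.ndarray,
               confounders: np.ndarray) -> float:
    """Inverse-propensity-weighted estimate of the average effect of a binary
    'highly technical title' indicator on an attention proxy."""
    propensity = LogisticRegression(max_iter=1000).fit(
        confounders, treatment).predict_proba(confounders)[:, 1]
    propensity = np.clip(propensity, 1e-3, 1 - 1e-3)  # avoid extreme weights
    treated = treatment * outcome / propensity
    control = (1 - treatment) * outcome / (1 - propensity)
    return treated.mean() - control.mean()


if __name__ == "__main__":
    # Toy usage: score synthetic titles, binarize the score, and estimate its
    # effect on a hypothetical attention measure with simulated confounders
    # (e.g., field, title length, filing year).
    rng = np.random.default_rng(0)
    lexicon = {"neural", "photonic", "catalytic", "semiconductor"}
    titles = ["A neural photonic sensor", "A better coffee cup",
              "Catalytic semiconductor process", "Improved packaging design"] * 50
    scores = np.array([technicality_score(t, lexicon) for t in titles])
    treatment = (scores > 0.3).astype(float)
    confounders = rng.normal(size=(len(treatment), 3))
    attention = 0.5 * treatment + confounders[:, 0] + rng.normal(size=len(treatment))
    effect = ipw_effect(treatment, attention, confounders)
    print(f"Estimated effect of title technicality on attention: {effect:.3f}")
```

Propensity weighting is only one way to close the backdoor paths in a causal diagram; regression adjustment or matching would slot into the same place, and the paper's actual estimator may differ.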