{"title":"CS4TE: A Novel Coded Self-Attention and Semantic Synergy Network for Triple Extraction","authors":"Huiyong Lv , Yurong Qian , Jiaying Chen , Shuxiang Hou , Hongyong Leng , Mengnan Ma","doi":"10.1016/j.neucom.2025.130034","DOIUrl":null,"url":null,"abstract":"<div><div>The joint entity-relation extraction approach holds great potential for extracting triples from unstructured text. However, in current research, two prevalent shortcomings significantly impact the efficacy of the triple extraction task. First, because entities constitute only a small proportion of a sentence and token embeddings contain a substantial amount of irrelevant information, classification models face significant performance challenges. Second, the typical pipeline first identifies entities and then predicts triples solely from the obtained entity representations; this process often overlooks the contextual semantic information associated with the entities. In this work, we propose CS4TE: A Novel Coded Self-Attention and Semantic Synergy Network for Triple Extraction. Specifically, we propose a novel Coded Self-Attention Mechanism designed to refine text representations by effectively masking irrelevant information and enhancing entity representations. Additionally, we propose a Semantic Synergy Network, which innovatively integrates semantic information with token pairs to predict triples, addressing the limitation of previous research that often overlooked semantic information. Finally, our model outperforms state-of-the-art baseline models on two public datasets for the joint entity-relation extraction task, and extensive experiments demonstrate the effectiveness of our method from multiple perspectives.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"636 ","pages":"Article 130034"},"PeriodicalIF":5.5000,"publicationDate":"2025-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225007064","RegionNum":2,"RegionCategory":"Computer Science","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
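The abstract describes the Coded Self-Attention Mechanism only at a high level: it refines text representations by masking irrelevant information during attention. The paper's actual design is not given here, so the following is a minimal illustrative sketch of the general idea of masking-based self-attention; the function name, the binary relevance mask, and the identity projections are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def masked_self_attention(X, relevance_mask):
    """Toy single-head self-attention in which positions flagged as
    irrelevant (mask == 0) are excluded from the attention softmax.
    Illustrative sketch only, not the paper's actual mechanism."""
    d = X.shape[-1]
    # Query/key/value projections kept as the identity for brevity.
    scores = X @ X.T / np.sqrt(d)  # (n, n) attention logits
    # Push logits of irrelevant key positions to a large negative value,
    # so their softmax weight is effectively zero.
    scores = np.where(relevance_mask[None, :] == 1, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X  # refined token representations

# Usage: 4 tokens of dimension 8; positions 1 and 2 treated as relevant.
X = np.random.default_rng(0).normal(size=(4, 8))
out = masked_self_attention(X, np.array([0, 1, 1, 0]))
```

With this masking, every output row is (numerically) a convex combination of only the relevant tokens' embeddings, which is one plausible reading of "masking irrelevant information to enhance entity representation."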
Journal overview:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Theory, practice, and applications of neurocomputing are the essential topics covered.