Title: Multiple dependence representation of attention graph convolutional network relation extraction model
Authors: Zhao Liangfu, Xiong Yujie, Gao Yongbin, Yu Wenjun
Journal: IET Cyber-Physical Systems: Theory and Applications, vol. 9, no. 3, pp. 247-257
DOI: 10.1049/cps2.12080
Published: 2023-10-04 (Journal Article)
Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1049/cps2.12080
Abstract
Dependency analysis helps neural networks capture semantic features in sentences and thereby extract entity relations. Hard and soft pruning strategies based on dependency-tree structure encoding have been proposed to balance beneficial additional information against adverse interference in extraction tasks. The authors propose a new model based on graph convolutional networks that describes dependency trees through a variety of representations from different perspectives and combines these representations into a better sentence representation for relation classification. A newly defined module uses an attention mechanism to extract deeper semantic features from the context representation, which serve as global semantic features of the input text; this helps the model capture deeper sentence-level semantic information for relation extraction. To obtain more information about a given entity pair from the input sentence, the authors also model implicit co-references (mentions) of the entities. The model can thus extract, to the greatest extent possible, the semantic features relevant to the relationship between entities. The results show that the model achieves good performance on the SemEval2010-Task8 and KBP37 datasets.
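The basic computation the abstract builds on, a graph convolution over the sentence's dependency tree followed by attention pooling into a single sentence vector, can be sketched roughly as follows. This is a minimal illustration with made-up dimensions, random weights, and a toy adjacency matrix, not the authors' actual architecture:

```python
import numpy as np

def gcn_layer(h, adj, w):
    """One graph-convolution step over a dependency graph.

    h:   (n, d_in) token representations
    adj: (n, n) adjacency matrix of the dependency tree
    w:   (d_in, d_out) weight matrix (learned in a real model)
    """
    a_hat = adj + np.eye(adj.shape[0])              # add self-loops
    d_inv = np.diag(1.0 / a_hat.sum(axis=1))        # degree normalization
    return np.maximum(d_inv @ a_hat @ h @ w, 0.0)   # ReLU activation

def attention_pool(h, q):
    """Pool token states into one sentence vector via attention weights."""
    scores = h @ q                                  # (n,) relevance scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                        # softmax over tokens
    return weights @ h                              # (d,) sentence vector

# Toy 4-token sentence whose dependency edges form a small tree
# rooted at token 1 (e.g. the main verb).
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 1],
                [0, 1, 0, 0],
                [0, 1, 0, 0]], dtype=float)
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))                         # token embeddings
w = rng.normal(size=(8, 8))
out = gcn_layer(h, adj, w)                          # (4, 8) per-token states
sent = attention_pool(out, rng.normal(size=8))      # (8,) sentence vector
print(out.shape, sent.shape)
```

In the paper's setting, several such representations of the dependency tree are combined, and the pooled sentence vector feeds a relation classifier; the sketch above only shows the shape of one layer and one pooling step.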