Semantic Relational Extraction via Learning Syntactic Structural Representation

Nguyen Éric, Tsuruoka Ari
{"title":"基于句法结构表示学习的语义关系提取","authors":"Nguyen Éric, Tsuruoka Ari","doi":"10.21203/rs.3.rs-3593929/v1","DOIUrl":null,"url":null,"abstract":"Abstract Leveraging distant supervision for relation extraction has emerged as a robust method to harness large text corpora, widely adopted to unearth new relational facts from unstructured text. Prevailing neural approaches have made significant strides in relation extraction by representing sentences in compact, low-dimensional vectors. However, the incorporation of syntactic nuances when modeling entities remains underexplored. Our study introduces a novel method for crafting syntax-aware entity embeddings to boost neural relation extraction. We start by encoding entity contexts within dependency trees through tree-GRU to generate sentence-level entity embeddings. We then apply both intra-sentence and inter-sentence attention mechanisms to distill entity embeddings at the sentence set level, considering every occurrence of the pertinent entity pair. The culmination of our methodology is the fusion of sentence and entity embeddings for relation classification. Our experiments on a benchmark dataset indicate that our approach harnesses the full potential of informative instances, thereby setting new benchmarks for relation extraction performance.","PeriodicalId":500086,"journal":{"name":"Research Square (Research Square)","volume":"9 11","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Semantic Relational Extraction via Learning Syntactic Structural Representation\",\"authors\":\"Nguyen Éric, Tsuruoka Ari\",\"doi\":\"10.21203/rs.3.rs-3593929/v1\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract Leveraging distant supervision for relation extraction has emerged as a robust method to harness large text corpora, widely adopted to unearth new relational facts from unstructured text. Prevailing neural approaches have made significant strides in relation extraction by representing sentences in compact, low-dimensional vectors. However, the incorporation of syntactic nuances when modeling entities remains underexplored. Our study introduces a novel method for crafting syntax-aware entity embeddings to boost neural relation extraction. We start by encoding entity contexts within dependency trees through tree-GRU to generate sentence-level entity embeddings. We then apply both intra-sentence and inter-sentence attention mechanisms to distill entity embeddings at the sentence set level, considering every occurrence of the pertinent entity pair. The culmination of our methodology is the fusion of sentence and entity embeddings for relation classification. 
Our experiments on a benchmark dataset indicate that our approach harnesses the full potential of informative instances, thereby setting new benchmarks for relation extraction performance.\",\"PeriodicalId\":500086,\"journal\":{\"name\":\"Research Square (Research Square)\",\"volume\":\"9 11\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-11-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Research Square (Research Square)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.21203/rs.3.rs-3593929/v1\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research Square (Research Square)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.21203/rs.3.rs-3593929/v1","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Leveraging distant supervision for relation extraction has emerged as a robust method to harness large text corpora, widely adopted to unearth new relational facts from unstructured text. Prevailing neural approaches have made significant strides in relation extraction by representing sentences in compact, low-dimensional vectors. However, the incorporation of syntactic nuances when modeling entities remains underexplored. Our study introduces a novel method for crafting syntax-aware entity embeddings to boost neural relation extraction. We start by encoding entity contexts within dependency trees through tree-GRU to generate sentence-level entity embeddings. We then apply both intra-sentence and inter-sentence attention mechanisms to distill entity embeddings at the sentence set level, considering every occurrence of the pertinent entity pair. The culmination of our methodology is the fusion of sentence and entity embeddings for relation classification. Our experiments on a benchmark dataset indicate that our approach harnesses the full potential of informative instances, thereby setting new benchmarks for relation extraction performance.
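
To make the described pipeline concrete, the following is a minimal PyTorch sketch of the three stages named in the abstract: a tree-GRU over dependency trees, intra-sentence and inter-sentence attention over a bag of sentences mentioning the same entity pair, and fusion of sentence and entity embeddings for relation classification. The child-sum update, the mean-pooled sentence embedding, the concatenation-based fusion, and all module names and dimensions are illustrative assumptions, not the authors' exact formulation.

```python
# Hedged sketch: syntax-aware entity embeddings for relation extraction.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChildSumTreeGRU(nn.Module):
    """Encode a dependency tree bottom-up; each node's state summarizes its subtree (assumed variant)."""

    def __init__(self, dim):
        super().__init__()
        self.gru_cell = nn.GRUCell(dim, dim)

    def forward(self, token_embs, children):
        # token_embs: (num_tokens, dim); children[i] lists the dependents of token i.
        hidden = [None] * len(children)

        def encode(i):
            if hidden[i] is None:
                # Sum the children's hidden states (child-sum), then apply one GRU step.
                kids = [encode(c) for c in children[i]]
                h_kids = torch.stack(kids).sum(0) if kids else torch.zeros_like(token_embs[i])
                hidden[i] = self.gru_cell(token_embs[i].unsqueeze(0), h_kids.unsqueeze(0)).squeeze(0)
            return hidden[i]

        for i in range(len(children)):
            encode(i)
        return torch.stack(hidden)  # (num_tokens, dim)


class SyntaxAwareRE(nn.Module):
    def __init__(self, dim, num_relations):
        super().__init__()
        self.tree_gru = ChildSumTreeGRU(dim)
        self.intra_attn = nn.Linear(dim, 1)   # scores tokens of an entity mention within a sentence
        self.inter_attn = nn.Linear(dim, 1)   # scores sentences within the bag
        self.classifier = nn.Linear(3 * dim, num_relations)  # fuse sentence + head + tail embeddings

    def entity_embedding(self, node_states, positions):
        # Intra-sentence attention over the tokens of one entity mention.
        ent = node_states[positions]                       # (mention_len, dim)
        weights = F.softmax(self.intra_attn(ent), dim=0)   # (mention_len, 1)
        return (weights * ent).sum(0)                      # (dim,)

    def forward(self, bag):
        # bag: sentences mentioning the same entity pair; each sentence carries token
        # embeddings, dependency children, and head/tail mention positions.
        sent_vecs, head_vecs, tail_vecs = [], [], []
        for s in bag:
            states = self.tree_gru(s["tokens"], s["children"])
            sent_vecs.append(states.mean(0))               # simple sentence embedding (assumption)
            head_vecs.append(self.entity_embedding(states, s["head_pos"]))
            tail_vecs.append(self.entity_embedding(states, s["tail_pos"]))
        sent_vecs = torch.stack(sent_vecs)                 # (num_sents, dim)
        # Inter-sentence attention: weight every occurrence of the entity pair across the bag.
        weights = F.softmax(self.inter_attn(sent_vecs), dim=0)
        bag_sent = (weights * sent_vecs).sum(0)
        bag_head = (weights * torch.stack(head_vecs)).sum(0)
        bag_tail = (weights * torch.stack(tail_vecs)).sum(0)
        fused = torch.cat([bag_sent, bag_head, bag_tail])  # fusion of sentence and entity embeddings
        return self.classifier(fused)                      # relation logits


# Toy usage: one bag with a single 4-token sentence whose syntactic root is token 1.
dim, num_rel = 8, 5
sentence = {
    "tokens": torch.randn(4, dim),
    "children": [[], [0, 2], [3], []],  # token 1 governs tokens 0 and 2; token 2 governs token 3
    "head_pos": [0],
    "tail_pos": [3],
}
model = SyntaxAwareRE(dim, num_rel)
print(model([sentence]).shape)  # torch.Size([5])
```

In this sketch the same inter-sentence attention weights are reused to pool the sentence, head-entity, and tail-entity embeddings over the bag; that sharing is a simplification, and the paper's actual attention parameterization may differ.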