Entity Relationship Extraction Method Based on Multi-head Attention and Graph Convolutional Network

Pub Date: 2023-03-01  DOI: 10.1109/ICNLP58431.2023.00060
Sheping Zhai, Hang Li, Fangyi Li, Xinnian Kang
{"title":"Entity Relationship Extraction Method Based on Multi-head Attention and Graph Convolutional Network","authors":"Sheping Zhai, Hang Li, Fangyi Li, Xinnian Kang","doi":"10.1109/ICNLP58431.2023.00060","DOIUrl":null,"url":null,"abstract":"Extracting entities and relations from text is crucial in the field of natural language processing. Current methods for relation extraction rely on training sets labeled using remote supervision techniques. However, these methods have limitations as they do not consider the connection between entity and relation extraction and cannot extract overlapping entities and relations. Therefore, accurate joint entity-relation extraction remains challenging. Our paper introduces a model for entity relation extraction based on multi-head attention and graph convolutional networks. We utilize the multi-head attention approach to extract entity features, building on the text features extracted by the graph convolutional network. Utilizing the New York Times (NYT) dataset, we evaluated the performance of our model. The experimentation revealed that our model effectively captures the semantic correlation between entity and relation extraction and minimizes the impact of unrelated entity pairings, resulting in improved recognition accuracy even in scenarios with overlapping entities.","PeriodicalId":53637,"journal":{"name":"Icon","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Icon","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICNLP58431.2023.00060","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"Arts and Humanities","Score":null,"Total":0}
引用次数: 0

Abstract

Extracting entities and relations from text is a core task in natural language processing. Current relation extraction methods rely on training sets labeled with distant supervision. However, these methods do not model the connection between entity extraction and relation extraction and cannot handle overlapping entities and relations, so accurate joint entity-relation extraction remains challenging. This paper introduces an entity-relation extraction model based on multi-head attention and graph convolutional networks: a graph convolutional network extracts text features, and multi-head attention then extracts entity features on top of them. We evaluated the model on the New York Times (NYT) dataset. Experiments show that the model effectively captures the semantic correlation between entity and relation extraction and reduces the impact of irrelevant entity pairs, improving recognition accuracy even when entities overlap.
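As a rough illustration of the pipeline the abstract describes (graph-convolutional text features followed by multi-head attention for entity features), here is a minimal PyTorch sketch. The layer sizes, adjacency construction, tag set, and class names (GCNLayer, GCNAttentionExtractor) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a GCN layer builds text features
# over a word graph, then multi-head attention derives entity features from
# them and a linear head predicts per-token entity tags.
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(norm(A + I) @ H @ W)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Add self-loops and row-normalise so each node averages its neighbours.
        adj = adj + torch.eye(adj.size(-1), device=adj.device)
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return torch.relu((adj / deg) @ self.linear(h))


class GCNAttentionExtractor(nn.Module):
    """Text features via GCN, entity features via multi-head attention."""

    def __init__(self, hidden: int = 256, heads: int = 8):
        super().__init__()
        self.gcn = GCNLayer(hidden, hidden)
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.entity_tagger = nn.Linear(hidden, 3)  # e.g. B/I/O tags (assumed)

    def forward(self, token_emb: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        text_feat = self.gcn(token_emb, adj)               # (B, T, H)
        entity_feat, _ = self.attn(text_feat, text_feat, text_feat)
        return self.entity_tagger(entity_feat)             # per-token tag logits


if __name__ == "__main__":
    batch, seq_len, hidden = 2, 10, 256
    tokens = torch.randn(batch, seq_len, hidden)           # stand-in embeddings
    adj = torch.randint(0, 2, (batch, seq_len, seq_len)).float()
    model = GCNAttentionExtractor(hidden)
    print(model(tokens, adj).shape)                        # torch.Size([2, 10, 3])
```

The sketch stops at entity tagging; how the extracted entity features are paired for relation classification is not specified in the abstract and is therefore left out.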