Research on relation extraction model of overlapping entity based on attention mechanism

Ling Gan, Xiaobin Liu
{"title":"Research on relation extraction model of overlapping entity based on attention mechanism","authors":"Ling Gan, Xiaobin Liu","doi":"10.1117/12.2674559","DOIUrl":null,"url":null,"abstract":"Relation extraction refers to get the triple structure composed of semantic relation entity pairs from unstructured text, which is an important part of tasks such as knowledge graphs. At present, the joint extraction model is in common used to avoid the impact of overlapping entities, but there are the following problems. First, the dependencies between text words are not fully considered, and the recognition performance of entities with long spans is low. Insufficient utilization of information makes it difficult to fully extract implicit relationships. In order to address these issues, this text proposes an improved joint learning model, which builds text semantic representation through BERT pre-training, obtains relation type representation as an additional mapping through a multi-label classification method, and sequentially uses multi-layer BiLSTM combined with highway network to obtain semantic information, and combine The attention mechanism obtains the entity location score, and the pointer network is used to obtain the entity location. The experiments of this method on the common dataset of relation extraction task is effective.","PeriodicalId":286364,"journal":{"name":"Conference on Computer Graphics, Artificial Intelligence, and Data Processing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Conference on Computer Graphics, Artificial Intelligence, and Data Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2674559","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Relation extraction refers to extracting, from unstructured text, triples composed of entity pairs and the semantic relation between them; it is an important component of tasks such as knowledge graph construction. At present, joint extraction models are commonly used to mitigate the impact of overlapping entities, but they suffer from the following problems. First, the dependencies between words in the text are not fully considered, so recognition performance on entities with long spans is low. Second, the available information is under-utilized, making it difficult to fully extract implicit relations. To address these issues, this paper proposes an improved joint learning model: it builds a semantic representation of the text through BERT pre-training, obtains a relation-type representation as an additional mapping through multi-label classification, passes the result through a multi-layer BiLSTM combined with a highway network to capture semantic information, uses an attention mechanism to obtain entity location scores, and applies a pointer network to locate the entities. Experiments on a common relation extraction dataset show that the method is effective.
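The abstract only names the components of the pipeline (BERT, multi-label relation classifier, BiLSTM + highway, attention, pointer network); the paper's reference code is not available. The following PyTorch snippet is therefore a minimal sketch of how such a pipeline could be wired together. All names (JointExtractor, Highway, rel_id), layer sizes, the choice of "bert-base-cased", and the way the relation representation conditions the pointer heads are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of the described pipeline, assuming PyTorch + HuggingFace
# transformers. Dimensions and wiring are illustrative guesses.
import torch
import torch.nn as nn
from transformers import BertModel

class Highway(nn.Module):
    """Highway layer: gated mix of a transformed input and the input itself."""
    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)

    def forward(self, x):
        t = torch.sigmoid(self.gate(x))     # carry/transform gate
        h = torch.relu(self.transform(x))   # candidate transform
        return t * h + (1 - t) * x          # gated combination

class JointExtractor(nn.Module):
    def __init__(self, num_relations, hidden=768, lstm_hidden=384, lstm_layers=2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-cased")
        # Multi-label relation classifier over the [CLS] representation;
        # a learned embedding serves as the relation-type representation.
        self.rel_classifier = nn.Linear(hidden, num_relations)
        self.rel_embed = nn.Embedding(num_relations, hidden)
        # Multi-layer BiLSTM followed by a highway network over token states.
        self.bilstm = nn.LSTM(hidden, lstm_hidden, num_layers=lstm_layers,
                              batch_first=True, bidirectional=True)
        self.highway = Highway(2 * lstm_hidden)
        # Attention that scores tokens against the candidate-relation query.
        self.attn = nn.MultiheadAttention(2 * lstm_hidden, num_heads=1,
                                          batch_first=True)
        # Pointer heads: start/end position logits per token.
        self.start_head = nn.Linear(2 * lstm_hidden, 1)
        self.end_head = nn.Linear(2 * lstm_hidden, 1)

    def forward(self, input_ids, attention_mask, rel_id):
        tokens = self.bert(input_ids,
                           attention_mask=attention_mask).last_hidden_state
        rel_logits = self.rel_classifier(tokens[:, 0])  # multi-label relation scores
        seq, _ = self.bilstm(tokens)
        seq = self.highway(seq)                         # (B, T, 2*lstm_hidden)
        # Condition token scores on one candidate relation.
        query = self.rel_embed(rel_id).unsqueeze(1)     # (B, 1, H)
        ctx, _ = self.attn(query, seq, seq)             # relation-aware context
        scores = seq + ctx                              # broadcast fuse over tokens
        start_logits = self.start_head(scores).squeeze(-1)  # entity start pointer
        end_logits = self.end_head(scores).squeeze(-1)       # entity end pointer
        return rel_logits, start_logits, end_logits
```

In this sketch a single candidate relation conditions the pointer heads; at inference one would enumerate the relations flagged by rel_logits and decode start/end pairs separately for each, which is one common way a pointer-network decoder can emit triples that share (overlap on) the same entity spans.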