Construction of Power Knowledge Graph based on Entity Relation Extraction

Qiong-lan Na, Dan Su, Jiaojiao Zhang, Xin Li, Na Xiao
DOI: 10.1109/IIP57348.2022.00022
Published in: 2022 4th International Conference on Intelligent Information Processing (IIP), October 2022
Citations: 0

Abstract

In order to integrate the fragmented text data in the power domain and solve the problems of disordered and weakly correlated transmission protocols, an improved BERT model combining deep learning and knowledge graphs was proposed for entity relation extraction in the power domain. The method uses a BERT model with whole-word masking to generate sentence vectors and contextually informed word vectors, then averages the word vectors to obtain entity vectors. The sentence vectors and entity vectors are combined by an attention mechanism. Finally, the combined vectors are fed into a fully connected layer for sequence labeling, and the optimal tag is selected to produce the extracted entities. Experimental results show that, when entity extraction is performed on a corpus of transmission procedures, the method achieves a precision of 90.12%, a recall of 85.25%, and an F1 score of 87.56%.
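The pipeline described above (mean-pooled entity vectors, attention fusion with a sentence vector, and a fully connected tagging layer) can be sketched roughly as follows. The paper's actual dimensions, attention formulation, and decoder are not given in the abstract, so every parameter and shape below is an illustrative assumption, not the authors' implementation; the random arrays stand in for a whole-word-masking BERT encoder's output.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden, num_tags = 8, 5  # tiny illustrative sizes, not the paper's values

# Stand-ins for learned parameters and for BERT's contextual word vectors.
W_attn = rng.standard_normal((2 * hidden, 1))    # attention projection
W_cls = rng.standard_normal((hidden, num_tags))  # fully connected tagger
token_vecs = rng.standard_normal((16, hidden))   # one sentence, 16 tokens
entity_mask = np.zeros(16)
entity_mask[3:6] = 1.0                           # tokens 3..5 form the entity

# Sentence vector: mean over all contextual word vectors.
sent_vec = token_vecs.mean(axis=0)
# Entity vector: mean over the word vectors inside the entity span.
ent_vec = token_vecs[entity_mask == 1.0].mean(axis=0)
# Attention gate blends the sentence and entity vectors (sigmoid gate is
# one simple choice; the paper's exact attention form is not specified).
gate = 1.0 / (1.0 + np.exp(-np.concatenate([sent_vec, ent_vec]) @ W_attn))
fused = gate * sent_vec + (1.0 - gate) * ent_vec
# Fully connected layer scores every token (plus the fused context) per tag;
# per-token argmax stands in for the optimal-tag search (e.g. Viterbi/CRF).
logits = (token_vecs + fused) @ W_cls            # (16, num_tags)
tags = logits.argmax(axis=1)                     # one tag id per token
print(tags.shape)  # (16,)
```

In a real system the fused vector would be produced by trained attention weights and the tag sequence decoded jointly rather than token by token; the sketch only shows how the intermediate vectors compose.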