Research on LatticeLSTM model based on data enhancement and self-attention mechanism

Yi Pan, Yong Zhou, He Liu, Jintao Zhang, Jiahua Wu
{"title":"Research on LatticeLSTM model based on data enhancement and self-attention mechanism","authors":"Yi Pan, Yong Zhou, He Liu, Jintao Zhang, Jiahua Wu","doi":"10.1117/12.2673458","DOIUrl":null,"url":null,"abstract":"(Named Entity Recognition, NER) and (Relation Extraction, RE) are two basic tasks in Natural LanguageProcessing, NLP). Due to the indistinguishable boundaries between entities in Chinese and the lack of obvious formal signs, Named entity recognition has always been a difficult point in the Chinese field. Although it has made good progress in Chinese, it still lacks the semantic understanding ability in special fields and the effect is not ideal. In this paper, the algorithm of deep learning and self-attention mechanism are deeply studied. By improving LatticeLSTM model and integrating self-attention mechanism, the ability to understand Chinese semantics is improved, and a small amount of labeled data is expanded by data enhancement to build a data set in a special field to complete the task of named entity recognition.","PeriodicalId":176918,"journal":{"name":"2nd International Conference on Digital Society and Intelligent Systems (DSInS 2022)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2nd International Conference on Digital Society and Intelligent Systems (DSInS 2022)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2673458","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Named Entity Recognition (NER) and Relation Extraction (RE) are two fundamental tasks in Natural Language Processing (NLP). Because entity boundaries in Chinese text are hard to delimit and the language lacks obvious formal markers, named entity recognition has long been a difficult problem in Chinese NLP. Although good progress has been made on general Chinese text, models still lack semantic understanding in specialized domains, and their results are unsatisfactory. This paper studies deep-learning algorithms and the self-attention mechanism in depth. By improving the LatticeLSTM model and integrating a self-attention mechanism, the ability to understand Chinese semantics is improved, and a small amount of labeled data is expanded through data augmentation to build a domain-specific dataset for the named entity recognition task.
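
The abstract does not include implementation details. As a rough illustration of the general pattern it describes, the sketch below (not the authors' implementation) layers scaled dot-product self-attention on top of a bidirectional LSTM encoder for character-level tagging; LatticeLSTM proper additionally uses word-lattice cells, which a plain BiLSTM stands in for here. All module names and dimensions are illustrative assumptions.

```python
# Minimal sketch: character embeddings -> BiLSTM -> self-attention -> tag logits.
# Illustrative only; a plain BiLSTM stands in for the lattice word cells.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttention(nn.Module):
    """Scaled dot-product self-attention over encoder hidden states."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)
        self.scale = hidden_dim ** 0.5

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim)
        q, k, v = self.query(h), self.key(h), self.value(h)
        scores = torch.bmm(q, k.transpose(1, 2)) / self.scale   # (batch, seq, seq)
        weights = F.softmax(scores, dim=-1)
        return torch.bmm(weights, v)                            # context-enriched states


class AttentiveLSTMTagger(nn.Module):
    """Per-character tag logits from a BiLSTM encoder with self-attention."""

    def __init__(self, vocab_size: int, embed_dim: int, hidden_dim: int, num_tags: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2,
                            batch_first=True, bidirectional=True)
        self.attn = SelfAttention(hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_tags)

    def forward(self, char_ids: torch.Tensor) -> torch.Tensor:
        h, _ = self.lstm(self.embed(char_ids))
        return self.classifier(self.attn(h))                    # (batch, seq_len, num_tags)


if __name__ == "__main__":
    model = AttentiveLSTMTagger(vocab_size=5000, embed_dim=100, hidden_dim=200, num_tags=9)
    logits = model(torch.randint(0, 5000, (2, 30)))
    print(logits.shape)  # torch.Size([2, 30, 9])
```

Self-attention lets each character attend directly to every other character in the sentence, which helps compensate for the absence of explicit word boundaries in Chinese; in the paper's model this mechanism is integrated with LatticeLSTM rather than the plain BiLSTM used above.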
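
The abstract also says a small labeled set is expanded through data augmentation, without detailing the method. Below is a minimal sketch of one common NER augmentation strategy, same-type entity mention replacement; `mention_pool` and the character-level BIO re-tagging are illustrative assumptions, not the paper's procedure.

```python
# Minimal sketch of same-type entity mention replacement for NER augmentation.
import random


def augment_sentence(chars, spans, mention_pool, rng=None):
    """Replace each labeled entity with a random same-type mention and
    rebuild character-level BIO tags for the new sentence.

    chars: list of characters, e.g. list("张三在北京工作")
    spans: [(start, end, type)] with end exclusive, sorted by start
    mention_pool: entity type -> list of replacement mention strings
    """
    rng = rng or random.Random(0)
    new_chars, new_tags, prev = [], [], 0
    for start, end, etype in spans:
        # copy the unlabeled context before this entity
        new_chars.extend(chars[prev:start])
        new_tags.extend("O" for _ in range(start - prev))
        # swap in a same-type mention and tag it B-/I-
        mention = rng.choice(mention_pool[etype])
        new_chars.extend(mention)
        new_tags.extend([f"B-{etype}"] + [f"I-{etype}"] * (len(mention) - 1))
        prev = end
    new_chars.extend(chars[prev:])
    new_tags.extend("O" for _ in range(len(chars) - prev))
    return new_chars, new_tags


# Example: swap the person and location mentions in one labeled sentence.
chars = list("张三在北京工作")
spans = [(0, 2, "PER"), (3, 5, "LOC")]
pool = {"PER": ["李雷", "韩梅梅"], "LOC": ["上海", "深圳"]}
print(augment_sentence(chars, spans, pool))
```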