Research on LatticeLSTM model based on data enhancement and self-attention mechanism

Yi Pan, Yong Zhou, He Liu, Jintao Zhang, Jiahua Wu

2nd International Conference on Digital Society and Intelligent Systems (DSInS 2022), published 2023-04-03
DOI: 10.1117/12.2673458
Abstract
Named Entity Recognition (NER) and Relation Extraction (RE) are two basic tasks in Natural Language Processing (NLP). Because entity boundaries in Chinese text are hard to distinguish and lack obvious formal markers, named entity recognition has long been a difficult problem for Chinese. Although good progress has been made on general Chinese NER, models still lack semantic understanding in specialized domains, and their performance there is not ideal. This paper studies deep learning algorithms and the self-attention mechanism in depth. By improving the LatticeLSTM model and integrating a self-attention mechanism, the model's ability to understand Chinese semantics is improved; in addition, a small amount of labeled data is expanded through data augmentation to build a specialized-domain dataset and complete the named entity recognition task.
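The abstract does not give the authors' equations, but the self-attention component they integrate is conventionally the scaled dot-product form applied over the encoder's per-character hidden states. The sketch below is a minimal NumPy illustration of that idea, assuming Q = K = V = H (the LatticeLSTM outputs) with no learned projection matrices; the paper's actual parameterization may differ.

```python
import numpy as np

def self_attention(H, d_k=None):
    """Scaled dot-product self-attention over a sequence of hidden states.

    H: (seq_len, hidden_dim) array, e.g. the per-character outputs of a
    (Lattice)LSTM encoder. For illustration, queries, keys, and values are
    all H itself; real models apply learned linear projections first.
    """
    d_k = d_k or H.shape[-1]
    scores = H @ H.T / np.sqrt(d_k)               # (seq_len, seq_len) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ H                            # context-enriched states

# Example: a 4-character sequence with 8-dimensional hidden states
H = np.random.default_rng(0).normal(size=(4, 8))
out = self_attention(H)
print(out.shape)  # (4, 8)
```

Each output row is a convex combination of all hidden states, which is what lets every character attend to distant context; this is the property the authors rely on to improve semantic understanding beyond the LSTM's sequential memory.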