A Transformer-Based Longer Entity Attention Model for Chinese Named Entity Recognition in Aerospace

Shuai Gong, Xiong Xiong, Yunfei Liu, Shengyang Li, Anqi Liu
{"title":"基于变换的航天中文命名实体识别长实体注意模型","authors":"Shuai Gong, Xiong Xiong, Yunfei Liu, Shengyang Li, Anqi Liu","doi":"10.1109/AEMCSE55572.2022.00077","DOIUrl":null,"url":null,"abstract":"Chinese aerospace knowledge includes many long entities, such as professional terms, equipment names, and cabinets. However, current Named Entity Recognition (NER) algorithms typically address these longer and shorter entities uniformly. In this paper, a Longer Entity Attention (LEA) model based on the transformer is proposed. After the transformer encoding layer, LEA integrates sentence tags, sets thresholds according to the length of entities, and processes the hidden layer features of entities larger than the defined threshold to enhance the ability of the model to recognize longer entities. In addition, we construct an Aerospace Chinese NER dataset (ACNE) containing rich entity categories and domain knowledge. Experimental results demonstrate that LEA outperforms previous state-of-the-art models on ACNE, and shows a significant improvement on longer entities in each threshold range on OntoNotes 5.0 and ACNE datasets.","PeriodicalId":309096,"journal":{"name":"2022 5th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE)","volume":"111 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Transformer-Based Longer Entity Attention Model for Chinese Named Entity Recognition in Aerospace\",\"authors\":\"Shuai Gong, Xiong Xiong, Yunfei Liu, Shengyang Li, Anqi Liu\",\"doi\":\"10.1109/AEMCSE55572.2022.00077\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Chinese aerospace knowledge includes many long entities, such as professional terms, equipment names, and cabinets. However, current Named Entity Recognition (NER) algorithms typically address these longer and shorter entities uniformly. In this paper, a Longer Entity Attention (LEA) model based on the transformer is proposed. After the transformer encoding layer, LEA integrates sentence tags, sets thresholds according to the length of entities, and processes the hidden layer features of entities larger than the defined threshold to enhance the ability of the model to recognize longer entities. In addition, we construct an Aerospace Chinese NER dataset (ACNE) containing rich entity categories and domain knowledge. 
Experimental results demonstrate that LEA outperforms previous state-of-the-art models on ACNE, and shows a significant improvement on longer entities in each threshold range on OntoNotes 5.0 and ACNE datasets.\",\"PeriodicalId\":309096,\"journal\":{\"name\":\"2022 5th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE)\",\"volume\":\"111 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 5th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/AEMCSE55572.2022.00077\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 5th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AEMCSE55572.2022.00077","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Chinese aerospace knowledge includes many long entities, such as professional terms, equipment names, and cabinets. However, current Named Entity Recognition (NER) algorithms typically treat these longer and shorter entities uniformly. In this paper, a Transformer-based Longer Entity Attention (LEA) model is proposed. After the transformer encoding layer, LEA integrates sentence tags, sets thresholds according to entity length, and further processes the hidden-layer features of entities longer than the defined threshold, enhancing the model's ability to recognize longer entities. In addition, we construct an Aerospace Chinese NER dataset (ACNE) containing rich entity categories and domain knowledge. Experimental results demonstrate that LEA outperforms previous state-of-the-art models on ACNE and shows a significant improvement on longer entities in each threshold range on the OntoNotes 5.0 and ACNE datasets.