{"title":"结合边缘锐化和协方差注意的命名实体识别","authors":"Caiwei Yang, Yanping Chen, Shuai Yu, Ruizhang Huang, Yongbin Qin","doi":"10.1016/j.neucom.2025.130402","DOIUrl":null,"url":null,"abstract":"<div><div>Named Entity Recognition (NER) is a key application in the field of Artificial Intelligence and Natural Language Processing, which automatically identifies and categorizes entities in text by intelligent algorithms. In NER, all spans of a sentence can be organized into a two-dimensional representation. The semantic plane has the advantage to represent the semantic structure of a sentence and to learn the interaction between spans. One of an important phenomenon of this representation is that neighboring elements of the semantic plane are spans denoted to overlapped subsequences in a sentence. Because they share the same contextual features and semantic dependencies, it is difficult to distinguish true entities from the backgrounds. Therefore, refining span representations and building the semantic dependency between spans is helpful for the entity recognition task. In this paper, we propose an Edge Sharpening and Covariance Attention (ES&CA) model to support recognizing named entities from the semantic plane representation. The edge sharpening (ES) module adopts a differential convolution to sharpen the semantic gradients in the semantic plane, which has the ability to gather semantic information from neighborhoods. In the covariance attention (CA) module, the covariance between spans are applied to weight the attention of spans relevant to task-relevant learning objective. Establishing semantic relationships across spans is a highly successful approach. The ES&CA model is assessed on five public datasets for nested and flattened named entity recognition. The evaluation results demonstrate the effectiveness of our strategy in distinguishing entity spans from the backgrounds, hence significantly enhancing the final performance.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"643 ","pages":"Article 130402"},"PeriodicalIF":5.5000,"publicationDate":"2025-05-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Incorporating edge sharpening and covariance attention for named entity recognition\",\"authors\":\"Caiwei Yang, Yanping Chen, Shuai Yu, Ruizhang Huang, Yongbin Qin\",\"doi\":\"10.1016/j.neucom.2025.130402\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Named Entity Recognition (NER) is a key application in the field of Artificial Intelligence and Natural Language Processing, which automatically identifies and categorizes entities in text by intelligent algorithms. In NER, all spans of a sentence can be organized into a two-dimensional representation. The semantic plane has the advantage to represent the semantic structure of a sentence and to learn the interaction between spans. One of an important phenomenon of this representation is that neighboring elements of the semantic plane are spans denoted to overlapped subsequences in a sentence. Because they share the same contextual features and semantic dependencies, it is difficult to distinguish true entities from the backgrounds. Therefore, refining span representations and building the semantic dependency between spans is helpful for the entity recognition task. 
In this paper, we propose an Edge Sharpening and Covariance Attention (ES&CA) model to support recognizing named entities from the semantic plane representation. The edge sharpening (ES) module adopts a differential convolution to sharpen the semantic gradients in the semantic plane, which has the ability to gather semantic information from neighborhoods. In the covariance attention (CA) module, the covariance between spans are applied to weight the attention of spans relevant to task-relevant learning objective. Establishing semantic relationships across spans is a highly successful approach. The ES&CA model is assessed on five public datasets for nested and flattened named entity recognition. The evaluation results demonstrate the effectiveness of our strategy in distinguishing entity spans from the backgrounds, hence significantly enhancing the final performance.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":\"643 \",\"pages\":\"Article 130402\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2025-05-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231225010744\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225010744","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Incorporating edge sharpening and covariance attention for named entity recognition
Named Entity Recognition (NER) is a key application in Artificial Intelligence and Natural Language Processing that automatically identifies and categorizes entities in text using intelligent algorithms. In NER, all spans of a sentence can be organized into a two-dimensional representation. This semantic plane has the advantage of representing the semantic structure of a sentence and of learning the interactions between spans. An important property of this representation is that neighboring elements of the semantic plane correspond to overlapping subsequences of the sentence. Because such spans share the same contextual features and semantic dependencies, it is difficult to distinguish true entities from the background. Refining span representations and modeling the semantic dependencies between spans is therefore helpful for entity recognition. In this paper, we propose an Edge Sharpening and Covariance Attention (ES&CA) model that recognizes named entities from the semantic plane representation. The edge sharpening (ES) module applies a differential convolution to sharpen the semantic gradients in the semantic plane, gathering semantic information from neighborhoods. In the covariance attention (CA) module, the covariance between spans is used to weight attention toward spans relevant to the task-specific learning objective, establishing semantic relationships across spans. The ES&CA model is evaluated on five public datasets for nested and flat named entity recognition. The results demonstrate the effectiveness of our strategy in distinguishing entity spans from the background and show that it significantly improves final performance.
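The abstract does not give the ES module's exact formulation, so the following is only a minimal PyTorch sketch of the general idea: a fixed Laplacian-style difference kernel applied depthwise over the 2D span grid, so that cells differing from their neighborhood are amplified. The class name `EdgeSharpening`, the tensor layout `(batch, hidden, L, L)`, and the residual connection are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EdgeSharpening(nn.Module):
    """Sharpen a 2D span grid with a fixed Laplacian-style (differential) kernel.

    The span grid has shape (batch, hidden, L, L): cell (i, j) holds the
    representation of the span covering tokens i..j. A depthwise convolution
    with a difference kernel amplifies cells that differ from their neighbors,
    i.e. it sharpens the "semantic gradients" between adjacent spans.
    (Sketch based on the abstract only; shapes and layout are assumptions.)
    """

    def __init__(self, hidden_dim: int):
        super().__init__()
        # 3x3 Laplacian kernel: center minus its 4-neighborhood.
        kernel = torch.tensor([[0., -1., 0.],
                               [-1., 4., -1.],
                               [0., -1., 0.]])
        # One copy of the kernel per channel (depthwise convolution).
        self.register_buffer("kernel", kernel.expand(hidden_dim, 1, 3, 3).clone())
        self.hidden_dim = hidden_dim

    def forward(self, span_grid: torch.Tensor) -> torch.Tensor:
        # span_grid: (batch, hidden, L, L)
        edges = F.conv2d(span_grid, self.kernel, padding=1, groups=self.hidden_dim)
        # Residual connection: keep the original representation and add the
        # sharpened differences gathered from the neighborhood.
        return span_grid + edges


if __name__ == "__main__":
    grid = torch.randn(2, 64, 10, 10)   # batch=2, hidden=64, sentence length=10
    sharpened = EdgeSharpening(64)(grid)
    print(sharpened.shape)              # torch.Size([2, 64, 10, 10])
```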
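Similarly, the abstract only states that the covariance between spans is used to weight attention; the sketch below shows one way such a mechanism could look, not the paper's design: the span plane is flattened to a span list, pairwise covariance between mean-centered span vectors serves as attention logits, and the attended mixture is added back through an output projection. All names, shapes, and the residual layout are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CovarianceAttention(nn.Module):
    """Weight span representations by their pairwise covariance.

    Spans whose mean-centered features co-vary strongly attend to each other
    more, which is one way to build cross-span semantic dependencies.
    (Hypothetical sketch; the paper's exact formulation is not reproduced here.)
    """

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.out_proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, span_grid: torch.Tensor) -> torch.Tensor:
        # span_grid: (batch, L, L, hidden) -> flatten the plane to a span list.
        b, L, _, d = span_grid.shape
        spans = span_grid.reshape(b, L * L, d)

        # Covariance between spans: center each span's features, then take
        # inner products across the feature dimension.
        centered = spans - spans.mean(dim=-1, keepdim=True)
        cov = centered @ centered.transpose(1, 2) / (d - 1)   # (b, L*L, L*L)

        # Use the covariance as attention logits over all other spans.
        attn = F.softmax(cov, dim=-1)
        mixed = attn @ spans                                   # (b, L*L, d)

        out = spans + self.out_proj(mixed)                     # residual connection
        return out.reshape(b, L, L, d)


if __name__ == "__main__":
    grid = torch.randn(2, 10, 10, 64)
    out = CovarianceAttention(64)(grid)
    print(out.shape)    # torch.Size([2, 10, 10, 64])
```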
Journal introduction:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.