Hong Ming , Jiaoyun Yang , Shuo Liu , Lili Jiang , Ning An
Journal: Engineering Applications of Artificial Intelligence, Volume 156, Article 110992
DOI: 10.1016/j.engappai.2025.110992
Publication date: 2025-05-21
Impact factor: 7.5 (Q1, Automation & Control Systems)
Harnessing high-quality pseudo-labels for robust few-shot nested named entity recognition
Few-shot Named Entity Recognition (NER) methods have shown initial effectiveness in flat NER tasks. However, these methods often prioritize optimizing models with a small annotated support set, neglecting the high-quality data within the unlabeled query set. Furthermore, existing few-shot NER models struggle with nested entities due to linguistic and structural complexities. In this study, we introduce Retrieving high-quality pseudo-label Tuning (RiTNER), a framework designed to address few-shot nested named entity recognition tasks by leveraging high-quality data from the query set. RiTNER comprises two main components: (1) contrastive span classification, which clusters entities around corresponding prototypes and generates high-quality pseudo-labels from the unlabeled data, and (2) masked pseudo-data tuning, which generates a masked pseudo dataset and then uses it to optimize the model and enhance span classification. We train RiTNER on an English dataset and evaluate it on both English nested datasets and cross-lingual nested datasets. The results show that RiTNER outperforms the top-performing baseline models by 1.67% and 3.04% on the English 5-shot task and the cross-lingual 5-shot tasks, respectively.
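The first component described above — clustering candidate spans around class prototypes and keeping only high-quality assignments as pseudo-labels — can be sketched in a few lines. This is an illustrative minimal version, not RiTNER's actual implementation: the cosine-similarity measure, the confidence threshold, and the function name are assumptions introduced here for clarity.

```python
import numpy as np

def nearest_prototype_pseudo_labels(span_embeddings, prototypes, threshold=0.9):
    """Assign each candidate span to its nearest class prototype and
    keep only high-confidence assignments as pseudo-labels.

    Illustrative sketch: cosine similarity and a fixed threshold are
    assumptions, not the paper's exact scoring or selection rule.
    """
    # Normalize rows so the dot product equals cosine similarity.
    s = span_embeddings / np.linalg.norm(span_embeddings, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sims = s @ p.T                      # shape: (n_spans, n_classes)
    labels = sims.argmax(axis=1)        # nearest prototype per span
    confidence = sims.max(axis=1)
    # Retain only high-quality pseudo-labels for the tuning step.
    keep = confidence >= threshold
    return labels[keep], np.flatnonzero(keep)
```

Spans whose embeddings sit between prototypes fall below the threshold and are discarded, which is the sense in which only "high-quality" pseudo-labels feed the subsequent masked pseudo-data tuning stage.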
Journal introduction:
Artificial Intelligence (AI) is pivotal in driving the fourth industrial revolution, witnessing remarkable advancements across various machine learning methodologies. AI techniques have become indispensable tools for practicing engineers, enabling them to tackle previously insurmountable challenges. Engineering Applications of Artificial Intelligence serves as a global platform for the swift dissemination of research elucidating the practical application of AI methods across all engineering disciplines. Submitted papers are expected to present novel aspects of AI utilized in real-world engineering applications, validated using publicly available datasets to ensure the replicability of research outcomes. Join us in exploring the transformative potential of AI in engineering.