Qi Cheng , Liqiong Chen , Zhixing Hu , Juan Tang , Qiang Xu , Binbin Ning
Natural Language Processing Journal, Volume 8, Article 100099. Published 2024-08-24. DOI: 10.1016/j.nlp.2024.100099. Open-access article available at https://www.sciencedirect.com/science/article/pii/S2949719124000475
A novel prompting method for few-shot NER via LLMs
Large Language Models (LLMs) have made significant strides across a variety of natural language processing tasks. Researchers leverage prompting methods to guide LLMs in accomplishing specific tasks under few-shot conditions. However, prevailing prompting methods for LLMs focus mainly on guiding generative tasks, and applying existing prompts can yield poor performance on Named Entity Recognition (NER) tasks. To tackle this challenge, we propose a novel prompting method for few-shot NER. Building on existing prompting methods, we devise a standardized prompt tailored to the use of LLMs in NER tasks. Specifically, we structure the prompt into three components: task definition, few-shot demonstration, and output format. The task definition guides LLMs in performing NER tasks, the few-shot demonstration helps LLMs understand NER task objectives through concrete output examples, and the output format constrains LLMs’ output to prevent the generation of unnecessary results. The content of each component is tailored specifically to NER tasks. Moreover, for the few-shot demonstrations within the prompt, we propose a selection strategy that uses feedback from LLMs’ outputs to identify more suitable demonstrations. Additionally, to improve entity recognition performance, we enrich the prompt by summarizing error examples from the LLMs’ output process and integrating them as additional prompts.
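The abstract describes a three-component prompt (task definition, few-shot demonstration, output format) but the paper's exact templates are not reproduced here. The sketch below is only an illustrative assumption of how such a prompt might be assembled; the wording of each component, the entity label set, and the demonstration format are all hypothetical, not taken from the paper.

```python
# Illustrative sketch only: the exact prompt text used in the paper is not
# published in this abstract. All templates and labels here are hypothetical.

def build_ner_prompt(sentence, demonstrations, entity_types=("PER", "ORG", "LOC")):
    """Assemble a three-part NER prompt: task definition,
    few-shot demonstrations, and an output-format constraint."""
    # 1) Task definition: tells the LLM what to extract.
    task_definition = (
        "Extract named entities of types "
        + ", ".join(entity_types)
        + " from the input sentence."
    )
    # 2) Few-shot demonstrations: (sentence, expected-entities) pairs
    #    showing the task objective through concrete output examples.
    demo_block = "\n".join(
        f"Sentence: {text}\nEntities: {entities}"
        for text, entities in demonstrations
    )
    # 3) Output format: constrains the response to suppress extra text.
    output_format = (
        "Answer only with a list of (entity, type) pairs; "
        "output [] if no entity is present."
    )
    return (
        f"{task_definition}\n\n{demo_block}\n\n{output_format}"
        f"\n\nSentence: {sentence}\nEntities:"
    )

prompt = build_ner_prompt(
    "Tim Cook visited Berlin.",
    [("Apple hired Alice.", "[('Apple', 'ORG'), ('Alice', 'PER')]")],
)
```

The demonstration-selection and error-example strategies from the abstract would then operate on the `demonstrations` argument: candidate demonstrations could be scored by the LLM's output quality when included, and summarized error examples appended as a further prompt component.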