A novel prompting method for few-shot NER via LLMs

Qi Cheng, Liqiong Chen, Zhixing Hu, Juan Tang, Qiang Xu, Binbin Ning
{"title":"A novel prompting method for few-shot NER via LLMs","authors":"Qi Cheng ,&nbsp;Liqiong Chen ,&nbsp;Zhixing Hu ,&nbsp;Juan Tang ,&nbsp;Qiang Xu ,&nbsp;Binbin Ning","doi":"10.1016/j.nlp.2024.100099","DOIUrl":null,"url":null,"abstract":"<div><p>In various natural language processing tasks, significant strides have been made by Large Language Models (LLMs). Researchers leverage prompt method to conduct LLMs in accomplishing specific tasks under few-shot conditions. However, the prevalent use of LLMs’ prompt methods mainly focuses on guiding generative tasks, and employing existing prompts may result in poor performance in Named Entity Recognition (NER) tasks. To tackle this challenge, we propose a novel prompting method for few-shot NER. By enhancing existing prompt methods, we devise a standardized prompts tailored for the utilization of LLMs in NER tasks. Specifically, we structure the prompts into three components: task definition, few-shot demonstration, and output format. The task definition conducts LLMs in performing NER tasks, few-shot demonstration assists LLMs in understanding NER task objectives through specific output demonstration, and output format restricts LLMs’ output to prevent the generation of unnecessary results. The content of these components has been specifically tailored for NER tasks. Moreover, for the few-shot demonstration within the prompts, we propose a selection strategy that utilizes feedback from LLMs’ outputs to identify more suitable few-shot demonstration as prompts. Additionally, to enhance entity recognition performance, we enrich the prompts by summarizing error examples from the output process of LLMs and integrating them as additional prompts.</p></div>","PeriodicalId":100944,"journal":{"name":"Natural Language Processing Journal","volume":"8 ","pages":"Article 100099"},"PeriodicalIF":0.0000,"publicationDate":"2024-08-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2949719124000475/pdfft?md5=e7e56213f461ce5ea69e8b3be1581d14&pid=1-s2.0-S2949719124000475-main.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Natural Language Processing Journal","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2949719124000475","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Large Language Models (LLMs) have made significant strides across a variety of natural language processing tasks. Researchers commonly use prompting methods to guide LLMs in accomplishing specific tasks under few-shot conditions. However, prevailing prompt methods mainly target generative tasks, and applying them directly can yield poor performance on Named Entity Recognition (NER) tasks. To tackle this challenge, we propose a novel prompting method for few-shot NER. Building on existing prompt methods, we devise a standardized prompt tailored to the use of LLMs for NER. Specifically, we structure the prompt into three components: task definition, few-shot demonstration, and output format. The task definition directs the LLM to perform NER, the few-shot demonstrations help the LLM understand the task objective through concrete output examples, and the output format constrains the LLM's output to prevent the generation of unnecessary results. The content of each component is tailored specifically to NER. Moreover, for the few-shot demonstrations within the prompt, we propose a selection strategy that uses feedback from the LLM's outputs to identify more suitable demonstrations. Additionally, to further improve entity recognition performance, we enrich the prompt by summarizing error examples from the LLM's outputs and integrating them as additional prompts.
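The abstract page does not include the authors' code, but the three-component prompt and the feedback-driven demonstration selection it describes lend themselves to a short sketch. The following Python is a minimal illustration under stated assumptions, not the authors' implementation: `llm_complete` stands in for any text-completion API call, and all names (`build_prompt`, `select_demonstrations`, the entity types, the JSON output convention) are hypothetical choices made for this sketch.

```python
# Minimal sketch (not the authors' released code) of a three-part NER prompt:
# task definition + few-shot demonstrations + output format, plus a simple
# feedback-based demonstration-selection loop. `llm_complete` is assumed to be
# any callable that maps a prompt string to the model's completion string.
from typing import Callable

TASK_DEFINITION = (
    "You are a named entity recognition system. Extract all PERSON, "
    "LOCATION, and ORGANIZATION entities from the input sentence."
)
OUTPUT_FORMAT = (
    'Return only a JSON list of [entity, type] pairs, e.g. '
    '[["Paris", "LOCATION"]]. Do not output anything else.'
)

Example = tuple[str, list[tuple[str, str]]]  # (sentence, gold entities)

def expected_output(entities: list[tuple[str, str]]) -> str:
    """Serialize gold entities in the exact format the model must produce."""
    return "[" + ", ".join(f'["{e}", "{t}"]' for e, t in entities) + "]"

def format_demo(sentence: str, entities: list[tuple[str, str]]) -> str:
    """Render one labeled example as a few-shot demonstration."""
    return f"Sentence: {sentence}\nEntities: {expected_output(entities)}"

def build_prompt(demos: list[str], sentence: str) -> str:
    """Assemble the three components, then append the query sentence."""
    return (
        TASK_DEFINITION + "\n\n" + "\n\n".join(demos) + "\n\n"
        + OUTPUT_FORMAT + "\n\n" + f"Sentence: {sentence}\nEntities:"
    )

def select_demonstrations(
    candidates: list[Example],
    dev_set: list[Example],
    llm_complete: Callable[[str], str],
    k: int = 3,
) -> list[str]:
    """Feedback-driven selection: score each candidate by how many dev
    sentences the model labels correctly when that candidate is the sole
    demonstration, then keep the top-k. A simplified stand-in for the
    paper's selection strategy, not its actual procedure."""
    scored = []
    for sent, ents in candidates:
        demo = format_demo(sent, ents)
        hits = sum(
            llm_complete(build_prompt([demo], s)).strip() == expected_output(e)
            for s, e in dev_set
        )
        scored.append((hits, demo))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [demo for _, demo in scored[:k]]
```

The error-example enrichment the abstract mentions could be layered on in the same spirit: sentences the model mislabels on the dev set, paired with their corrected outputs, could be appended to the prompt as additional demonstrations.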
