SAKP: A Korean Sentiment Analysis Model via Knowledge Base and Prompt Tuning

Haiqiang Wen, Zhenguo Zhang
{"title":"SAKP: A Korean Sentiment Analysis Model via Knowledge Base and Prompt Tuning","authors":"Haiqiang Wen, Zhenguo Zhang","doi":"10.1109/CCAI57533.2023.10201257","DOIUrl":null,"url":null,"abstract":"With the help of pre-trained language models, tasks such as sentiment analysis and text classification have achieved good results. With the advent of prompt tuning, especially previous studies have shown that in the case of few data, the prompt tuning method has significant advantages over the general tuning method of additional classifiers. At present, there are relatively few studies on sentiment analysis of Korean Chinese texts.This paper proposes a low resource sentiment classification method based on pre-trained language models (PLMs) combined with prompt tuning. In this work, we chose to use the pre-trained language model KLUE and elaborated a Korean prompt template with an expanded knowledge base and filtering in the verbalizer section. We focus on collecting external knowledge and integrating it into the utterance to form a prompt tuning of knowledge to improve and stabilize the prompt tuning. Specifically, we use the K-means clustering algorithm to construct the label wordspace of the external knowledge base (kb) extended language, and use PLM itself to refine the extended labeled wordspace before using the extended labeled wordspace for prediction. 
A large number of experiments on the few-shot emotion classification task prove the effectiveness of knowledge prompt tuning.","PeriodicalId":285760,"journal":{"name":"2023 IEEE 3rd International Conference on Computer Communication and Artificial Intelligence (CCAI)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE 3rd International Conference on Computer Communication and Artificial Intelligence (CCAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCAI57533.2023.10201257","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Pre-trained language models have enabled strong results on tasks such as sentiment analysis and text classification. With the advent of prompt tuning, previous studies have shown that, when training data are scarce, prompt tuning offers significant advantages over conventional fine-tuning with an additional classifier head. At present, there are relatively few studies on sentiment analysis of Korean-language texts. This paper proposes a low-resource sentiment classification method that combines pre-trained language models (PLMs) with prompt tuning. In this work, we use the pre-trained language model KLUE and design a Korean prompt template with an expanded knowledge base and filtering in the verbalizer component. We focus on collecting external knowledge and integrating it into the verbalizer to form knowledgeable prompt tuning, which improves and stabilizes prompt tuning. Specifically, we use the K-means clustering algorithm to construct the label-word space expanded from an external knowledge base (KB), and use the PLM itself to refine the expanded label-word space before using it for prediction. Extensive experiments on few-shot sentiment classification tasks demonstrate the effectiveness of knowledgeable prompt tuning.
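The verbalizer-construction steps the abstract describes (expand candidate label words from a knowledge base, partition them into per-class label-word spaces with K-means, then filter outliers before prediction) can be illustrated with a minimal sketch. All names, words, and embeddings below are illustrative stand-ins, not the paper's implementation: in the actual pipeline the embeddings would come from the KLUE PLM and the refinement would use the PLM's own predictions.

```python
# Hedged sketch: cluster KB-expanded candidate label words with K-means,
# then filter words that lie far from their cluster centroid. The words
# and embeddings are synthetic stand-ins for PLM (e.g. KLUE) embeddings.
import numpy as np

def kmeans(X, k, iters=20):
    """Lloyd's algorithm with deterministic farthest-point seeding."""
    centers = [X[0]]
    for _ in range(1, k):
        dists = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[dists.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute means.
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
        for c in range(k):
            if (labels == c).any():
                centers[c] = X[labels == c].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(0)
# Candidate label words gathered from an external knowledge base (illustrative).
candidates = ["good", "great", "fine", "bad", "awful", "poor"]
# Two loose synthetic clusters standing in for positive/negative embeddings.
embeddings = np.vstack([
    rng.normal(1.0, 0.1, size=(3, 4)),    # positive-like words
    rng.normal(-1.0, 0.1, size=(3, 4)),   # negative-like words
])

# Step 1: K-means partitions the expanded word space, one cluster per class.
labels, centers = kmeans(embeddings, k=2)
clusters = {c: [w for w, l in zip(candidates, labels) if l == c] for c in range(2)}

# Step 2: refine each class's label-word space by dropping outliers. A
# distance threshold stands in for the paper's PLM-based refinement.
def refine(cluster_id, threshold=1.0):
    return [w for w, l, e in zip(candidates, labels, embeddings)
            if l == cluster_id and np.linalg.norm(e - centers[cluster_id]) <= threshold]
```

The refined per-class word lists would then serve as the verbalizer: at prediction time, the PLM's probability mass over each class's label words is aggregated into a class score.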