Prompt and Contrastive Learning for Few-shot Sentiment Classification

Fei Wang, Long Chen, Xiaohua Huang, Cai Xu, Wei Zhao, Ziyu Guan, Guangyue Lu
DOI: 10.1145/3573942.3573969
Published in: Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition
Publication date: 2022-09-23
Citation count: 0

Abstract

Prompt and Contrastive Learning for Few-shot Sentiment Classification
Sentiment classification is a hot topic in the field of natural language processing. Currently, state-of-the-art classification models follow two steps: pre-training a large language model on upstream tasks, and then using human-labeled data to fine-tune a task-related model. However, there is a large gap between the upstream tasks of the pre-trained model and the downstream tasks being performed, resulting in the need for more labeled data to achieve excellent performance. Manually annotating data is expensive. In this paper, we propose a few-shot sentiment classification method based on Prompt and Contrastive Learning (PCL), which can significantly improve the performance of large-scale pre-trained language models in low-data and high-data regimes. Prompt learning aims to alleviate the gap between upstream and downstream tasks, and the contrastive learning is designed to capture the inter-class and intra-class distribution patterns of labeled data. Thanks to the integration of the two strategies, PCL markedly exceeds baselines with low resources. Extensive experiments on three datasets show that our method has outstanding performance in the few-shot settings.
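The abstract names the two components of PCL but gives no equations or code. The sketch below illustrates the standard techniques those names usually refer to; it is not the authors' implementation, and the template, label words, function names, and loss weighting are all assumptions. Prompt learning typically recasts classification as a cloze task scored by a masked language model, while the contrastive term is commonly a supervised contrastive (SupCon-style) loss that pulls same-label embeddings together (intra-class) and pushes different-label embeddings apart (inter-class):

```python
import numpy as np

# --- Prompt side (illustrative sketch, not the authors' code) ---
# An input such as "The movie was fantastic." is wrapped in a template like
# "The movie was fantastic. It was [MASK]." and a masked language model
# scores label words at the [MASK] position (e.g. "great" -> positive,
# "terrible" -> negative). Given the MLM logits at [MASK], a verbalizer
# maps them to class probabilities:

def verbalizer_probs(mask_logits, label_word_ids):
    """mask_logits: (vocab_size,) logits at the [MASK] position.
    label_word_ids: one (hypothetical) vocabulary id per class."""
    picked = mask_logits[label_word_ids]
    picked = picked - picked.max()            # numerical stability
    p = np.exp(picked)
    return p / p.sum()

# --- Contrastive side (SupCon-style sketch) ---
def supcon_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss: same-label pairs are positives
    (intra-class pull), different-label pairs are negatives
    (inter-class push)."""
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature               # scaled cosine similarities
    n = len(labels)
    not_self = ~np.eye(n, dtype=bool)
    pos_mask = (labels[:, None] == labels[None, :]) & not_self
    logits = sim - sim.max(axis=1, keepdims=True)          # stability
    denom = (np.exp(logits) * not_self).sum(axis=1, keepdims=True)
    log_prob = logits - np.log(denom)
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0                    # anchors with >=1 positive
    per_anchor = -(log_prob * pos_mask).sum(axis=1)[valid] / pos_counts[valid]
    return per_anchor.mean()
```

In a training loop of this general shape, the two objectives would be combined, e.g. cross-entropy over the verbalizer probabilities plus a weighted contrastive term; the specific weighting used in PCL is not given in the abstract and would need to be taken from the full paper.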