Bert Adapter and Contrastive Learning for Continual Classification of Aspect Sentiment Task Sequences

{"title":"面向面向情感任务序列连续分类的Bert适配与对比学习","authors":"","doi":"10.15625/2525-2518/17395","DOIUrl":null,"url":null,"abstract":"Task incremental learning, a setting of Continual learning, isan approach to exploit the knowledge from previous tasks for currentlynew task. Task incremental learning aims to solve two big challengesof continual learning: catastrophic forgetting and knowledge transfer orsharing between previous tasks and current task. This paper improveTask incremental learning by (1) transferring the knowledge (not thetraining data) learned from previous tasks to a new task (contrast ofmulti-task learning); (2) to maintain or even improve performance oflearned models from previous tasks with avoid forgetting; (3) to developa continual learning model based on result from (1) and (2) to applyfor aspect sentiment classification. Specifically, we combine two loss baseon contrastive learning modules from Contrastive Knowledge Sharing(CKS) for encouraging knowledge sharing between old and current tasksand improve the performance of the current task by Contrastive Super-vised learning (CSC) module. The experimental results show that ourmethod could get rid of previous learned tasks catastrophic forgettingphenomenon and outperform the previous study for aspect sentimentclassification.","PeriodicalId":23553,"journal":{"name":"Vietnam Journal of Science and Technology","volume":"8 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Bert Adapter and Contrastive Learning for Continual Classification of Aspect Sentiment Task Sequences\",\"authors\":\"\",\"doi\":\"10.15625/2525-2518/17395\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Task incremental learning, a setting of Continual learning, isan approach to exploit the knowledge from previous tasks for currentlynew task. Task incremental learning aims to solve two big challengesof continual learning: catastrophic forgetting and knowledge transfer orsharing between previous tasks and current task. This paper improveTask incremental learning by (1) transferring the knowledge (not thetraining data) learned from previous tasks to a new task (contrast ofmulti-task learning); (2) to maintain or even improve performance oflearned models from previous tasks with avoid forgetting; (3) to developa continual learning model based on result from (1) and (2) to applyfor aspect sentiment classification. Specifically, we combine two loss baseon contrastive learning modules from Contrastive Knowledge Sharing(CKS) for encouraging knowledge sharing between old and current tasksand improve the performance of the current task by Contrastive Super-vised learning (CSC) module. 
The experimental results show that ourmethod could get rid of previous learned tasks catastrophic forgettingphenomenon and outperform the previous study for aspect sentimentclassification.\",\"PeriodicalId\":23553,\"journal\":{\"name\":\"Vietnam Journal of Science and Technology\",\"volume\":\"8 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-04-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Vietnam Journal of Science and Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.15625/2525-2518/17395\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Vietnam Journal of Science and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.15625/2525-2518/17395","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Task incremental learning, a setting of continual learning, is an approach that exploits knowledge from previous tasks for the current new task. It aims to address two major challenges of continual learning: catastrophic forgetting, and knowledge transfer or sharing between previous tasks and the current task. This paper improves task incremental learning by (1) transferring the knowledge (not the training data) learned from previous tasks to a new task, in contrast to multi-task learning; (2) maintaining or even improving the performance of models learned on previous tasks while avoiding forgetting; and (3) developing a continual learning model, based on (1) and (2), that is applied to aspect sentiment classification. Specifically, we combine two contrastive-learning-based losses: a Contrastive Knowledge Sharing (CKS) module that encourages knowledge sharing between old and current tasks, and a Contrastive Supervised learning (CSC) module that improves the performance of the current task. The experimental results show that our method avoids catastrophic forgetting of previously learned tasks and outperforms previous work on aspect sentiment classification.
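The abstract does not give the exact formulations of the CKS and CSC losses, so the following is only a minimal PyTorch sketch, under our own assumptions, of how a supervised contrastive term on current-task adapter features might be combined with a simple knowledge-sharing term against features from frozen previous-task adapters. The function names, the lambda weights, and the cosine-distance form of the sharing term are illustrative, not the paper's definitions.

```python
import torch
import torch.nn.functional as F


def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss in the style of Khosla et al. (2020):
    each anchor is pulled toward same-label examples and pushed from the rest."""
    features = F.normalize(features, dim=1)                  # (N, d) unit vectors
    sim = features @ features.T / temperature                # (N, N) scaled cosine similarity
    n = features.size(0)
    not_self = ~torch.eye(n, dtype=torch.bool, device=features.device)
    pos_mask = ((labels.unsqueeze(0) == labels.unsqueeze(1)) & not_self).float()
    sim = sim.masked_fill(~not_self, float("-inf"))          # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0                                   # anchors with at least one positive
    mean_log_prob_pos = (log_prob * pos_mask).sum(dim=1)[valid] / pos_counts[valid]
    return -mean_log_prob_pos.mean()


def combined_loss(ce_loss, current_feats, labels, old_feats,
                  lambda_csc=0.1, lambda_cks=0.1):
    """Hypothetical total loss: cross-entropy plus a CSC-style supervised
    contrastive term and a CKS-style knowledge-sharing term (assumed weights)."""
    csc = supervised_contrastive_loss(current_feats, labels)
    # Assumed sharing term: keep current-task features close to the features
    # produced by the frozen adapters of previous tasks for the same inputs.
    cks = 1.0 - F.cosine_similarity(current_feats, old_feats, dim=1).mean()
    return ce_loss + lambda_csc * csc + lambda_cks * cks


if __name__ == "__main__":
    feats = torch.randn(8, 768)          # e.g. [CLS] outputs of the current task's BERT adapter
    old = torch.randn(8, 768)            # features from frozen previous-task adapters
    labels = torch.randint(0, 3, (8,))   # sentiment labels (negative / neutral / positive)
    ce = torch.tensor(1.2)               # placeholder cross-entropy value
    print(combined_loss(ce, feats, labels, old).item())
```

The sketch only shows how such losses could be weighted and summed; how the paper actually selects positive pairs across old and new tasks, and how the BERT adapter features are extracted, is specified in the paper itself.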