An adaptive knowledge distillation algorithm for text classification

Zuqin Chen, Tingkai Hu, Chao Chen, Jike Ge, Chengzhi Wu, Wenjun Cheng
{"title":"An adaptive knowledge distillation algorithm for text classification","authors":"Zuqin Chen, Tingkai Hu, Chao Chen, Jike Ge, Chengzhi Wu, Wenjun Cheng","doi":"10.1109/ICESIT53460.2021.9696948","DOIUrl":null,"url":null,"abstract":"Using knowledge distillation to compress pre-trained models such as Bert has proven to be highly effective in text classification tasks. However, the overhead of tuning parameters manually still hinders their application in practice. To alleviate the cost of manual tuning of parameters in training tasks, inspired by the inverse decrease of the word frequency of TF-IDF, this paper proposes an adaptive knowledge distillation method (AKD). This core idea of the method is based on the Cosine similarity score which is calculated by the probabilistic outputs similarity measurement in two networks. The higher the score, the closer the student model's understanding of knowledge is to the teacher model, and the lower the degree of imitation of the teacher model. On the contrary, we need to increase the degree to which the student model imitates the teacher model. Interestingly, this method can improve distillation model quality. Experimental results show that the proposed method significantly improves the precision, recall and F1 value of text classification tasks. However, training speed of AKD is slightly slower than baseline models. This study provides new insights into knowledge distillation.","PeriodicalId":164745,"journal":{"name":"2021 IEEE International Conference on Emergency Science and Information Technology (ICESIT)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Conference on Emergency Science and Information Technology (ICESIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICESIT53460.2021.9696948","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Using knowledge distillation to compress pre-trained models such as BERT has proven highly effective in text classification tasks. However, the overhead of tuning parameters manually still hinders their application in practice. To reduce the cost of manual parameter tuning during training, and inspired by the inverse relationship between weight and word frequency in TF-IDF, this paper proposes an adaptive knowledge distillation method (AKD). The core idea of the method is a cosine similarity score computed between the probabilistic outputs of the two networks. The higher the score, the closer the student model's understanding of the knowledge is to the teacher model's, and the lower the degree to which the student imitates the teacher. Conversely, when the score is low, the degree to which the student model imitates the teacher model is increased. Interestingly, this mechanism improves the quality of the distilled model. Experimental results show that the proposed method significantly improves the precision, recall, and F1 score on text classification tasks, although the training speed of AKD is slightly slower than that of the baseline models. This study provides new insights into knowledge distillation.
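The abstract does not give the exact weighting function, so the following is a minimal sketch of the adaptive idea it describes, written in PyTorch: the distillation term is scaled by one minus the cosine similarity between the student's and teacher's softmax outputs, so a student that is already close to the teacher imitates it less. The function name akd_loss, the temperature value, and the specific combination with the cross-entropy term are illustrative assumptions, not the paper's exact formulation.

```
import torch
import torch.nn.functional as F

def akd_loss(student_logits, teacher_logits, labels, temperature=2.0):
    """Cross-entropy plus a cosine-weighted distillation term (illustrative sketch)."""
    # Softened probability distributions from both networks.
    p_student = F.softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)

    # Cosine similarity between the probabilistic outputs, one score per example.
    sim = F.cosine_similarity(p_student, p_teacher, dim=-1)

    # Higher similarity -> lower imitation weight (the inverse relationship
    # described in the abstract). Detach so the weight carries no gradient.
    alpha = (1.0 - sim).detach()

    # Standard distillation term: KL divergence between softened distributions.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        p_teacher,
        reduction="none",
    ).sum(dim=-1) * (temperature ** 2)

    # Hard-label cross-entropy for the student.
    ce = F.cross_entropy(student_logits, labels, reduction="none")

    return (ce + alpha * kd).mean()

# Usage (teacher outputs detached so only the student is updated):
# loss = akd_loss(student(inputs), teacher(inputs).detach(), labels)
```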