Student-Teacher Oneness: A Storage-efficient approach that improves facial expression recognition

Zhenzhu Zheng, C. Rasmussen, Xi Peng
{"title":"Student-Teacher Oneness: A Storage-efficient approach that improves facial expression recognition","authors":"Zhenzhu Zheng, C. Rasmussen, Xi Peng","doi":"10.1109/ICCVW54120.2021.00453","DOIUrl":null,"url":null,"abstract":"We present Student-Teacher Oneness (STO), a simple but effective approach for online knowledge distillation improves facial expression recognition, without introducing any extra model parameters. Stochastic sub-networks are designed to replace the multi-branch architecture component in current online distillation methods. This leads to a simplified architecture, and yet competitive performances. Under the \"teacher-student\" framework, we construct both teacher and student within the same target network. Student network is the sub-networks which randomly skipping some portions of the full (target) network. The teacher network is the full network, can be considered as the ensemble of all possible student networks. The training process is performed in a closed-loop: (1) Forward prediction contains two passes that generate student and teacher predictions. (2) Backward distillation allows knowledge transfer from the teacher back to students. Comprehensive evaluations show that STO improves the generalization ability of a variety of deep neural networks to a significant margin. The results prove our superior performance in facial expression recognition task on FER-2013 and RAF.","PeriodicalId":226794,"journal":{"name":"2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCVW54120.2021.00453","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

We present Student-Teacher Oneness (STO), a simple but effective approach to online knowledge distillation that improves facial expression recognition without introducing any extra model parameters. Stochastic sub-networks are designed to replace the multi-branch architecture component in current online distillation methods. This leads to a simplified architecture with competitive performance. Under the "teacher-student" framework, we construct both teacher and student within the same target network. The student network is a sub-network that randomly skips some portions of the full (target) network. The teacher network is the full network, which can be considered the ensemble of all possible student networks. Training is performed in a closed loop: (1) forward prediction consists of two passes that generate the student and teacher predictions, and (2) backward distillation transfers knowledge from the teacher back to the students. Comprehensive evaluations show that STO improves the generalization ability of a variety of deep neural networks by a significant margin. The results demonstrate superior performance on the facial expression recognition task on FER-2013 and RAF.
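The abstract describes the closed-loop procedure only at a high level. The sketch below is a minimal, hypothetical PyTorch rendering of that loop under stated assumptions: `StochasticDepthNet` and `SkippableBlock` are illustrative stand-ins for a network whose residual blocks can be randomly skipped (playing the student), a second pass with skipping disabled plays the teacher, and knowledge is transferred with a standard temperature-scaled KL-divergence distillation loss. The names, architecture, and hyperparameters here are assumptions for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SkippableBlock(nn.Module):
    """Residual block that can be stochastically skipped (hypothetical student path)."""
    def __init__(self, dim, keep_prob=0.8):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.keep_prob = keep_prob  # assumed skipping rate; not from the paper

    def forward(self, x, stochastic=True):
        if stochastic and self.training and torch.rand(()) > self.keep_prob:
            return x  # skip this block: a randomly shallower "student" sub-network
        return x + self.body(x)

class StochasticDepthNet(nn.Module):
    """Illustrative target network; teacher and student share ALL parameters."""
    def __init__(self, in_dim=512, dim=256, num_classes=7, depth=6):
        super().__init__()
        self.stem = nn.Linear(in_dim, dim)
        self.blocks = nn.ModuleList([SkippableBlock(dim) for _ in range(depth)])
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x, stochastic=True):
        x = self.stem(x)
        for blk in self.blocks:
            x = blk(x, stochastic=stochastic)
        return self.head(x)

def sto_step(model, optimizer, x, y, T=3.0, alpha=0.5):
    """One sketched closed-loop step: two forward passes, then backward distillation."""
    # (1) Forward prediction: student pass (random skipping) and teacher pass (full network).
    student_logits = model(x, stochastic=True)
    with torch.no_grad():  # teacher supervises; gradients flow through the student pass only
        teacher_logits = model(x, stochastic=False)
    # (2) Backward distillation: task loss plus KL from teacher to student.
    ce = F.cross_entropy(student_logits, y)
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * T * T
    loss = (1 - alpha) * ce + alpha * kd
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = StochasticDepthNet()
    opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    x, y = torch.randn(8, 512), torch.randint(0, 7, (8,))  # dummy batch, 7 expression classes
    print(sto_step(model, opt, x, y))
```

Because the teacher and every student are the same parameter set, the memory footprint matches a single network, which is the storage-efficiency property the title refers to.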