OCFER-Net: Recognizing Facial Expression in Online Learning System

Yi Huo, L. Zhang
{"title":"OCFER-Net: Recognizing Facial Expression in Online Learning System","authors":"Yi Huo, L. Zhang","doi":"10.1145/3531073.3534470","DOIUrl":null,"url":null,"abstract":"Recently, online learning is very popular, especially under the global epidemic of COVID-19. Besides knowledge distribution, emotion interaction is also very important. It can be obtained by employing Facial Expression Recognition (FER). Since the FER accuracy is substantial in assisting teachers to acquire the emotional situation, the project explores a series of FER methods and finds that few works engage in exploiting the orthogonality of convolutional matrix. Therefore, it enforces orthogonality on kernels by a regularizer, which extracts features with more diversity and expressiveness, and delivers OCFER-Net. Experiments are carried out on FER-2013, which is a challenging dataset. Results show superior performance over baselines by 1.087. The code of the research project is publicly available on https://github.com/YeeHoran/OCFERNet..","PeriodicalId":412533,"journal":{"name":"Proceedings of the 2022 International Conference on Advanced Visual Interfaces","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 International Conference on Advanced Visual Interfaces","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3531073.3534470","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Online learning has become very popular recently, especially during the global COVID-19 pandemic. Besides knowledge delivery, emotional interaction is also important, and it can be obtained through Facial Expression Recognition (FER). Since FER accuracy is essential for helping teachers gauge students' emotional state, this project surveys a series of FER methods and finds that few works exploit the orthogonality of convolutional kernel matrices. It therefore enforces orthogonality on the kernels with a regularizer, which extracts more diverse and expressive features, and delivers OCFER-Net. Experiments are carried out on FER-2013, a challenging dataset, and results show an improvement of 1.087 over the baselines. The code of the research project is publicly available at https://github.com/YeeHoran/OCFERNet.
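The core idea of the abstract is an orthogonality regularizer applied to convolutional kernels. Below is a minimal sketch of one common way to realize such a regularizer, assuming a PyTorch implementation and the soft-orthogonality penalty ||WWᵀ − I||²_F; the function names, the weight λ, and the small network are illustrative assumptions for exposition, not the paper's released OCFER-Net code (see the GitHub repository for that).

```python
# Hypothetical sketch of a soft orthogonality regularizer on conv kernels,
# illustrating the idea described in the abstract (not the authors' code).
import torch
import torch.nn as nn


def orthogonality_penalty(conv: nn.Conv2d) -> torch.Tensor:
    """Frobenius-norm penalty encouraging flattened kernels to be orthogonal."""
    # Flatten each output filter into a row: (out_channels, in_channels * kH * kW)
    w = conv.weight.reshape(conv.out_channels, -1)
    gram = w @ w.t()                                  # (out_channels, out_channels)
    identity = torch.eye(conv.out_channels, device=w.device)
    return ((gram - identity) ** 2).sum()


# Illustrative backbone; FER-2013 images are 48x48 grayscale with 7 classes.
model = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 7),
)


def total_loss(logits, targets, lambda_orth=1e-4):
    """Task loss plus the orthogonality penalty summed over all conv layers."""
    ce = nn.functional.cross_entropy(logits, targets)
    orth = sum(orthogonality_penalty(m) for m in model.modules()
               if isinstance(m, nn.Conv2d))
    return ce + lambda_orth * orth
```

The penalty pushes the Gram matrix of the flattened kernels toward the identity, which discourages redundant filters and is one standard way to obtain the "more diverse and expressive" features the abstract refers to.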