{"title":"Expression Complementary Disentanglement Network for Facial Expression Recognition","authors":"Shanmin Wang;Hui Shuai;Lei Zhu;Qingshan Liu","doi":"10.23919/cje.2022.00.351","DOIUrl":null,"url":null,"abstract":"Disentangling facial expressions from other disturbing facial attributes in face images is an essential topic for facial expression recognition. Previous methods only care about facial expression disentanglement (FED) itself, ignoring the negative effects of other facial attributes. Due to the annotations on limited facial attributes, it is difficult for existing FED solutions to disentangle all disturbance from the input face. To solve this issue, we propose an expression complementary disentanglement network (ECDNet). ECDNet proposes to finish the FED task during a face reconstruction process, so as to address all facial attributes during disentanglement. Different from traditional re-construction models, ECDNet reconstructs face images by progressively generating and combining facial appearance and matching geometry. It designs the expression incentive (EIE) and expression inhibition (EIN) mechanisms, inducing the model to characterize the disentangled expression and complementary parts precisely. Facial geometry and appearance, generated in the reconstructed process, are dealt with to represent facial expressions and complementary parts, respectively. The combination of distinctive reconstruction model, EIE, and EIN mechanisms ensures the completeness and exactness of the FED task. Experimental results on RAF-DB, AffectNet, and CAER-S datasets have proven the effectiveness and superiority of ECDNet.","PeriodicalId":50701,"journal":{"name":"Chinese Journal of Electronics","volume":"33 3","pages":"742-752"},"PeriodicalIF":1.6000,"publicationDate":"2024-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10543219","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Chinese Journal of Electronics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10543219/","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
Disentangling facial expressions from other, interfering facial attributes in face images is an essential problem in facial expression recognition. Previous methods focus only on facial expression disentanglement (FED) itself, ignoring the negative effects of other facial attributes. Because annotations cover only a limited set of facial attributes, it is difficult for existing FED solutions to disentangle all disturbances from the input face. To address this issue, we propose an expression complementary disentanglement network (ECDNet). ECDNet performs the FED task during a face reconstruction process, so that all facial attributes are handled during disentanglement. Unlike traditional reconstruction models, ECDNet reconstructs face images by progressively generating and combining facial appearance and matching geometry. It designs expression incentive (EIE) and expression inhibition (EIN) mechanisms, which induce the model to precisely characterize the disentangled expression and its complementary parts. The facial geometry and appearance generated during reconstruction are used to represent the facial expression and the complementary parts, respectively. The combination of the distinctive reconstruction model and the EIE and EIN mechanisms ensures the completeness and exactness of the FED task. Experimental results on the RAF-DB, AffectNet, and CAER-S datasets demonstrate the effectiveness and superiority of ECDNet.
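The abstract gives no implementation details, but the data flow it describes (an encoder whose features split into an expression code and a complementary code, a geometry branch carrying the expression, an appearance branch carrying the complement, and a combiner that reconstructs the face) can be sketched as below. This is a minimal, speculative PyTorch sketch, not the authors' architecture: every module, layer size, and the classification-head stand-in for the EIE mechanism are hypothetical placeholders chosen only to make the described structure concrete and runnable.

```python
# Speculative sketch of the ECDNet data flow described in the abstract.
# All modules and dimensions below are hypothetical placeholders.
import torch
import torch.nn as nn

class ECDNetSketch(nn.Module):
    def __init__(self, feat_dim: int = 256, num_classes: int = 7):
        super().__init__()
        # Shared encoder producing a feature that is later split into an
        # expression code and a complementary code (placeholder conv stack).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, 2 * feat_dim),
        )
        # Geometry decoder: generates a heatmap-like geometry map from the
        # expression code (the abstract says geometry represents expression).
        self.geometry_decoder = nn.Sequential(
            nn.Linear(feat_dim, 64 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
        )
        # Appearance decoder: generates appearance from the complementary code.
        self.appearance_decoder = nn.Sequential(
            nn.Linear(feat_dim, 64 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),
        )
        # Combiner: fuses appearance and geometry into the reconstructed face.
        self.combiner = nn.Conv2d(4, 3, 3, padding=1)
        # Stand-in for the EIE mechanism: classify the expression code.
        self.expr_head = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        code = self.encoder(x)
        expr_code, comp_code = code.chunk(2, dim=1)      # disentangled split
        geometry = self.geometry_decoder(expr_code)      # expression carrier
        appearance = self.appearance_decoder(comp_code)  # complementary carrier
        recon = self.combiner(torch.cat([appearance, geometry], dim=1))
        logits = self.expr_head(expr_code)
        return recon, logits, expr_code, comp_code

# Usage: the placeholder decoders emit 32x32 maps, so reconstruction only
# matches the input resolution for 32x32 inputs in this sketch.
model = ECDNetSketch()
recon, logits, expr_code, comp_code = model(torch.randn(2, 3, 32, 32))
```

In training, the EIE mechanism would plausibly correspond to supervising the expression branch (e.g., a cross-entropy loss on `logits`), while the EIN mechanism would penalize expression information remaining in the complementary code (e.g., an adversarial or prediction-uniformity loss), with a pixel reconstruction loss on `recon` tying the two branches together. These loss choices are assumptions, not details taken from the paper.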
About the Journal
CJE focuses on emerging fields of electronics, publishing innovative and transformative research papers. Most papers published in CJE come from universities and research institutes, presenting their latest research results. Both theoretical and practical contributions are encouraged, and original research papers reporting novel solutions to hot topics in electronics are especially welcome.