Facial Landmark-Driven Keypoint Feature Extraction for Robust Facial Expression Recognition.

IF 3.4 | JCR Q2 (Chemistry, Analytical) | CAS Region 3, Multidisciplinary
Sensors | Pub Date: 2025-06-16 | DOI: 10.3390/s25123762
Jaehyun So, Youngjoon Han
Citations: 0

Abstract

Facial expression recognition (FER) is a core technology that enables computers to understand and react to human emotions. In particular, the use of face alignment algorithms as a preprocessing step in image-based FER is important for accurately normalizing face images in terms of scale, rotation, and translation to improve FER accuracy. Recently, FER studies have been actively leveraging feature maps computed by face alignment networks to enhance FER performance. However, previous studies were limited in their ability to effectively apply information from specific facial regions that are important for FER, as they either only used facial landmarks during the preprocessing step or relied solely on the feature maps from the face alignment networks. In this paper, we propose the use of Keypoint Features extracted from feature maps at the coordinates of facial landmarks. To effectively utilize Keypoint Features, we further propose a Keypoint Feature regularization method using landmark perturbation for robustness, and an attention mechanism that emphasizes all Keypoint Features using representative Keypoint Features derived from a nasal base landmark, which carries information for the whole face, to improve performance. We performed experiments on the AffectNet, RAF-DB, and FERPlus datasets using a simple network design to validate the effectiveness of the proposed method. As a result, the proposed method achieved a performance of 68.17% on AffectNet-7, 64.87% on AffectNet-8, 93.16% on RAF-DB, and 91.44% on FERPlus. Furthermore, the network pretrained on AffectNet-8 achieved improved performance of 94.04% on RAF-DB and 91.66% on FERPlus. These results demonstrate that the proposed Keypoint Features can achieve comparable results to those of the existing methods, highlighting their potential for enhancing FER performance through the effective utilization of key facial region features.
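The core idea in the abstract, sampling one feature vector per facial landmark from an alignment network's feature map, with Gaussian landmark perturbation as a training-time regularizer, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function and parameter names are hypothetical, and nearest-neighbour sampling stands in for whatever interpolation the paper actually uses.

```python
import numpy as np

def extract_keypoint_features(feature_map, landmarks, jitter_std=0.0, rng=None):
    """Sample one C-dimensional Keypoint Feature per landmark.

    feature_map: (C, H, W) array, e.g. from a face-alignment network.
    landmarks:   (N, 2) array of (x, y) coordinates in feature-map space.
    jitter_std:  std-dev of Gaussian perturbation applied to landmark
                 positions (the paper's regularization idea; set > 0
                 only during training).
    Returns an (N, C) array of Keypoint Features.
    """
    C, H, W = feature_map.shape
    pts = landmarks.astype(float)
    if jitter_std > 0:
        rng = rng or np.random.default_rng()
        pts = pts + rng.normal(0.0, jitter_std, pts.shape)
    # Clamp to the valid grid, then nearest-neighbour sample.
    xs = np.clip(np.round(pts[:, 0]).astype(int), 0, W - 1)
    ys = np.clip(np.round(pts[:, 1]).astype(int), 0, H - 1)
    return feature_map[:, ys, xs].T  # (N, C)
```

At inference time `jitter_std` would be 0 so that features are sampled at the predicted landmark coordinates; the attention step described in the abstract would then weight these N vectors using the one sampled at the nasal-base landmark as the representative query.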

Source journal: Sensors (Engineering Technology: Electrochemistry)
CiteScore: 7.30
Self-citation rate: 12.80%
Articles per year: 8430
Review time: 1.7 months
Journal description: Sensors (ISSN 1424-8220) provides an advanced forum for the science and technology of sensors and biosensors. It publishes reviews (including comprehensive reviews of complete sensor products), regular research papers, and short notes. Our aim is to encourage scientists to publish their experimental and theoretical results in as much detail as possible. There is no restriction on the length of the papers. The full experimental details must be provided so that the results can be reproduced.