Multimodal depression recognition and analysis: Facial expression and body posture changes via emotional stimuli

Impact Factor: 4.9 · CAS Zone 2 (Medicine) · Q1 (Clinical Neurology)
Yang Liu, Xingyun Li, Mengqi Wang, Jianlu Bi, Shaoqin Lin, Qingxiang Wang, Yanhong Yu, Jiayu Ye, Yunshao Zheng
{"title":"Multimodal depression recognition and analysis: Facial expression and body posture changes via emotional stimuli","authors":"Yang Liu ,&nbsp;Xingyun Li ,&nbsp;Mengqi Wang ,&nbsp;Jianlu Bi ,&nbsp;Shaoqin Lin ,&nbsp;Qingxiang Wang ,&nbsp;Yanhong Yu ,&nbsp;Jiayu Ye ,&nbsp;Yunshao Zheng","doi":"10.1016/j.jad.2025.03.155","DOIUrl":null,"url":null,"abstract":"<div><h3>Background</h3><div>Clinical studies have shown that facial expressions and body posture in depressed patients differ significantly from those of healthy individuals. Combining relevant behavioral features with artificial intelligence technology can effectively improve the efficiency of depression detection, thereby assisting doctors in early identification of patients. This study aims to develop an end-to-end multimodal recognition model combining facial expressions and body posture via deep learning techniques, enabling rapid preliminary screening of depression.</div></div><div><h3>Methods</h3><div>We invited 146 subjects (73 in the patient group and 73 in the control group) to participate in an emotion-stimulus experiment for depression recognition. We focused on differentiating depression patients from the control group by analyzing changes in body posture and facial expressions under emotional stimuli. We first extracted images of body position and facial emotions from the video, then used a pre-trained ResNet-50 network to extract features. Additionally, we analyzed facial expression features using OpenFace for sequence analysis. Subsequently, various deep learning frameworks were combined to assess the severity of depression.</div></div><div><h3>Results</h3><div>We found that under different stimuli, facial expression units AU04, AU07, AU10, AU12, AU17, and AU26 had significant effects in the emotion-stimulus experiment, with these features generally being negative. The decision-level fusion model based on facial expressions and body posture achieved excellent results, with the highest accuracy of 0.904 and an F1 score of 0.901.</div></div><div><h3>Conclusions</h3><div>The experimental results suggest that depression patients exhibit predominantly negative facial expressions. This study validates the emotion-stimulus experiment, demonstrating that combining facial expressions and body posture enables accurate preliminary depression screening.</div></div>","PeriodicalId":14963,"journal":{"name":"Journal of affective disorders","volume":"381 ","pages":""},"PeriodicalIF":4.9000,"publicationDate":"2025-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of affective disorders","FirstCategoryId":"3","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0165032725005038","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"CLINICAL NEUROLOGY","Score":null,"Total":0}
引用次数: 0

Abstract

Background

Clinical studies have shown that the facial expressions and body posture of depressed patients differ significantly from those of healthy individuals. Combining these behavioral features with artificial intelligence can substantially improve the efficiency of depression detection and thereby help clinicians identify patients early. This study aims to develop an end-to-end multimodal recognition model that combines facial expressions and body posture via deep learning techniques, enabling rapid preliminary screening for depression.

Methods

We recruited 146 subjects (73 in the patient group and 73 in the control group) for an emotion-stimulus experiment on depression recognition. We focused on differentiating patients with depression from controls by analyzing changes in body posture and facial expressions under emotional stimuli. We first extracted body-posture and facial-expression frames from the recorded videos and then used a pre-trained ResNet-50 network to extract features; facial expression features were additionally extracted with OpenFace for sequence analysis. Finally, several deep learning frameworks were combined to assess the severity of depression.
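
The paper's code is not included here; the following is a minimal sketch of the frame-level feature-extraction step described above, assuming a standard ImageNet-pretrained ResNet-50 from torchvision with its classification head removed. The file paths and the `extract_features` helper are hypothetical, and the study's actual cropping and preprocessing may differ.

```python
# Hedged sketch: frame-level feature extraction with a pre-trained ResNet-50.
# Paths and helper names are illustrative; the study's preprocessing is not published.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Load ResNet-50 pre-trained on ImageNet and drop the final classifier,
# keeping the 2048-dimensional pooled features.
resnet = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
feature_extractor = torch.nn.Sequential(*list(resnet.children())[:-1]).eval()

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(frame_paths):
    """Return an (N, 2048) tensor of ResNet-50 features for a list of frame images."""
    batch = torch.stack([preprocess(Image.open(p).convert("RGB")) for p in frame_paths])
    feats = feature_extractor(batch)   # shape (N, 2048, 1, 1)
    return feats.flatten(1)            # shape (N, 2048)

# Hypothetical usage: one feature vector per extracted video frame.
# face_feats = extract_features(["frames/face_0001.jpg", "frames/face_0002.jpg"])
```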

Results

We found that, across the different stimuli, facial action units AU04, AU07, AU10, AU12, AU17, and AU26 showed significant effects in the emotion-stimulus experiment, and these features were generally associated with negative expressions. The decision-level fusion model based on facial expressions and body posture performed best, achieving an accuracy of 0.904 and an F1 score of 0.901.
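
The abstract does not specify the fusion architecture; the snippet below is a minimal sketch of decision-level fusion, assuming each unimodal model (facial expression, body posture) outputs a per-subject probability of depression that is combined by a weighted average before thresholding, with accuracy and F1 computed via scikit-learn. The weights, threshold, and example probabilities are assumptions for illustration only.

```python
# Hedged sketch of decision-level fusion: weighted average of per-modality
# probabilities, then thresholding. Weights/threshold here are assumptions.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def fuse_decisions(face_probs, posture_probs, w_face=0.5, threshold=0.5):
    """Fuse two modalities' probabilities into binary depression predictions."""
    fused = w_face * np.asarray(face_probs) + (1.0 - w_face) * np.asarray(posture_probs)
    return (fused >= threshold).astype(int)

# Hypothetical per-subject probabilities from the two unimodal models.
face_probs    = np.array([0.82, 0.31, 0.67, 0.15])
posture_probs = np.array([0.74, 0.40, 0.58, 0.22])
y_true        = np.array([1, 0, 1, 0])

y_pred = fuse_decisions(face_probs, posture_probs)
print("accuracy:", accuracy_score(y_true, y_pred))
print("F1:", f1_score(y_true, y_pred))
```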

Conclusions

The experimental results suggest that patients with depression exhibit predominantly negative facial expressions. This study validates the emotion-stimulus paradigm and demonstrates that combining facial expressions and body posture enables accurate preliminary depression screening.
Source journal: Journal of Affective Disorders (Medicine – Psychiatry)
CiteScore: 10.90
Self-citation rate: 6.10%
Articles per year: 1319
Average review time: 9.3 weeks
Journal description: The Journal of Affective Disorders publishes papers concerned with affective disorders in the widest sense: depression, mania, mood spectrum, emotions and personality, anxiety and stress. It is interdisciplinary and aims to bring together different approaches for a diverse readership. Top-quality papers will be accepted dealing with any aspect of affective disorders, including neuroimaging, cognitive neurosciences, genetics, molecular biology, experimental and clinical neurosciences, pharmacology, neuroimmunoendocrinology, intervention and treatment trials.