An anxiety screening framework integrating multimodal data and graph node correlation

IF 6.2 · CAS Region 2 (Medicine) · Q1 Computer Science, Artificial Intelligence
Haimiao Mo , Hongjia Wu , Qian Rong , Zhijian Hu , Meng Yi , Peipei Chen
DOI: 10.1016/j.artmed.2025.103189
Journal: Artificial Intelligence in Medicine, Volume 167, Article 103189
Published: 2025-06-16
URL: https://www.sciencedirect.com/science/article/pii/S0933365725001241
Citations: 0

Abstract

Anxiety disorders are a significant global health concern, profoundly impacting patients' lives and social functioning while imposing considerable burdens on families and economies. However, current anxiety screening methods face limitations due to cost constraints and cognitive biases, particularly in their inability to deeply model correlations among multidimensional features. They often overlook crucial information inherent in the internal couplings of those features, limiting their accuracy and applicability in clinical diagnostics. To address these challenges, we propose an advanced anxiety screening framework that integrates multimodal data (physiological, behavioral, audio, and textual) using a Graph Convolutional Network (GCN). While our framework draws upon existing technologies such as GCNs, one-dimensional convolutional neural networks, and gated recurrent units, its uniqueness lies in how these components are combined to capture complex spatiotemporal relationships and correlations among multimodal features. Experimental results demonstrate the framework's robust performance, achieving an accuracy of 93.48%, an Area Under the Curve (AUC) of 94.58%, precision of 90.00%, sensitivity of 81.82%, specificity of 97.14%, and an F1 score of 85.71%. Notably, the method remains effective even when questionnaire data are unavailable, underscoring its practicality and reliability. This anxiety screening approach provides a new perspective for early identification and intervention of anxiety symptoms, offering a scientific basis for personalized treatment and prevention through the analysis of multimodal data and graph structures.
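The reported metrics are internally consistent (F1 = 2PS/(P+S) with precision 90.00% and sensitivity 81.82% yields 85.71%), and the GCN propagation at the core of the framework can be sketched. The snippet below is a minimal, hypothetical illustration in NumPy only: the one-node-per-modality layout, the fully connected adjacency, and all dimensions are assumptions for illustration, not the paper's actual architecture.

```python
import numpy as np

def gcn_layer(adj: np.ndarray, x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """One symmetrically normalized GCN step: ReLU(D^-1/2 (A+I) D^-1/2 X W)."""
    a_hat = adj + np.eye(adj.shape[0])            # add self-loops
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(deg ** -0.5)
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt      # normalized adjacency
    return np.maximum(a_norm @ x @ w, 0.0)        # ReLU activation

rng = np.random.default_rng(0)
n_nodes, in_dim, out_dim = 4, 8, 3                # assumed: one node per modality
adj = np.ones((n_nodes, n_nodes)) - np.eye(n_nodes)   # assumed: modalities fully connected
x = rng.standard_normal((n_nodes, in_dim))        # per-modality feature embeddings
w = rng.standard_normal((in_dim, out_dim))        # learnable weight matrix

h = gcn_layer(adj, x, w)                          # fused node representations, shape (4, 3)

# Cross-check of the abstract's reported metrics: F1 = 2*P*S / (P + S)
precision, sensitivity = 0.9000, 0.8182
f1 = 2 * precision * sensitivity / (precision + sensitivity)   # ~0.8571
```

Message passing over the normalized adjacency mixes each modality's embedding with its neighbors', which is one plausible way a GCN could model the cross-modal couplings the abstract describes.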
Source journal
Artificial Intelligence in Medicine (Engineering & Technology: Biomedical Engineering)
CiteScore: 15.00
Self-citation rate: 2.70%
Annual publications: 143
Review time: 6.3 months
Journal description: Artificial Intelligence in Medicine publishes original articles from a wide variety of interdisciplinary perspectives concerning the theory and practice of artificial intelligence (AI) in medicine, medically-oriented human biology, and health care. Artificial intelligence in medicine may be characterized as the scientific discipline pertaining to research studies, projects, and applications that aim at supporting decision-based medical tasks through knowledge- and/or data-intensive computer-based solutions that ultimately support and improve the performance of a human care provider.