Analyzing mental disorders with a CNN-GRU deep learning model on motor activity

IF 3.9 · CAS Tier 3 (Engineering & Technology) · JCR Q2 (Neurosciences)
Cognitive Neurodynamics · Pub Date: 2025-12-01 · Epub Date: 2025-09-15 · DOI: 10.1007/s11571-025-10335-w
Umang Gupta, Partha Sarathi Bishnu, Abhishek Kumar, Anuj Kumar Pandey, Biresh Kumar, Preeti Kumari
{"title":"基于CNN-GRU运动活动深度学习模型分析精神障碍。","authors":"Umang Gupta, Partha Sarathi Bishnu, Abhishek Kumar, Anuj Kumar Pandey, Biresh Kumar, Preeti Kumari","doi":"10.1007/s11571-025-10335-w","DOIUrl":null,"url":null,"abstract":"<p><p>Mood disorders can significantly interfere with daily life, ranging from mild to severe, impacting relationships, work, and overall well-being. Globally, the scarcity of mental health resources and the stigma attached to mental illness are significant obstacles. Existing approaches for mood disorder detection often rely on static clinical data or other modalities (e.g., imaging or questionnaires), and the potential of continuous motor activity data remains underexplored. Continuous wearable motor activity recordings represent an objective, non-invasive method that tracks an individual's behavioral patterns relevant to their mood states, while enabling ongoing monitoring in contrast to the episodic clinical assessments. Our primary goal in this paper is to employ a Deep Learning Model utilizing CNN-GRU architecture for analyzing motor activity sequences. Through rigorous experimentation on Depresjon datasets recorded via wrist worn actigraphy, our approach achieves an accuracy of 98.1%, surpassing the accuracy levels achieved by state-of-the-art techniques.</p>","PeriodicalId":10500,"journal":{"name":"Cognitive Neurodynamics","volume":"19 1","pages":"147"},"PeriodicalIF":3.9000,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12436267/pdf/","citationCount":"0","resultStr":"{\"title\":\"Analyzing mental disorders with a CNN-GRU deep learning model on motor activity.\",\"authors\":\"Umang Gupta, Partha Sarathi Bishnu, Abhishek Kumar, Anuj Kumar Pandey, Biresh Kumar, Preeti Kumari\",\"doi\":\"10.1007/s11571-025-10335-w\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Mood disorders can significantly interfere with daily life, ranging from mild to severe, impacting relationships, work, and overall well-being. Globally, the scarcity of mental health resources and the stigma attached to mental illness are significant obstacles. Existing approaches for mood disorder detection often rely on static clinical data or other modalities (e.g., imaging or questionnaires), and the potential of continuous motor activity data remains underexplored. Continuous wearable motor activity recordings represent an objective, non-invasive method that tracks an individual's behavioral patterns relevant to their mood states, while enabling ongoing monitoring in contrast to the episodic clinical assessments. Our primary goal in this paper is to employ a Deep Learning Model utilizing CNN-GRU architecture for analyzing motor activity sequences. 
Through rigorous experimentation on Depresjon datasets recorded via wrist worn actigraphy, our approach achieves an accuracy of 98.1%, surpassing the accuracy levels achieved by state-of-the-art techniques.</p>\",\"PeriodicalId\":10500,\"journal\":{\"name\":\"Cognitive Neurodynamics\",\"volume\":\"19 1\",\"pages\":\"147\"},\"PeriodicalIF\":3.9000,\"publicationDate\":\"2025-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12436267/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Cognitive Neurodynamics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1007/s11571-025-10335-w\",\"RegionNum\":3,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2025/9/15 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q2\",\"JCRName\":\"NEUROSCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cognitive Neurodynamics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1007/s11571-025-10335-w","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/9/15 0:00:00","PubModel":"Epub","JCR":"Q2","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
Citations: 0

Abstract


Mood disorders can significantly interfere with daily life, ranging from mild to severe and impacting relationships, work, and overall well-being. Globally, the scarcity of mental health resources and the stigma attached to mental illness are significant obstacles. Existing approaches to mood disorder detection often rely on static clinical data or other modalities (e.g., imaging or questionnaires), and the potential of continuous motor activity data remains underexplored. Continuous wearable motor activity recordings offer an objective, non-invasive way to track behavioral patterns relevant to an individual's mood states, while enabling ongoing monitoring in contrast to episodic clinical assessments. Our primary goal in this paper is to employ a deep learning model built on a CNN-GRU architecture to analyze motor activity sequences. Through rigorous experimentation on the Depresjon dataset, recorded via wrist-worn actigraphy, our approach achieves an accuracy of 98.1%, surpassing the accuracy levels achieved by state-of-the-art techniques.
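The abstract names only the high-level architecture, so the following is a minimal sketch of what a CNN-GRU classifier over actigraphy sequences might look like, written in PyTorch. All layer sizes, kernel widths, and the 1440-sample window (roughly one day of minute-level activity counts) are illustrative assumptions, not the authors' reported configuration.

```python
# Hypothetical CNN-GRU sketch; hyperparameters are assumptions, not the paper's.
import torch
import torch.nn as nn

class CNNGRUClassifier(nn.Module):
    def __init__(self, n_channels: int = 1, n_classes: int = 2):
        super().__init__()
        # 1-D convolutions extract local motifs from the raw activity counts.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # The GRU summarizes longer-range temporal structure over CNN features.
        self.gru = nn.GRU(input_size=64, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time), e.g. one channel of minute-level counts.
        feats = self.cnn(x)             # (batch, 64, time // 4)
        feats = feats.transpose(1, 2)   # (batch, time // 4, 64) for the GRU
        _, h = self.gru(feats)          # h: (1, batch, 64), final hidden state
        return self.head(h.squeeze(0))  # (batch, n_classes) logits

# Example: score a batch of 8 one-day windows of 1440 minute-level counts.
model = CNNGRUClassifier()
logits = model(torch.randn(8, 1, 1440))
print(logits.shape)  # torch.Size([8, 2])
```

The design mirrors the usual division of labor in such hybrids: the convolutional front end compresses the long sequence and captures short-term activity patterns, while the recurrent layer models how those patterns evolve over time before a linear head produces the class logits.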

Source journal
Cognitive Neurodynamics (Medicine – Neurosciences)
CiteScore: 6.90
Self-citation rate: 18.90%
Annual articles: 140
Review time: 12 months
Journal description: Cognitive Neurodynamics provides a unique forum of communication and cooperation for scientists and engineers working in the field of cognitive neurodynamics, intelligent science, and applications, bridging the gap between theory and application, without any preference for purely theoretical, experimental, or computational models. The emphasis is on publishing original models of cognitive neurodynamics, novel computational theories, and experimental results. In particular, intelligent science inspired by cognitive neuroscience and neurodynamics is also very welcome. The scope of Cognitive Neurodynamics covers cognitive neuroscience, neural computation based on dynamics, computer science, and intelligent science, as well as their interdisciplinary applications in the natural and engineering sciences. Papers that are appropriate for non-specialist readers are encouraged.
1. There is no page limit for manuscripts submitted to Cognitive Neurodynamics. Research papers should clearly represent an important advance of especially broad interest to researchers and technologists in neuroscience, biophysics, BCI, neural computing, and intelligent robotics.
2. Cognitive Neurodynamics also welcomes brief communications: short papers reporting results that are of genuinely broad interest but that for one reason or another do not make a sufficiently complete story to justify a full article. Brief communications should consist of approximately four manuscript pages.
3. Cognitive Neurodynamics publishes review articles in which a specific field is reviewed through an exhaustive literature survey. There are no restrictions on the number of pages. Review articles are usually invited, but submitted reviews will also be considered.