Title: Using Multimodal Data Collection System as a Research Tool in the Major Depressive Disorder Analysis: a cross-sectional study protocol
Authors: Hongbo Li, Yifu Ji, Lingxiang Xu, Jiaoyun Yang, Yang Du, Min Hu, Ning An
Journal: medRxiv - Psychiatry and Clinical Psychology
Published: 2024-07-22
DOI: https://doi.org/10.1101/2024.07.21.24310061
Citations: 0
Abstract
Introduction: Previous studies have established that depressive syndromes can be detected with machine learning methods, for which multimodal data are essential. Multimodal data facilitate the extraction of characteristics such as gaze patterns, a reliable indicator of depression. Our study collects high-quality video and other multimodal data from patients diagnosed with depression, using a multimodal data collection (MDC) system to investigate the complex indicators of depression.
Objective: This paper outlines our protocol for deploying a multimodal data collection system in an in-person clinical assessment setting. The system gathers high-definition video, real-time vital signs, and voice recordings for later extraction of critical features such as eye-gaze patterns. We aim to scale our model into portable depression risk analyses, facilitating timely intervention and encouraging patients to seek professional assistance.
Methods and Analysis: We have conducted sessions with 70 participants diagnosed with depression. Each participant undergoes a DSM-5 interview and engages with our multimodal data collection system, responding to five on-screen scales while being recorded. To our knowledge, no other protocol has combined multimodal data collection with varied stimuli in depression data collection.
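As a minimal illustration of the session structure described above (a recorded session with synchronized video, vital-sign, and audio streams plus five on-screen scales), one could represent a participant record as follows. All field and class names here are hypothetical and are not taken from the protocol itself.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ScaleResponse:
    """One of the five on-screen scales administered during recording."""
    scale_name: str
    start_time_s: float  # offset into the session recording
    end_time_s: float

@dataclass
class Session:
    """Hypothetical sketch of one participant session: high-definition
    video, real-time vital signs, and a voice recording, collected while
    the participant responds to five on-screen scales."""
    participant_id: str
    video_file: str
    audio_file: str
    vitals_file: str
    scales: List[ScaleResponse] = field(default_factory=list)

    def is_complete(self) -> bool:
        # The protocol specifies five scales per session; also check
        # that each response has a well-formed time span.
        return (len(self.scales) == 5 and
                all(r.start_time_s < r.end_time_s for r in self.scales))
```

Such a record makes it straightforward to align scale-response time spans with the corresponding video and vital-sign segments during later feature extraction.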