{"title":"利用野外以自我为中心的视频进行基于视频的个性化手分类学研究","authors":"Mehdy Dousty, David J Fleet, Jose Zariffa","doi":"10.1109/JBHI.2024.3495699","DOIUrl":null,"url":null,"abstract":"<p><strong>Objective: </strong>Hand function is central to inter- actions with our environment. Developing a comprehen- sive model of hand grasps in naturalistic environments is crucial across various disciplines, including robotics, ergonomics, and rehabilitation. Creating such a taxonomy poses challenges due to the significant variation in grasp- ing strategies that individuals may employ. For instance, individuals with impaired hands, such as those with spinal cord injuries (SCI), may develop unique grasps not used by unimpaired individuals. These grasping techniques may differ from person to person, influenced by variable senso- rimotor impairment, creating a need for personalized meth- ods of analysis.</p><p><strong>Method: </strong>This study aimed to automatically identify the dominant distinct hand grasps for each indi- vidual without reliance on a priori taxonomies, by applying semantic clustering to egocentric video. Egocentric video recordings collected in the homes of 19 individual with cervical SCI were used to cluster grasping actions with semantic significance. A deep learning model integrating posture and appearance data was employed to create a per- sonalized hand taxonomy.</p><p><strong>Results: </strong>Quantitative analysis reveals a cluster purity of 67.6% ± 24.2% with 18.0% ± 21.8% redundancy. Qualitative assessment revealed meaningful clusters in video content.</p><p><strong>Discussion: </strong>This methodology provides a flexible and effective strategy to analyze hand function in the wild, with applications in clinical assess- ment and in-depth characterization of human-environment interactions in a variety of contexts.</p>","PeriodicalId":13073,"journal":{"name":"IEEE Journal of Biomedical and Health Informatics","volume":"PP ","pages":""},"PeriodicalIF":6.7000,"publicationDate":"2024-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Personalized Video-Based Hand Taxonomy Using Egocentric Video in the Wild.\",\"authors\":\"Mehdy Dousty, David J Fleet, Jose Zariffa\",\"doi\":\"10.1109/JBHI.2024.3495699\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Objective: </strong>Hand function is central to inter- actions with our environment. Developing a comprehen- sive model of hand grasps in naturalistic environments is crucial across various disciplines, including robotics, ergonomics, and rehabilitation. Creating such a taxonomy poses challenges due to the significant variation in grasp- ing strategies that individuals may employ. For instance, individuals with impaired hands, such as those with spinal cord injuries (SCI), may develop unique grasps not used by unimpaired individuals. These grasping techniques may differ from person to person, influenced by variable senso- rimotor impairment, creating a need for personalized meth- ods of analysis.</p><p><strong>Method: </strong>This study aimed to automatically identify the dominant distinct hand grasps for each indi- vidual without reliance on a priori taxonomies, by applying semantic clustering to egocentric video. Egocentric video recordings collected in the homes of 19 individual with cervical SCI were used to cluster grasping actions with semantic significance. 
A deep learning model integrating posture and appearance data was employed to create a per- sonalized hand taxonomy.</p><p><strong>Results: </strong>Quantitative analysis reveals a cluster purity of 67.6% ± 24.2% with 18.0% ± 21.8% redundancy. Qualitative assessment revealed meaningful clusters in video content.</p><p><strong>Discussion: </strong>This methodology provides a flexible and effective strategy to analyze hand function in the wild, with applications in clinical assess- ment and in-depth characterization of human-environment interactions in a variety of contexts.</p>\",\"PeriodicalId\":13073,\"journal\":{\"name\":\"IEEE Journal of Biomedical and Health Informatics\",\"volume\":\"PP \",\"pages\":\"\"},\"PeriodicalIF\":6.7000,\"publicationDate\":\"2024-11-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Journal of Biomedical and Health Informatics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1109/JBHI.2024.3495699\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Biomedical and Health Informatics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1109/JBHI.2024.3495699","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Personalized Video-Based Hand Taxonomy Using Egocentric Video in the Wild.
Objective: Hand function is central to interactions with our environment. Developing a comprehensive model of hand grasps in naturalistic environments is crucial across various disciplines, including robotics, ergonomics, and rehabilitation. Creating such a taxonomy poses challenges due to the significant variation in grasping strategies that individuals may employ. For instance, individuals with impaired hands, such as those with spinal cord injuries (SCI), may develop unique grasps not used by unimpaired individuals. These grasping techniques may differ from person to person, influenced by variable sensorimotor impairment, creating a need for personalized methods of analysis.
Method: This study aimed to automatically identify the dominant distinct hand grasps for each individual without reliance on a priori taxonomies, by applying semantic clustering to egocentric video. Egocentric video recordings collected in the homes of 19 individuals with cervical SCI were used to cluster grasping actions with semantic significance. A deep learning model integrating posture and appearance data was employed to create a personalized hand taxonomy.
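The abstract does not give implementation details, but the overall idea (fusing posture and appearance features extracted from egocentric frames and clustering them per participant to surface that person's dominant grasps) can be illustrated with a minimal sketch. Everything below is hypothetical: the feature dimensions, the standardize-and-concatenate fusion, and the use of k-means stand in for the paper's unspecified deep model and clustering method, and real feature extraction is mocked with random data.

```python
# Hypothetical sketch of a per-participant grasp-clustering pipeline.
# Posture and appearance features are fused and clustered; the most
# populated clusters are read as that person's dominant grasp types.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-ins for real features: 500 detected grasp events from one participant.
n_events = 500
posture_feats = rng.normal(size=(n_events, 42))      # e.g., 21 hand keypoints x (x, y)
appearance_feats = rng.normal(size=(n_events, 128))  # e.g., a CNN embedding per event

# Fuse the two modalities by concatenation after per-feature standardization.
features = np.hstack([
    StandardScaler().fit_transform(posture_feats),
    StandardScaler().fit_transform(appearance_feats),
])

# Cluster to obtain a personalized set of candidate grasp types.
n_clusters = 8  # illustrative; in practice chosen per participant
labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(features)

# Dominant grasps correspond to the most populated clusters.
print("events per cluster:", np.bincount(labels, minlength=n_clusters))
```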
Results: Quantitative analysis revealed a cluster purity of 67.6% ± 24.2%, with 18.0% ± 21.8% redundancy. Qualitative assessment revealed meaningful clusters in the video content.
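For context on the reported metrics: cluster purity is a standard external clustering measure in which each cluster is assigned the grasp label it most frequently contains, and purity is the fraction of samples matching their cluster's majority label. The sketch below computes purity that way and adds one plausible reading of "redundancy" (clusters whose majority label is already covered by another cluster); the paper's exact definition of redundancy is not stated in this abstract, so treat that part as an assumption.

```python
# Purity and a hypothetical redundancy measure for a clustering
# evaluated against annotated grasp labels.

import numpy as np

def purity_and_redundancy(cluster_ids: np.ndarray, true_labels: np.ndarray):
    clusters = np.unique(cluster_ids)
    majority_labels, correct = [], 0
    for c in clusters:
        members = true_labels[cluster_ids == c]
        values, counts = np.unique(members, return_counts=True)
        majority_labels.append(values[np.argmax(counts)])  # cluster's majority label
        correct += counts.max()                            # samples matching it
    purity = correct / len(true_labels)
    # Assumed definition: clusters beyond the first one per grasp label are redundant.
    redundancy = (len(clusters) - len(set(majority_labels))) / len(clusters)
    return purity, redundancy

# Toy example: 3 clusters over 2 annotated grasp types.
cluster_ids = np.array([0, 0, 0, 1, 1, 1, 2, 2])
true_labels = np.array([0, 0, 1, 1, 1, 1, 0, 0])
print(purity_and_redundancy(cluster_ids, true_labels))
# purity = 0.875; redundancy ~ 0.33 (clusters 0 and 2 share majority label 0)
```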
Discussion: This methodology provides a flexible and effective strategy to analyze hand function in the wild, with applications in clinical assessment and in-depth characterization of human-environment interactions in a variety of contexts.
Journal introduction:
IEEE Journal of Biomedical and Health Informatics publishes original papers presenting recent advances where information and communication technologies intersect with health, healthcare, life sciences, and biomedicine. Topics include acquisition, transmission, storage, retrieval, management, and analysis of biomedical and health information. The journal covers applications of information technologies in healthcare, patient monitoring, preventive care, early disease diagnosis, therapy discovery, and personalized treatment protocols. It explores electronic medical and health records, clinical information systems, decision support systems, medical and biological imaging informatics, wearable systems, body area/sensor networks, and more. Integration-related topics like interoperability, evidence-based medicine, and secure patient data are also addressed.