Intelligent 3D garment system of the human body based on deep spiking neural network
Minghua Jiang, Zhangyuan Tian, Chenyu Yu, Yankang Shi, Li Liu, Tao Peng, Xinrong Hu, Feng Yu
Virtual Reality Intelligent Hardware, Vol. 6, No. 1, pp. 43-55, February 2024. DOI: 10.1016/j.vrih.2023.07.002
Abstract
Background
Intelligent garments, a burgeoning class of wearable devices, have extensive applications in domains such as sports training and medical rehabilitation. Nonetheless, existing research in the smart wearables domain predominantly emphasizes sensor functionality and quantity, often overlooking crucial aspects of user experience and interaction.
Methods
To address this gap, this study introduces a novel real-time 3D interactive system based on intelligent garments. The system uses lightweight sensor modules to collect human motion data and introduces a dual-stream fusion network based on spiking neural units to classify and recognize human movements, thereby achieving real-time interaction between users and sensors. Additionally, the system incorporates 3D human visualization functionality, which renders sensor data and recognized human actions as 3D models in real time, providing accurate and comprehensive visual feedback that helps users better understand and analyze the details and features of human motion. This system has significant potential for applications in motion detection, medical monitoring, virtual reality, and other fields. The accurate classification of human actions contributes to the development of personalized training plans and injury prevention strategies.
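To make the "dual-stream fusion network based on spiking neural units" more concrete, the sketch below shows one plausible shape such a model could take in PyTorch: two leaky integrate-and-fire (LIF) encoders, one per sensor stream, whose spike trains are concatenated and classified by a spiking readout. This is not the authors' implementation; the stream names (accelerometer/gyroscope), hidden size, class count, surrogate gradient, and rate-coded readout are assumptions made purely for illustration.

```python
# Minimal sketch (not the paper's code): a dual-stream spiking fusion classifier
# built from leaky integrate-and-fire (LIF) units in plain PyTorch.
# Assumed, hypothetical setup: two sensor streams (accelerometer + gyroscope)
# windowed as (batch, time, channels), 8 action classes, rate-coded readout.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate gradient in backward."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out / (1.0 + 10.0 * v.abs()) ** 2


class LIFLayer(nn.Module):
    """Linear projection followed by LIF membrane dynamics unrolled over time."""

    def __init__(self, in_features, out_features, beta=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.beta = beta            # membrane leak factor
        self.threshold = threshold  # firing threshold

    def forward(self, x):           # x: (batch, time, in_features)
        batch, steps, _ = x.shape
        mem = torch.zeros(batch, self.fc.out_features, device=x.device)
        spikes = []
        for t in range(steps):
            mem = self.beta * mem + self.fc(x[:, t])          # leaky integration
            spk = SurrogateSpike.apply(mem - self.threshold)  # fire when above threshold
            mem = mem - spk * self.threshold                  # soft reset after a spike
            spikes.append(spk)
        return torch.stack(spikes, dim=1)  # (batch, time, out_features)


class DualStreamSpikingClassifier(nn.Module):
    """Two spiking encoders (one per sensor stream) fused for action recognition."""

    def __init__(self, acc_ch=3, gyro_ch=3, hidden=64, num_classes=8):
        super().__init__()
        self.acc_stream = LIFLayer(acc_ch, hidden)
        self.gyro_stream = LIFLayer(gyro_ch, hidden)
        self.head = LIFLayer(2 * hidden, num_classes)

    def forward(self, acc, gyro):   # each input: (batch, time, channels)
        fused = torch.cat([self.acc_stream(acc), self.gyro_stream(gyro)], dim=-1)
        return self.head(fused).sum(dim=1)  # spike counts per class (rate code)


if __name__ == "__main__":
    model = DualStreamSpikingClassifier()
    acc = torch.randn(2, 50, 3)    # 2 windows, 50 time steps, 3-axis accelerometer
    gyro = torch.randn(2, 50, 3)   # matching gyroscope window
    print(model(acc, gyro).shape)  # torch.Size([2, 8])
```

The spike-count readout can be fed to a standard cross-entropy loss, so the whole network trains with ordinary backpropagation thanks to the surrogate gradient; this is one common way to train deep spiking networks, though the paper may use a different scheme.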
Conclusions
This study has substantial implications in the domains of intelligent garments, human motion monitoring, and digital twin visualization. The advancement of this system is expected to propel the progress of wearable technology and foster a deeper comprehension of human motion.