AI-enabled full-body dynamic avatar reconstruction using triboelectric smart clothing for metaverse applications

Chi Zhang, Lei Zhang, Yu Tian, Zhengang An, Bo Li, Dachao Li

eScience, vol. 5, no. 4, Article 100373, 2025-07-01. DOI: 10.1016/j.esci.2025.100373
URL: https://www.sciencedirect.com/science/article/pii/S2667141725000035

Citations: 0
Abstract
Full-body avatar reconstruction offers users immersive and interactive experiences in virtual space, which are crucial for the advancement of metaverse applications. However, traditional hardware solutions, reliant on optical cameras or inertial sensors, are hampered by privacy concerns, spatial limitations, high costs, and calibration challenges. Here, we propose AI-enabled smart clothing that seamlessly integrates triboelectric strain-sensing fibers (TSSFs) and AI algorithms with commercial fitness suits to achieve precise dynamic 3D reconstruction of body movement. TSSFs enable the dynamic capture of body postures and excel in sensitivity, linearity, and strain range, while maintaining mechanical stability, temperature resilience, and washability. The integrated algorithms accurately decouple posture signals: distinguishing between similar postures with a 1D-CNN, compensating for body-shape differences via a calibration algorithm, and determining spatial elements for avatar reconstruction using a decision-tree algorithm. Finally, leveraging Unity-3D, we achieve highly accurate dynamic 3D avatars with a joint angle error of <3.63° and demonstrate their effectiveness in VR fitness and entertainment applications, showing how they can offer users standardized yet engaging experiences.
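The body-shape calibration step described in the abstract can be illustrated with a minimal sketch. Assuming each TSSF channel responds approximately linearly to joint angle over its strain range (consistent with the linearity the authors report), a simple two-point calibration per user maps raw sensor readings to joint angles. All function names, sensor readings, and angles below are hypothetical, not taken from the paper.

```python
def make_angle_map(raw_low, raw_high, angle_low, angle_high):
    """Two-point linear calibration: map a raw TSSF reading to a joint
    angle in degrees, assuming a linear sensor response between the two
    calibration postures (e.g., arm fully extended vs. bent to 90 deg)."""
    scale = (angle_high - angle_low) / (raw_high - raw_low)
    return lambda raw: angle_low + (raw - raw_low) * scale

# Hypothetical per-user calibration: the user holds two reference
# postures while the normalized sensor output is recorded.
elbow_angle = make_angle_map(raw_low=0.12, raw_high=0.87,
                             angle_low=0.0, angle_high=90.0)

# A mid-range reading then maps to an intermediate elbow angle.
print(round(elbow_angle(0.50), 1))
```

Re-running this calibration for each wearer is one way such a system could compensate for body-shape differences: the same garment stretch produces different raw signals on different users, but the two reference postures anchor the mapping per user.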