The Second Skin: A Wearable Sensor Suite that Enables Real-Time Human Biomechanics Tracking Through Deep Learning
Ryan T F Casey, Christoph P O Nuesslein, Felicia Davenport, Jason Wheeler, Anirban Mazumdar, Gregory Sawicki, Aaron J Young
IEEE Transactions on Biomedical Engineering, published 2025-07-16. DOI: 10.1109/TBME.2025.3589996
Citations: 0
Abstract
Objective: Real-time determination of human kinematics and kinetics could advance biomechanics research and enable valuable applications in biofeedback and generalizable exoskeleton control. This work aims to investigate a task-independent, user-independent method for obtaining precise real-time joint state estimation across lower-body joints during a wide variety of tasks.
Methods: We developed a generalizable sensing approach using a suit composed of inertial measurement units (IMUs) and pressure insoles. With the suit, we collected a dataset of 33 tasks commonly performed during construction and hazardous waste cleanup (N = 10). We then trained user-independent, task-agnostic deep learning models to estimate lower-body joint kinematics and dynamics using only worn sensor data. We likewise computed joint kinematics and dynamics analytically from sensor data to serve as a point of comparison for the model results.
Results: Our models achieved overall angle estimation root-mean-squared errors (RMSE) of 6.56±0.92°, 8.60±1.01°, 7.58±0.89°, and 6.00±0.73°, compared to 13.9±1.3°, 15.31±1.0°, 10.76±0.70°, and 7.56±0.48° via analytical methods at the lower back, hip, knee, and ankle, respectively. Likewise, our models achieved overall normalized moment estimation RMSEs of 0.207±0.069 Nm/kg, 0.242±0.044 Nm/kg, 0.202±0.038 Nm/kg, and 0.193±0.034 Nm/kg, compared to 0.306±0.036 Nm/kg, 0.407±0.021 Nm/kg, 1.18±0.022 Nm/kg, and 1.73±0.071 Nm/kg via analytical methods at the lower back, hip, knee, and ankle, respectively.
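The two error metrics reported above are standard: RMSE over a joint-angle trace, and RMSE over a joint moment that has first been normalized by body mass (hence the Nm/kg units). A minimal sketch of both computations, using hypothetical example signals (the array values and the 70 kg body mass below are illustrative, not from the paper's dataset):

```python
import numpy as np

def rmse(estimate, truth):
    """Root-mean-squared error between an estimated and a reference signal."""
    estimate = np.asarray(estimate, dtype=float)
    truth = np.asarray(truth, dtype=float)
    return float(np.sqrt(np.mean((estimate - truth) ** 2)))

# Hypothetical joint-angle traces (degrees) for one trial
est_angle = np.array([10.0, 12.5, 15.0, 13.0])
ref_angle = np.array([11.0, 12.0, 16.0, 12.0])
angle_rmse = rmse(est_angle, ref_angle)  # reported in degrees

# Joint moments (Nm) are divided by body mass before computing RMSE,
# which yields the mass-normalized Nm/kg figures in the Results.
body_mass_kg = 70.0
est_moment = np.array([70.0, 84.0, 63.0])
ref_moment = np.array([77.0, 80.5, 66.5])
moment_rmse = rmse(est_moment / body_mass_kg, ref_moment / body_mass_kg)
```

Normalizing moments by body mass before computing the error is what makes the metric comparable across the study's N = 10 subjects of different sizes.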
Conclusion: These results are comparable to other state-of-the-art wearable sensing systems, establishing deep learning as a viable sensing approach that generalizes to new users and tasks.
Significance: This work shows promise for enabling accurate real-world biomechanical data collection and enhancement of biofeedback systems and wearable robot control.
About the Journal
IEEE Transactions on Biomedical Engineering contains basic and applied papers dealing with biomedical engineering. Papers range from engineering development in methods and techniques with biomedical applications to experimental and clinical investigations with engineering contributions.