Gait analysis on daily data using IMUs in smart phones, watch and earbuds.

Impact Factor: 2.4
Zhuoli Wang, Yuta Sugiura
DOI: 10.1016/j.gaitpost.2025.09.003
Journal: Gait & Posture, p. 109969
Published: 2025-09-20 (Journal Article)
Citations: 0

Abstract

Background: The use of inertial measurement unit (IMU) sensors for gait analysis has become increasingly prevalent, and the development of wearable smart devices and smartphones offers a growing range of options. These techniques provide a cost-effective way to collect motion data from everyday activities, addressing the limitations of controlled laboratory environments. Despite their potential, analyzing gait data from everyday life still presents many challenges.

Method: Experiments involved 16 participants (7 women, 9 men; mean age: 27.69 years) who performed walking, jogging, and stair ascent and descent under three smartphone-carrying conditions: pocket, backpack, and shoulder bag. Data were collected using an iPhone 14, an Apple Watch Series 10, and AirPods Pro, supplemented with Xsens motion capture as ground truth. IMU data from the accelerometers and gyroscopes were preprocessed and standardized before applying Principal Component Analysis (PCA). A novel sliding-window algorithm was developed for gait segmentation and grouping, featuring a Continuity-Matching Score (CMS) that evaluates both continuity and match quality.
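The standardize-then-PCA preprocessing described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic 6-channel array (3-axis accelerometer + 3-axis gyroscope), the window length, and the 95% variance threshold are all assumptions for demonstration.

```python
import numpy as np

# Hypothetical IMU window: 200 samples x 6 channels
# (3-axis accelerometer + 3-axis gyroscope). Synthetic data,
# not the paper's dataset.
rng = np.random.default_rng(0)
imu = rng.normal(size=(200, 6))

# Standardize each channel to zero mean and unit variance,
# as the abstract describes doing before PCA.
standardized = (imu - imu.mean(axis=0)) / imu.std(axis=0)

# PCA via SVD of the standardized data.
u, s, vt = np.linalg.svd(standardized, full_matrices=False)
explained_variance = s**2 / (imu.shape[0] - 1)
scores = standardized @ vt.T  # data projected onto the principal axes

# Keep the leading components covering (an assumed) 95% of the variance.
ratio = explained_variance / explained_variance.sum()
k = int(np.searchsorted(np.cumsum(ratio), 0.95)) + 1
reduced = scores[:, :k]
```

The reduced windows would then feed the sliding-window segmentation stage; the CMS itself is not defined in the abstract, so it is not reproduced here.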

Result: The proposed algorithm achieved an overall segmentation accuracy of 89.25%, with the highest performance (90.38%) observed when the smartphone was carried in a pocket. Rand Index values confirmed reliable gait grouping, with minor accuracy reductions under more dynamic carrying conditions such as backpacks. For the walk-only dataset, segmentation accuracy improved to 95.67%, while for the run-only dataset it reached 96.21%.
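The Rand Index used to validate the grouping is a standard pairwise-agreement measure. A minimal sketch, with illustrative gait-group labels rather than the paper's data:

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Rand Index between two labelings: the fraction of sample pairs
    on which they agree (same group in both, or different in both)."""
    pairs = list(combinations(range(len(labels_a)), 2))
    agree = sum(
        (labels_a[i] == labels_a[j]) == (labels_b[i] == labels_b[j])
        for i, j in pairs
    )
    return agree / len(pairs)

# Hypothetical example: 0 = walk segment, 1 = jog segment.
truth   = [0, 0, 0, 1, 1, 1]
grouped = [0, 0, 1, 1, 1, 1]
score = rand_index(truth, grouped)  # 10 of 15 pairs agree
```

A score of 1.0 indicates perfect agreement with the ground-truth grouping; values near 1 were interpreted in the study as reliable grouping.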

Conclusion: This study introduced a system for daily-life gait analysis using consumer-grade IMU-equipped devices. The algorithm is capable of handling data containing multiple gait types, achieving reliable segmentation and grouping of synchronous gaits. Future work will focus on enhancing algorithm adaptability to dynamic environments and expanding its applicability to larger and more diverse datasets.
