Get Up!: Assessing Postural Activity & Transitions using Bi-Directional Gated Recurrent Units (Bi-GRUs) on Smartphone Motion Data

kar ku, Luke Buquicchio, Walter Gerych, E. Agu, Elke A. Rundensteiner
{"title":"Get Up!: Assessing Postural Activity & Transitions using Bi-Directional Gated Recurrent Units (Bi-GRUs) on Smartphone Motion Data","authors":"kar 2402565399 ku, Luke Buquicchio, Walter Gerych, E. Agu, Elke A. Rundensteiner","doi":"10.1109/HI-POCT45284.2019.8962729","DOIUrl":null,"url":null,"abstract":"Many health conditions can affect a person’s mobility. Consequently, a person’s ability to perform transitions between activity states (e.g. sit-to-stand) are accurate measures of their mobility and general health. Mobility impairments can manifest either as discomfort while performing certain activity transitions or a complete inability to perform such transitions. The Timed up and Go (TUG) is an important clinical test that assesses patients’ sit-to-stand abilities. Research into passive methods to assess the quality of patients activity transitions and thus conduct the Timed Up and Go autonomously as they live their lives, have recently become popular. Machine and deep learning analysis of smartphone accelerometer and gyroscope data have demonstrated promising activity and transition recognition results. In this paper, we present Get Up!, a novel deep learning-based method to detect whether a person is performing a certain postural activity or transitioning between activities. Get Up! analyzes data from the accelerometer and gyroscope of the patient’s smartphone using Bi-Directional Gated Recurrent Units (Bi-GRU) neural networks with an attention mechanism. Our method outperforms TAHAR, the current state of the art machine learning method, achieving an error rate of 1.47% for activity classification and an accuracy of 97%. We also achieved an error rate of 0.17% with an accuracy of 93.3% when classifying postural transitions. As Get Up! segments activities and transitions, individual TUG sub-components can be timed to identify sub-components that patients find challenging.","PeriodicalId":269346,"journal":{"name":"2019 IEEE Healthcare Innovations and Point of Care Technologies, (HI-POCT)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE Healthcare Innovations and Point of Care Technologies, (HI-POCT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HI-POCT45284.2019.8962729","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Many health conditions can affect a person's mobility. Consequently, a person's ability to perform transitions between activity states (e.g. sit-to-stand) is an accurate measure of their mobility and general health. Mobility impairments can manifest either as discomfort while performing certain activity transitions or as a complete inability to perform such transitions. The Timed Up and Go (TUG) is an important clinical test that assesses patients' sit-to-stand abilities. Research into passive methods that assess the quality of patients' activity transitions, and thus conduct the Timed Up and Go autonomously as patients live their lives, has recently become popular. Machine and deep learning analyses of smartphone accelerometer and gyroscope data have demonstrated promising activity and transition recognition results. In this paper, we present Get Up!, a novel deep learning-based method to detect whether a person is performing a certain postural activity or transitioning between activities. Get Up! analyzes data from the accelerometer and gyroscope of the patient's smartphone using Bi-Directional Gated Recurrent Unit (Bi-GRU) neural networks with an attention mechanism. Our method outperforms TAHAR, the current state-of-the-art machine learning method, achieving an error rate of 1.47% and an accuracy of 97% for activity classification. We also achieved an error rate of 0.17% and an accuracy of 93.3% when classifying postural transitions. Because Get Up! segments activities and transitions, individual TUG sub-components can be timed to identify those that patients find challenging.
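The architecture the abstract describes maps naturally onto a small recurrent model. The sketch below, in PyTorch, shows one plausible reading: a bi-directional GRU over fixed-length windows of the six accelerometer and gyroscope channels, with a learned attention vector that weights time steps before classification. All layer sizes, the window length, and the class count are illustrative assumptions, not the authors' published configuration.

```python
# Minimal sketch of a Bi-GRU-with-attention classifier for smartphone motion
# windows. Hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn

class BiGRUAttentionClassifier(nn.Module):
    def __init__(self, n_channels=6, hidden_size=64, n_classes=6):
        super().__init__()
        # n_channels = 3 accelerometer axes + 3 gyroscope axes
        self.gru = nn.GRU(n_channels, hidden_size, batch_first=True,
                          bidirectional=True)
        # Single learned attention vector scoring each time step
        self.attn = nn.Linear(2 * hidden_size, 1)
        self.fc = nn.Linear(2 * hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time, channels)
        h, _ = self.gru(x)                      # (batch, time, 2*hidden)
        scores = self.attn(h)                   # (batch, time, 1)
        weights = torch.softmax(scores, dim=1)  # attention over time steps
        context = (weights * h).sum(dim=1)      # weighted sum: (batch, 2*hidden)
        return self.fc(context)                 # class logits

# Example: a batch of 2.56 s windows sampled at 50 Hz (128 steps, 6 channels)
model = BiGRUAttentionClassifier()
logits = model(torch.randn(8, 128, 6))  # (8, n_classes)
```

Attention here serves the role the abstract implies: rather than classifying from the final hidden state alone, the model learns which instants within a window (e.g. the moment of lift-off in a sit-to-stand) matter most for the decision.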
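The closing claim, that segmenting activities and transitions lets individual TUG sub-components be timed, reduces to measuring run lengths in the per-window label sequence. A minimal sketch of that step, assuming hypothetical label names and a fixed hop between windows:

```python
# Hedged sketch: time TUG sub-components from per-window predictions by
# measuring how long each contiguous run of the same label lasts.
# Label names and the 0.5 s hop are illustrative assumptions.
from itertools import groupby

def time_subcomponents(labels, hop_seconds=0.5):
    """Return (label, duration_in_seconds) for each contiguous run."""
    return [(label, sum(1 for _ in run) * hop_seconds)
            for label, run in groupby(labels)]

# Hypothetical predicted labels for consecutive windows of one TUG trial
preds = ["sit", "sit", "sit_to_stand", "sit_to_stand",
         "walk", "walk", "walk", "turn", "walk", "walk",
         "stand_to_sit", "sit"]
for label, seconds in time_subcomponents(preds):
    print(f"{label}: {seconds:.1f} s")
```

An unusually long run for one sub-component (say, sit_to_stand) relative to population norms would flag that sub-component as the one the patient finds challenging.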