Thigh Motion-Based Gait Analysis for Human Identification using Inertial Measurement Units (IMUs)

Lloyd Vincent R. Asuncion, Joan Xyrel P. De Mesa, Patrick Kyle H. Juan, Nathaniel T. Sayson, A. Cruz
DOI: 10.1109/HNICEM.2018.8666422
Venue: 2018 IEEE 10th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM)
Published: 2018-11-01
Citations: 13

Abstract

Data security is an increasing concern due to the rapid pace of technological development and Internet of Things (IoT) implementation today. Mobile smartphones in particular are becoming commonplace in the handling of sensitive information, leaving these devices vulnerable to data breaches. Biometric authentication is a viable alternative to current mobile phone security methods because it is inherent to an individual. One biometric authentication parameter of active interest is the human gait. Sensor-based gait identification in particular is being widely researched because motion sensors are portable, wearable, and able to capture 3D motion. In this study, the researchers emulate a smartphone's IMU using two sensors placed simultaneously on the right and left thighs of 10 volunteers aged 20 to 26, emulating the two most common placements of a smartphone. The gait data acquired from the IMUs (the volunteers' pitch, roll, and yaw angles) are the variables of this study. This study demonstrates the potential of human gait in biometric authentication with a Convolutional Neural Network (CNN) gait identification algorithm. The algorithm is applied to 4 datasets: 3 single-parameter datasets and 1 comprising all three parameters (roll, pitch, and yaw). For both left and right thigh data, the highest classification accuracy (98.34%) and precision (98.42%) were yielded by the three-parameter dataset, followed by the dataset comprising only the yaw parameter, with an average accuracy of 93.02% and an average precision of 93.82%. The elapsed training time for each dataset was also recorded; CNN training on the three-parameter dataset took almost 3.6 times longer than on a single-parameter dataset.
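The abstract does not describe how the pitch, roll, and yaw streams are turned into CNN inputs. A common approach in sensor-based gait identification is to stack the three angle streams as channels and slice them into fixed-length overlapping windows, one window per training sample. The sketch below illustrates that step only; the function name `segment_gait_windows`, the 50 Hz sampling rate, and the window/stride sizes are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def segment_gait_windows(pitch, roll, yaw, window=128, stride=64):
    """Stack the three angle streams into channels and slice them into
    overlapping fixed-length windows, a typical input shape for a 1D CNN.

    pitch, roll, yaw : 1-D arrays of equal length (one angle per sample)
    Returns an array of shape (n_windows, 3, window).
    """
    signal = np.stack([pitch, roll, yaw])           # shape (3, n_samples)
    n_samples = signal.shape[1]
    starts = range(0, n_samples - window + 1, stride)
    return np.stack([signal[:, s:s + window] for s in starts])

# Example: 10 s of synthetic thigh-angle data at an assumed 50 Hz
# (500 samples per stream); roughly periodic, like a gait cycle.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
pitch = 20 * np.sin(2 * np.pi * 1.0 * t) + rng.normal(0, 1, t.size)
roll = 5 * np.sin(2 * np.pi * 1.0 * t + 0.5) + rng.normal(0, 1, t.size)
yaw = 3 * np.sin(2 * np.pi * 0.5 * t) + rng.normal(0, 1, t.size)

windows = segment_gait_windows(pitch, roll, yaw)
print(windows.shape)  # (6, 3, 128)
```

Each resulting `(3, 128)` window would then be fed to the CNN classifier, with the volunteer identity as the label; the single-parameter datasets in the study would correspond to keeping only one of the three channels.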