DeepSense-Inception: Gait Identification from Inertial Sensors with Inception-like Architecture and Recurrent Network

Ha V. Hoang, M. Tran
{"title":"基于类初始结构和循环网络的惯性传感器步态识别","authors":"Ha V. Hoang, M. Tran","doi":"10.1109/CIS.2017.00138","DOIUrl":null,"url":null,"abstract":"Gait recognition has been considered as a new promising approach for biometric-based authentication. Gait signals are commonly obtained by collecting motion data from inertial sensors (accelerometers, gyroscopes) integrated in mobile and wearable devices. Motion data is subsequently transformed to a feature space for recognition procedure. One fashionable, effective way to extract features automatically is using conventional Convolutional Neural Networks (CNN) as feature extractors. In this paper, we propose DeepSense-Inception (DSI), a new method inspired from DeepSense, to recognize users from their gait features using Inception-like modules for better feature extraction than conventional CNN. Experiments for user identification on UCI Human Activity Recognition dataset demonstrate that our method not only achieves an accuracy of 99.9%, higher than that of DeepSense (99.7%), but also uses only 149K parameters, less than one third of the parameters in DeepSense (529K parameters). Thus, our method can be implemented more efficiently in limited resource systems.","PeriodicalId":304958,"journal":{"name":"2017 13th International Conference on Computational Intelligence and Security (CIS)","volume":"189 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"DeepSense-Inception: Gait Identification from Inertial Sensors with Inception-like Architecture and Recurrent Network\",\"authors\":\"Ha V. Hoang, M. Tran\",\"doi\":\"10.1109/CIS.2017.00138\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Gait recognition has been considered as a new promising approach for biometric-based authentication. Gait signals are commonly obtained by collecting motion data from inertial sensors (accelerometers, gyroscopes) integrated in mobile and wearable devices. Motion data is subsequently transformed to a feature space for recognition procedure. One fashionable, effective way to extract features automatically is using conventional Convolutional Neural Networks (CNN) as feature extractors. In this paper, we propose DeepSense-Inception (DSI), a new method inspired from DeepSense, to recognize users from their gait features using Inception-like modules for better feature extraction than conventional CNN. Experiments for user identification on UCI Human Activity Recognition dataset demonstrate that our method not only achieves an accuracy of 99.9%, higher than that of DeepSense (99.7%), but also uses only 149K parameters, less than one third of the parameters in DeepSense (529K parameters). 
Thus, our method can be implemented more efficiently in limited resource systems.\",\"PeriodicalId\":304958,\"journal\":{\"name\":\"2017 13th International Conference on Computational Intelligence and Security (CIS)\",\"volume\":\"189 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 13th International Conference on Computational Intelligence and Security (CIS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CIS.2017.00138\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 13th International Conference on Computational Intelligence and Security (CIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CIS.2017.00138","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract

Gait recognition has been considered a promising new approach to biometric authentication. Gait signals are commonly obtained by collecting motion data from inertial sensors (accelerometers, gyroscopes) integrated into mobile and wearable devices. The motion data is then transformed into a feature space for the recognition procedure. One popular and effective way to extract features automatically is to use conventional Convolutional Neural Networks (CNNs) as feature extractors. In this paper, we propose DeepSense-Inception (DSI), a new method inspired by DeepSense, which recognizes users from their gait features using Inception-like modules for better feature extraction than a conventional CNN. Experiments on user identification with the UCI Human Activity Recognition dataset demonstrate that our method not only achieves an accuracy of 99.9%, higher than that of DeepSense (99.7%), but also uses only 149K parameters, less than one third of DeepSense's 529K. Thus, our method can be implemented more efficiently in resource-limited systems.
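Only the abstract is available here, so the sketch below is a minimal illustration of the kind of pipeline it describes, not the authors' actual DSI architecture: parallel 1-D convolutions with several kernel sizes (an Inception-like module) feed a recurrent layer, which in turn feeds a per-subject classifier. The names (`InceptionBlock1d`, `GaitIdentifier`), all layer sizes, the choice of a GRU, and the assumption of six sensor channels (3-axis accelerometer plus 3-axis gyroscope) are illustrative assumptions; the 128-sample window length and the 30 subjects do match the UCI HAR dataset.

```python
# Hypothetical sketch of an Inception-like block plus recurrent network
# for gait identification from inertial-sensor windows. Layer sizes are
# illustrative assumptions, not the paper's configuration.
import torch
import torch.nn as nn

class InceptionBlock1d(nn.Module):
    """Parallel 1-D convolutions of different kernel sizes, concatenated
    along the channel axis (the Inception idea, applied to time series)."""
    def __init__(self, in_ch: int, branch_ch: int):
        super().__init__()
        self.b1 = nn.Conv1d(in_ch, branch_ch, kernel_size=1)
        self.b3 = nn.Conv1d(in_ch, branch_ch, kernel_size=3, padding=1)
        self.b5 = nn.Conv1d(in_ch, branch_ch, kernel_size=5, padding=2)
        self.act = nn.ReLU()

    def forward(self, x):  # x: (batch, channels, time)
        return self.act(torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1))

class GaitIdentifier(nn.Module):
    def __init__(self, n_subjects: int, sensor_ch: int = 6):
        super().__init__()
        # Assumed 6 input channels: 3-axis accelerometer + 3-axis gyroscope.
        self.features = InceptionBlock1d(sensor_ch, 16)   # -> 48 channels
        self.rnn = nn.GRU(input_size=48, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_subjects)

    def forward(self, x):  # x: (batch, sensor_ch, time)
        f = self.features(x).transpose(1, 2)  # (batch, time, 48) for the GRU
        _, h = self.rnn(f)                    # h: (1, batch, 64), last hidden state
        return self.head(h[-1])               # per-subject logits

model = GaitIdentifier(n_subjects=30)          # UCI HAR has 30 subjects
logits = model(torch.randn(8, 6, 128))         # 128-sample windows, as in UCI HAR
print(logits.shape)                            # torch.Size([8, 30])
print(sum(p.numel() for p in model.parameters()))  # total trainable parameters
```

The last line shows how a parameter count like the abstract's 149K-vs-529K comparison can be computed; the toy model above is far smaller and is only meant to make the Inception-then-recurrent structure concrete.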