Representation Learning by Convolutional Neural Network for Smartphone Sensor Based Activity Recognition

Tatsuhito Hasegawa, M. Koshino
DOI: 10.1145/3372422.3372439
Published in: Proceedings of the 2019 2nd International Conference on Computational Intelligence and Intelligent Systems
Publication date: 2019-11-23
Citations: 6

Abstract

Although many researchers have widely investigated activity recognition using smartphone sensing, estimation accuracy can be adversely affected by individual dependence. Our survey showed that the smartphone-sensor-based activity recognition process has not been sufficiently discussed, especially representation learning with Convolutional Neural Networks (CNNs). We verified the effectiveness of CNN-based representation learning for activity recognition by comparing ten activity recognition models on a benchmark human activity recognition dataset: a Deep Neural Network (DNN) using Hand-Crafted (HC) features, a simple CNN model, AlexNet, SE-AlexNet, a Fully Convolutional Network (FCN), SE-FCN, VGG, SE-VGG, ResNet, and SE-ResNet. In total, the deep learning models were trained 600 times (10 models × 6 training-set sizes varying the number of subjects × 10 trials to reduce the influence of randomness). The results indicate that SE-VGG is the most accurate when many subjects are included in the training data.
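The SE (Squeeze-and-Excitation) variants compared above, including the best-performing SE-VGG, recalibrate channel responses with a squeeze (global pooling), excitation (bottleneck gating), and scale step. Below is a minimal NumPy sketch of that operation on a 1D convolutional feature map; the weight shapes, reduction ratio, and random inputs are illustrative assumptions, not the paper's actual architecture or parameters.

```python
import numpy as np

def squeeze_excite(features, w1, w2):
    """Squeeze-and-Excitation recalibration of a (channels, timesteps) feature map."""
    # Squeeze: global average pooling over the time axis -> one scalar per channel
    z = features.mean(axis=1)                # shape (C,)
    # Excitation: bottleneck MLP, ReLU then sigmoid, yields per-channel gates in (0, 1)
    h = np.maximum(0.0, w1 @ z)              # shape (C // r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ h)))      # shape (C,)
    # Scale: reweight each channel of the original feature map
    return features * s[:, None]

# Toy example: 4 channels, 8 timesteps, reduction ratio r = 2
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w1 = rng.standard_normal((2, 4)) * 0.1       # squeeze -> bottleneck
w2 = rng.standard_normal((4, 2)) * 0.1       # bottleneck -> channel gates
y = squeeze_excite(x, w1, w2)
print(y.shape)  # (4, 8)
```

Because the sigmoid gates lie strictly between 0 and 1, the block can only attenuate channels relative to the input; in a trained network this lets informative sensor channels pass through while uninformative ones are suppressed.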