Human Activity Recognition based on Wearable Sensors using Multiscale DCNN Ensemble

Jessica Sena, W. R. Schwartz
DOI: 10.5753/sibgrapi.est.2019.8310
Journal: Anais Estendidos da Conference on Graphics, Patterns and Images (SIBGRAPI)
Published: 2018-10-18
Citations: 2

Abstract

Sensor-based Human Activity Recognition (HAR) provides valuable knowledge to many areas. Recently, wearable devices have gained ground as a relevant source of data. However, two issues remain: the large number of heterogeneous sensors available and the temporal nature of the sensor data. To handle these issues, we propose a multimodal approach that processes each sensor separately and, through an ensemble of Deep Convolutional Neural Networks (DCNNs), extracts information from multiple temporal scales of the sensor data. In this ensemble, we use a convolutional kernel with a different height for each DCNN. Since the number of rows in the sensor data reflects the data captured over time, each kernel height corresponds to a temporal scale from which we can extract patterns. Consequently, our approach is able to extract information ranging from simple movement patterns, such as a wrist twist when picking up a spoon, to complex movements, such as the human gait. This multimodal and multi-temporal approach outperforms previous state-of-the-art works on seven important datasets under two different protocols. In addition, we demonstrate that our proposed set of kernels improves sensor-based HAR in another multi-kernel approach, the widely employed Inception network.
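The core idea above can be illustrated with a minimal sketch: given one sensor's window (rows are time steps, columns are channels), convolve it with kernels of several heights and pool each response over time, so that each kernel height yields a feature at a different temporal scale. The kernel heights, averaging weights, and max-pooling below are illustrative assumptions, not the exact configuration from the paper.

```python
# Sketch of multiscale temporal feature extraction from one sensor.
# A kernel of height h spans h consecutive time steps across all
# channels, so its response captures patterns at that temporal scale.

def conv_valid(window, kernel):
    """Valid 2D correlation of a kernel spanning all channels."""
    kh = len(kernel)
    out = []
    for t in range(len(window) - kh + 1):
        s = 0.0
        for i in range(kh):
            for c in range(len(window[0])):
                s += window[t + i][c] * kernel[i][c]
        out.append(s)
    return out

def multiscale_features(window, kernels):
    """One max-pooled response per temporal scale (kernel height)."""
    return [max(conv_valid(window, k)) for k in kernels]

# Toy 8-step window from a hypothetical 2-channel sensor.
window = [[0.1, 0.0], [0.3, 0.1], [0.9, 0.2], [0.4, 0.1],
          [0.2, 0.0], [0.8, 0.3], [0.5, 0.2], [0.1, 0.0]]
# Averaging kernels at three temporal scales (heights 2, 3, 4).
kernels = [[[0.5, 0.5]] * h for h in (2, 3, 4)]
features = multiscale_features(window, kernels)
print(features)
```

Concatenating such per-scale, per-sensor features (here, one scalar per kernel height) is what lets the ensemble capture both short patterns, like a wrist twist, and longer ones, like a gait cycle; in the actual DCNNs, each scale contributes a learned feature map rather than a fixed average.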