Human Activity Recognition Using Elliptical and Archimedean R-Vine Copulas with Multimodal Data

Shreyas Kulkarni, S. R, Rahul Rk, Harshith M, Sahana Srikanth, Sanjeev Gurugopinath
{"title":"Human Activity Recognition Using Elliptical and Archimedean R-Vine Copulas with Multimodal Data","authors":"Shreyas Kulkarni, S. R, Rahul Rk, Harshith M, Sahana Srikanth, Sanjeev Gurugopinath","doi":"10.1109/CONECCT52877.2021.9622736","DOIUrl":null,"url":null,"abstract":"We consider the problem of deep neural network (DNN)-based classification of the human activity using wearable sensors with multimodal data. The accelerometer and gyroscope sensors embedded in smart phones and smart watches are used to recognize the human activity. We employ a regular-vine (R-Vine) copula-based fusion of the high-level features extracted from the neural networks. In particular, we consider bivariate copulas from two different families, namely elliptical and Archimedean, to build the underlying R-Vine tree structure. Comparison of the two families are brought out using the publicly available STISEN data set. To evaluate the efficacy of these algorithms in a real-time scenario, we created a data set called PESHAR using MPU-6050 module with a Raspberry-pi. For the given data set, we found that the elliptical family of R-Vine structure has proved to be better in terms of ${F_{1}}$ scores and confusion matrices, as compared to the Archimedean family.","PeriodicalId":164499,"journal":{"name":"2021 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CONECCT52877.2021.9622736","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

We consider the problem of deep neural network (DNN)-based classification of human activity using wearable sensors with multimodal data. The accelerometer and gyroscope sensors embedded in smartphones and smartwatches are used to recognize the human activity. We employ a regular-vine (R-Vine) copula-based fusion of the high-level features extracted from the neural networks. In particular, we consider bivariate copulas from two different families, namely elliptical and Archimedean, to build the underlying R-Vine tree structure. A comparison of the two families is carried out using the publicly available STISEN data set. To evaluate the efficacy of these algorithms in a real-time scenario, we created a data set called PESHAR using an MPU-6050 module with a Raspberry Pi. For the given data set, we found that the elliptical family of R-Vine structures performs better than the Archimedean family in terms of $F_1$ scores and confusion matrices.
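To make the fusion step concrete, the sketch below fits one copula model per activity class to rank-transformed (pseudo-observation) DNN features and assigns a test sample to the class with the highest copula log-likelihood. This is a minimal sketch, not the authors' implementation: it uses a single bivariate Gaussian copula as a stand-in for the full elliptical R-Vine structure, and the feature arrays, class labels, and shapes are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import norm, rankdata

def pseudo_obs(x):
    """Rank-transform each column to (0, 1) pseudo-observations."""
    n = x.shape[0]
    return np.apply_along_axis(rankdata, 0, x) / (n + 1)

def fit_gaussian_copula(u):
    """Estimate the correlation matrix of a Gaussian copula from pseudo-observations."""
    z = norm.ppf(u)                      # map to normal scores
    return np.corrcoef(z, rowvar=False)  # correlation of the normal scores

def gaussian_copula_loglik(u, corr):
    """Log-density of the Gaussian copula evaluated at pseudo-observations u."""
    z = norm.ppf(u)
    inv = np.linalg.inv(corr)
    _, logdet = np.linalg.slogdet(corr)
    quad = np.einsum('ij,jk,ik->i', z, inv - np.eye(corr.shape[0]), z)
    return -0.5 * (logdet + quad)

# --- hypothetical usage: feats[c] holds (n_samples, n_features) DNN features for class c ---
rng = np.random.default_rng(0)
feats = {c: rng.normal(size=(200, 2)) @ rng.normal(size=(2, 2)) for c in range(3)}
models = {c: fit_gaussian_copula(pseudo_obs(x)) for c, x in feats.items()}

def classify(sample_feats, train_feats, models):
    """Score one test sample against each class's fitted copula (simplified ranking)."""
    scores = {}
    for c, corr in models.items():
        # place the test sample on the training ranks of class c
        x = np.vstack([train_feats[c], sample_feats])
        u = pseudo_obs(x)[-1:, :]
        scores[c] = gaussian_copula_loglik(u, corr)[0]
    return max(scores, key=scores.get)

print(classify(rng.normal(size=(1, 2)), feats, models))
```

In the paper, the bivariate building blocks (elliptical or Archimedean) are arranged into a full R-Vine tree rather than a single pair copula; libraries such as pyvinecopulib (Python) or VineCopula (R) provide R-Vine structure selection and fitting, though the exact fitting setup used by the authors is not specified on this page.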