Human Activity Recognition Using Elliptical and Archimedean R-Vine Copulas with Multimodal Data
Shreyas Kulkarni, S. R, Rahul Rk, Harshith M, Sahana Srikanth, Sanjeev Gurugopinath
2021 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT), 2021. DOI: 10.1109/CONECCT52877.2021.9622736
Abstract
We consider the problem of deep neural network (DNN)-based classification of human activity from multimodal wearable-sensor data. The accelerometer and gyroscope sensors embedded in smartphones and smartwatches are used to recognize the activity. We employ a regular-vine (R-Vine) copula-based fusion of the high-level features extracted from the neural networks. In particular, we consider bivariate copulas from two different families, namely elliptical and Archimedean, to build the underlying R-Vine tree structure. A comparison of the two families is carried out on the publicly available STISEN data set. To evaluate the efficacy of these algorithms in a real-time scenario, we created a data set called PESHAR using an MPU-6050 module with a Raspberry Pi. For the given data set, we found that the R-Vine structure built from the elliptical family performs better than the one built from the Archimedean family in terms of $F_1$ scores and confusion matrices.
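As a rough illustration of how such a copula-based fusion could be set up (a minimal sketch, not the authors' implementation), the code below fits one R-Vine copula per activity class to pseudo-observations of high-level DNN features and assigns a test sample to the class whose vine gives it the highest copula log-density. It assumes the pyvinecopulib package and its constructor-based fitting API; the specific elliptical and Archimedean family sets are assumptions based only on the abstract, and the "DNN features" are synthetic stand-ins.

```python
# Hypothetical sketch of R-Vine copula fusion for HAR, assuming pyvinecopulib
# (not necessarily the toolchain used in the paper).
import numpy as np
import pyvinecopulib as pv

# Assumed bivariate family sets mirroring the two groups compared in the paper.
FAMILIES = {
    "elliptical": [pv.BicopFamily.gaussian, pv.BicopFamily.student],
    "archimedean": [pv.BicopFamily.clayton, pv.BicopFamily.gumbel,
                    pv.BicopFamily.frank, pv.BicopFamily.joe],
}

def fit_class_vines(features, labels, family_set):
    """Fit one R-Vine copula per activity class on pseudo-observations
    of the high-level features (features: [n_samples, n_features])."""
    controls = pv.FitControlsVinecop(family_set=family_set)
    vines = {}
    for c in np.unique(labels):
        u = pv.to_pseudo_obs(features[labels == c])  # map features to [0, 1]^d
        vines[c] = pv.Vinecop(u, controls=controls)  # select structure and pair-copulas
    return vines

def classify(vines, features):
    """Pick, for each sample, the class whose fitted vine assigns the
    largest copula log-density to its pseudo-observations."""
    u = pv.to_pseudo_obs(features)
    classes = sorted(vines)
    scores = np.column_stack([np.log(vines[c].pdf(u) + 1e-12) for c in classes])
    return np.array(classes)[np.argmax(scores, axis=1)]

# Toy usage with synthetic features standing in for DNN outputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = rng.integers(0, 3, size=300)
vines = fit_class_vines(X, y, FAMILIES["elliptical"])
pred = classify(vines, X)
```

Swapping FAMILIES["elliptical"] for FAMILIES["archimedean"] in the call above reproduces, in spirit, the two-family comparison described in the abstract.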