Video2IMU: Realistic IMU features and signals from videos

Arttu Lämsä, Jaakko Tervonen, Jussi Liikka, Constantino Álvarez Casado, Miguel Bordallo López

2022 IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks (BSN)
DOI: 10.1109/BSN56160.2022.9928466
Human Activity Recognition (HAR) from wearable sensor data identifies movements or activities in unconstrained environments. HAR is a challenging problem, as it presents great variability across subjects. Obtaining large amounts of labelled data is not straightforward, since wearable sensor signals are not easy to label by simple human inspection. In our work, we propose the use of neural networks for the generation of realistic signals and features from monocular videos of human activity. We show how these generated features and signals can be used, instead of their real counterparts, to train HAR models that recognize activities from signals obtained with wearable sensors. To prove the validity of our methods, we perform experiments on an activity recognition dataset created for the improvement of industrial work safety. We show that our model is able to realistically generate virtual sensor signals and features that can be used to train a HAR classifier with performance comparable to that of one trained on real sensor data. Our results enable the use of available, labelled video data for training HAR models to classify signals from wearable sensors.
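The pipeline the abstract describes (video-derived pose as input, a neural network that produces virtual inertial signals, and a downstream HAR classifier trained on those signals) can be sketched roughly as follows. This is a minimal, hypothetical PyTorch sketch, not the paper's actual architecture: the keypoint count, window length, layer types, and sizes are illustrative assumptions.

```python
# Hypothetical sketch of the video-to-IMU idea: regress a virtual 3-axis
# accelerometer signal from per-frame 2D pose keypoints, then train a HAR
# classifier on the generated signal. All hyperparameters are assumptions.
import torch
import torch.nn as nn

NUM_KEYPOINTS = 17   # e.g. COCO-style 2D pose (assumption)
WINDOW = 128         # frames per window (assumption)
NUM_ACTIVITIES = 5   # task-dependent (assumption)

class PoseToIMU(nn.Module):
    """Maps a window of 2D pose keypoints to a virtual 3-axis accelerometer signal."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.LSTM(NUM_KEYPOINTS * 2, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3)   # x, y, z acceleration per frame

    def forward(self, pose):               # pose: (batch, WINDOW, NUM_KEYPOINTS * 2)
        h, _ = self.rnn(pose)
        return self.head(h)                # (batch, WINDOW, 3)

class HARClassifier(nn.Module):
    """Classifies activities from (real or generated) 3-axis signals."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.LSTM(3, hidden, batch_first=True)
        self.head = nn.Linear(hidden, NUM_ACTIVITIES)

    def forward(self, imu):                # imu: (batch, WINDOW, 3)
        _, (h_n, _) = self.rnn(imu)
        return self.head(h_n[-1])          # activity logits

# Usage: generate virtual signals from video-derived poses, then classify them.
pose = torch.randn(8, WINDOW, NUM_KEYPOINTS * 2)   # stand-in for extracted keypoints
virtual_imu = PoseToIMU()(pose)
logits = HARClassifier()(virtual_imu)
print(logits.shape)                                # torch.Size([8, NUM_ACTIVITIES])
```

In such a setup, the pose-to-IMU network would need paired video and sensor recordings to train; once trained, it lets labelled video datasets stand in for wearable sensor data when training the activity classifier, which is the substitution the abstract argues for.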