Deep Learning Approaches for Recognizing Daily Human Activities Using Smart Home Sensors
Worrakit Sanpote, Ponnipa Jantawong, Narit Hnoohom, A. Jitpattanakul, S. Mekruksavanich
Transactions on Electrical Engineering, Electronics, and Communications, vol. 6, no. 1, pp. 469-473, published 2023-03-22
DOI: 10.1109/ECTIDAMTNCON57770.2023.10139507
Citations: 0
Abstract
Nowadays, one of the most important objectives in health-related research is improving people's living conditions and well-being. Smart home systems can provide health protection for residents based on the results of daily activity recognition. Recent advances in sensor technology have increased the demand for sensor-compatible goods and services in smart homes, and the ever-increasing volume of sensor data calls for deep learning (DL) methods for automatic human motion recognition. Recent research has modeled the spatiotemporal sequences gathered by smart home sensors using long short-term memory networks. In this work, ResNeXt-based models that learn to classify human activities in smart homes are proposed to improve recognition performance. Experiments conducted on data from the Center for Advanced Studies in Adaptive Systems (CASAS), a publicly available benchmark dataset, show that the proposed ResNeXt-based techniques are significantly superior to existing DL methods and provide better results than those reported in the existing literature. The ResNeXt model achieved average accuracies of 84.81%, 93.57%, and 90.38% on the CASAS_Cairo, CASAS_Milan, and CASAS_Kyoto3 datasets, respectively, exceeding the benchmark methods.
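The abstract does not spell out the authors' exact network configuration. The sketch below only illustrates the general idea it describes: a ResNeXt-style grouped-convolution (bottleneck) block applied to 1D windows of smart home sensor events, followed by a small classifier. All names, hyperparameters (cardinality, channel widths, window length, class count), and the event-embedding input format are illustrative assumptions, not the paper's settings.

```python
# Minimal, hedged sketch of a ResNeXt-style block for sensor-event sequences.
# Assumptions: discrete sensor-event IDs as input, PyTorch, arbitrary hyperparameters.
import torch
import torch.nn as nn


class ResNeXtBlock1D(nn.Module):
    """Grouped-convolution bottleneck block (the core ResNeXt idea), in 1D."""

    def __init__(self, channels: int, cardinality: int = 8, bottleneck: int = 32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv1d(channels, bottleneck, kernel_size=1, bias=False),
            nn.BatchNorm1d(bottleneck),
            nn.ReLU(inplace=True),
            # Grouped convolution splits the bottleneck into `cardinality` parallel paths.
            nn.Conv1d(bottleneck, bottleneck, kernel_size=3, padding=1,
                      groups=cardinality, bias=False),
            nn.BatchNorm1d(bottleneck),
            nn.ReLU(inplace=True),
            nn.Conv1d(bottleneck, channels, kernel_size=1, bias=False),
            nn.BatchNorm1d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.relu(x + self.body(x))  # residual connection


class ActivityClassifier(nn.Module):
    """Embeds sensor events, stacks ResNeXt blocks, and predicts the activity class."""

    def __init__(self, num_sensors: int, num_classes: int,
                 channels: int = 64, depth: int = 3):
        super().__init__()
        self.embed = nn.Embedding(num_sensors, channels)
        self.blocks = nn.Sequential(*[ResNeXtBlock1D(channels) for _ in range(depth)])
        self.head = nn.Linear(channels, num_classes)

    def forward(self, events: torch.Tensor) -> torch.Tensor:
        # events: (batch, seq_len) integer sensor IDs
        x = self.embed(events).transpose(1, 2)  # -> (batch, channels, seq_len)
        x = self.blocks(x)
        x = x.mean(dim=2)                       # global average pooling over time
        return self.head(x)


if __name__ == "__main__":
    model = ActivityClassifier(num_sensors=40, num_classes=10)
    dummy = torch.randint(0, 40, (2, 128))      # two windows of 128 sensor events
    print(model(dummy).shape)                   # torch.Size([2, 10])
```

The grouped 3x3 convolution is what distinguishes a ResNeXt block from a plain residual block: capacity is added by increasing the number of parallel paths (cardinality) rather than the depth or width, which is the design choice the abstract's "ResNeXt-based models" refers to.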