{"title":"基于智能手机的人类活动和跌倒识别,使用深度特征提取和机器学习分类器","authors":"Laksamee Nooyimsai, Onnicha Pakdeepong, Supajitra Chatchawalvoradech, Tipkasem Phiakhan, Seksan Laitrakun","doi":"10.1109/iSAI-NLP56921.2022.9960250","DOIUrl":null,"url":null,"abstract":"Human activity recognition (HAR) and fall detection using smartphone sensors are currently popular because they can be extended to many useful applications especially when a person needs an urgent treatment such as a fall. Several methods based on machine learning (ML) and deep learning (DL) have been proposed to improve classification performances. In this work, we propose hybrid models of convolutional neural network (CNN) models and ML algorithms to classify human activities and falls using smartphone-sensor data. The CNN model will be used as feature extraction to extract a set of features. Thereafter, the ML algorithm will apply this set of features to predict the corresponding activity and fall. Several combinations of CNN models and ML algorithms are investigated on two public datasets: UniMiB SHAR and UMAFall. Their accuracy scores are compared in order to determine the best hybrid model. On the UniMiB SHAR dataset, the hybrid model based on the AlexN et model and the extra trees algorithm achieves the highest accuracy score of 95.27%. 
On the UMAFall dataset, the hybrid model based on the Xception model and the support vector machine/k-nearest neighbors/extra trees algorithms offer the highest accuracy score of 82.24 %.","PeriodicalId":399019,"journal":{"name":"2022 17th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)","volume":"48 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Smartphone-Based Human Activity and Fall Recognition Using Deep Feature Extraction and Machine-Learning Classifiers\",\"authors\":\"Laksamee Nooyimsai, Onnicha Pakdeepong, Supajitra Chatchawalvoradech, Tipkasem Phiakhan, Seksan Laitrakun\",\"doi\":\"10.1109/iSAI-NLP56921.2022.9960250\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Human activity recognition (HAR) and fall detection using smartphone sensors are currently popular because they can be extended to many useful applications especially when a person needs an urgent treatment such as a fall. Several methods based on machine learning (ML) and deep learning (DL) have been proposed to improve classification performances. In this work, we propose hybrid models of convolutional neural network (CNN) models and ML algorithms to classify human activities and falls using smartphone-sensor data. The CNN model will be used as feature extraction to extract a set of features. Thereafter, the ML algorithm will apply this set of features to predict the corresponding activity and fall. Several combinations of CNN models and ML algorithms are investigated on two public datasets: UniMiB SHAR and UMAFall. Their accuracy scores are compared in order to determine the best hybrid model. On the UniMiB SHAR dataset, the hybrid model based on the AlexN et model and the extra trees algorithm achieves the highest accuracy score of 95.27%. 
On the UMAFall dataset, the hybrid model based on the Xception model and the support vector machine/k-nearest neighbors/extra trees algorithms offer the highest accuracy score of 82.24 %.\",\"PeriodicalId\":399019,\"journal\":{\"name\":\"2022 17th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)\",\"volume\":\"48 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-11-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 17th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/iSAI-NLP56921.2022.9960250\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 17th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/iSAI-NLP56921.2022.9960250","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Smartphone-Based Human Activity and Fall Recognition Using Deep Feature Extraction and Machine-Learning Classifiers
Human activity recognition (HAR) and fall detection using smartphone sensors are currently popular because they can be extended to many useful applications, especially when a person needs urgent treatment, such as after a fall. Several methods based on machine learning (ML) and deep learning (DL) have been proposed to improve classification performance. In this work, we propose hybrid models that combine convolutional neural network (CNN) models and ML algorithms to classify human activities and falls using smartphone-sensor data. The CNN model is used as a feature extractor to produce a set of features. Thereafter, the ML algorithm uses this set of features to predict the corresponding activity or fall. Several combinations of CNN models and ML algorithms are investigated on two public datasets: UniMiB SHAR and UMAFall. Their accuracy scores are compared in order to determine the best hybrid model. On the UniMiB SHAR dataset, the hybrid model based on the AlexNet model and the extra trees algorithm achieves the highest accuracy score of 95.27%. On the UMAFall dataset, the hybrid models based on the Xception model and the support vector machine, k-nearest neighbors, and extra trees algorithms offer the highest accuracy score of 82.24%.
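The two-stage pipeline described above — a convolutional network used only as a feature extractor, followed by a classical ML classifier — can be sketched as below. This is a minimal illustration, not the authors' implementation: it substitutes a tiny hand-rolled 1-D convolution layer (random kernels, ReLU, global max pooling) for AlexNet/Xception, and uses synthetic accelerometer-like windows with an injected spike standing in for a fall event. All data, kernel sizes, and the spike position are assumptions made for the sake of a runnable example.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def conv_features(windows, kernels):
    """Stand-in for CNN feature extraction: 1-D convolution with each
    kernel, ReLU, then global max pooling -> one feature per kernel."""
    feats = []
    for k in kernels:
        conv = np.array([np.convolve(w, k, mode="valid") for w in windows])
        feats.append(np.maximum(conv, 0.0).max(axis=1))
    return np.stack(feats, axis=1)  # shape: (n_windows, n_kernels)

# Synthetic sensor windows (hypothetical data): class 1 ("fall") windows
# get a sharp acceleration spike; class 0 stays near-flat noise.
n_windows, window_len = 200, 64
X = rng.normal(0.0, 0.1, size=(n_windows, window_len))
y = rng.integers(0, 2, size=n_windows)
X[y == 1, 30] += 3.0  # inject the spike for "fall" windows

# Stage 1: extract deep-style features with random convolution kernels.
kernels = [rng.normal(size=8) for _ in range(16)]
F = conv_features(X, kernels)

# Stage 2: fit a classical ML classifier (extra trees, as in the paper)
# on the extracted feature vectors instead of the raw signal.
Xtr, Xte, ytr, yte = train_test_split(
    F, y, test_size=0.3, random_state=0, stratify=y
)
clf = ExtraTreesClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

Swapping the feature extractor for a pretrained CNN (e.g., penultimate-layer activations from AlexNet or Xception) while keeping stage 2 unchanged reproduces the hybrid structure the abstract evaluates.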