Narit Hnoohom, Pitchaya Chotivatunyu, S. Mekruksavanich, A. Jitpattanakul
2022 IEEE 13th International Conference on Software Engineering and Service Science (ICSESS), October 21, 2022. DOI: 10.1109/ICSESS54813.2022.9930285
Recognizing Stationary and Locomotion Activities using LSTM-XGB with Smartphone Sensors
Nowadays, stationary and locomotion activity recognition (SLAR) is becoming increasingly important in a variety of domains, such as indoor localization, fitness activity tracking, and elderly care. Current methods typically rely on handcrafted feature extraction, a process that is difficult, requires specialized knowledge, and can still yield subpar results. We propose a deep learning technique for SLAR, called LSTM-XGB, that uses data from inertial sensors in smartphones to reduce the effort required for feature development and selection. The proposed LSTM-XGB consists of multiple stacked LSTM layers that automatically learn the temporal features of the input, followed by an XGBoost classifier that predicts the activity label in the final layer. The results showed that the proposed LSTM-XGB technique, which extracts features automatically, outperforms conventional machine learning methods that require manual feature extraction. We also showed that data from three sensors (accelerometer, linear acceleration sensor, and gyroscope) can be combined, achieving higher accuracy than other sensor combinations or any single sensor.
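To make the pipeline shape concrete: the abstract describes stacked LSTM layers that turn a window of raw inertial samples into a learned feature vector, which is then handed to XGBoost for label prediction. The sketch below illustrates only the first stage with a single numpy LSTM cell (not the paper's trained model); the window length, channel count, hidden size, and random weights are all illustrative assumptions, and in the actual method the resulting feature vectors would be used to train an XGBoost classifier.

```python
import numpy as np

def lstm_features(x, Wx, Wh, b):
    """Run one LSTM layer over a window of sensor samples and return the
    final hidden state as a feature vector.
    x: (T, D) window of T samples with D channels; gate order i, f, g, o."""
    H = Wh.shape[0]
    h = np.zeros(H)          # hidden state
    c = np.zeros(H)          # cell state
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for t in range(x.shape[0]):
        z = x[t] @ Wx + h @ Wh + b            # all four gates at once, (4H,)
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        g = np.tanh(g)
        c = f * c + i * g                     # update cell state
        h = o * np.tanh(c)                    # update hidden state
    return h                                  # temporal feature vector

# Illustrative shapes: a 128-sample window with 9 channels
# (3 axes each from accelerometer, linear acceleration, gyroscope).
rng = np.random.default_rng(0)
T, D, H = 128, 9, 32
window = rng.standard_normal((T, D))
Wx = rng.standard_normal((D, 4 * H)) * 0.1
Wh = rng.standard_normal((H, 4 * H)) * 0.1
b = np.zeros(4 * H)

feat = lstm_features(window, Wx, Wh, b)
print(feat.shape)   # one (H,)-dimensional feature vector per window
```

In the proposed method, one such vector per sliding window (from the last of several stacked LSTM layers) becomes a training row for XGBoost, replacing the handcrafted statistical features used by conventional machine learning approaches.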