{"title":"利用肌电图和 IMU 传感器准确高效识别健身房锻炼活动的残差深度学习方法","authors":"S. Mekruksavanich, A. Jitpattanakul","doi":"10.3390/asi7040059","DOIUrl":null,"url":null,"abstract":"The accurate and efficient recognition of gym workout activities using wearable sensors holds significant implications for assessing fitness levels, tailoring personalized training regimens, and overseeing rehabilitation progress. This study introduces CNN-ResBiGRU, a novel deep learning architecture that amalgamates residual and hybrid methodologies, aiming to precisely categorize gym exercises based on multimodal sensor data. The primary goal of this model is to effectively identify various gym workouts by integrating convolutional neural networks, residual connections, and bidirectional gated recurrent units. Raw electromyography and inertial measurement unit data collected from wearable sensors worn by individuals during strength training and gym sessions serve as inputs for the CNN-ResBiGRU model. Initially, convolutional neural network layers are employed to extract unique features in both temporal and spatial dimensions, capturing localized patterns within the sensor outputs. Subsequently, the extracted features are fed into the ResBiGRU component, leveraging residual connections and bidirectional processing to capture the exercise activities’ long-term temporal dependencies and contextual information. The performance of the proposed model is evaluated using the Myogym dataset, comprising data from 10 participants engaged in 30 distinct gym activities. The model achieves a classification accuracy of 97.29% and an F1-score of 92.68%. Ablation studies confirm the effectiveness of the convolutional neural network and ResBiGRU components. The proposed hybrid model uses wearable multimodal sensor data to accurately and efficiently recognize gym exercise activity.","PeriodicalId":36273,"journal":{"name":"Applied System Innovation","volume":null,"pages":null},"PeriodicalIF":3.8000,"publicationDate":"2024-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Residual Deep Learning Method for Accurate and Efficient Recognition of Gym Exercise Activities Using Electromyography and IMU Sensors\",\"authors\":\"S. Mekruksavanich, A. Jitpattanakul\",\"doi\":\"10.3390/asi7040059\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The accurate and efficient recognition of gym workout activities using wearable sensors holds significant implications for assessing fitness levels, tailoring personalized training regimens, and overseeing rehabilitation progress. This study introduces CNN-ResBiGRU, a novel deep learning architecture that amalgamates residual and hybrid methodologies, aiming to precisely categorize gym exercises based on multimodal sensor data. The primary goal of this model is to effectively identify various gym workouts by integrating convolutional neural networks, residual connections, and bidirectional gated recurrent units. Raw electromyography and inertial measurement unit data collected from wearable sensors worn by individuals during strength training and gym sessions serve as inputs for the CNN-ResBiGRU model. Initially, convolutional neural network layers are employed to extract unique features in both temporal and spatial dimensions, capturing localized patterns within the sensor outputs. 
Subsequently, the extracted features are fed into the ResBiGRU component, leveraging residual connections and bidirectional processing to capture the exercise activities’ long-term temporal dependencies and contextual information. The performance of the proposed model is evaluated using the Myogym dataset, comprising data from 10 participants engaged in 30 distinct gym activities. The model achieves a classification accuracy of 97.29% and an F1-score of 92.68%. Ablation studies confirm the effectiveness of the convolutional neural network and ResBiGRU components. The proposed hybrid model uses wearable multimodal sensor data to accurately and efficiently recognize gym exercise activity.\",\"PeriodicalId\":36273,\"journal\":{\"name\":\"Applied System Innovation\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.8000,\"publicationDate\":\"2024-07-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied System Innovation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3390/asi7040059\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied System Innovation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/asi7040059","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Abstract: The accurate and efficient recognition of gym workout activities from wearable sensors has significant implications for assessing fitness levels, tailoring personalized training regimens, and monitoring rehabilitation progress. This study introduces CNN-ResBiGRU, a deep learning architecture that combines residual connections with a hybrid convolutional-recurrent design to classify gym exercises from multimodal sensor data. The model integrates convolutional neural networks, residual connections, and bidirectional gated recurrent units to identify a wide range of gym workouts. Raw electromyography (EMG) and inertial measurement unit (IMU) data, collected from wearable sensors worn during strength training and gym sessions, serve as inputs. Convolutional layers first extract features along the temporal and spatial dimensions, capturing localized patterns in the sensor signals. The extracted features are then fed into the ResBiGRU component, which uses residual connections and bidirectional processing to capture the long-term temporal dependencies and contextual information of the exercise activities. The model is evaluated on the MyoGym dataset, comprising data from 10 participants performing 30 distinct gym activities, and achieves a classification accuracy of 97.29% and an F1-score of 92.68%. Ablation studies confirm the contribution of both the convolutional and ResBiGRU components. The proposed hybrid model thus recognizes gym exercise activities accurately and efficiently from wearable multimodal sensor data.
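The abstract names the building blocks (convolutional front end, residual connections, bidirectional GRU) but not the exact configuration, so the following PyTorch sketch only illustrates how a CNN-ResBiGRU of this kind could be wired up. The channel count (assumed 8 EMG + 6 IMU channels), kernel sizes, hidden width, pooling, and the projection on the skip path are assumptions for illustration, not the authors' published hyperparameters.

```python
# Illustrative sketch only: layer sizes and channel counts below are assumptions,
# not the configuration reported in the paper.
import torch
import torch.nn as nn


class ResBiGRU(nn.Module):
    """Bidirectional GRU block with a residual (skip) connection."""

    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.gru = nn.GRU(dim, hidden, batch_first=True, bidirectional=True)
        # Project the input so it can be added to the 2*hidden bidirectional output.
        self.skip = nn.Linear(dim, 2 * hidden) if dim != 2 * hidden else nn.Identity()
        self.norm = nn.LayerNorm(2 * hidden)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, time, dim)
        out, _ = self.gru(x)
        return self.norm(out + self.skip(x))  # residual connection


class CNNResBiGRU(nn.Module):
    """CNN front end for local patterns, ResBiGRU for long-range temporal context."""

    def __init__(self, in_channels: int = 14, num_classes: int = 30, hidden: int = 128):
        super().__init__()
        # 1-D convolutions over the time axis of the fused EMG + IMU channels.
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.BatchNorm1d(128),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.resbigru = ResBiGRU(dim=128, hidden=hidden)
        self.head = nn.Linear(2 * hidden, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) sliding window of raw sensor samples
        feats = self.cnn(x)                  # (batch, 128, time/4)
        feats = feats.transpose(1, 2)        # (batch, time/4, 128) for the GRU
        ctx = self.resbigru(feats)           # (batch, time/4, 2*hidden)
        return self.head(ctx.mean(dim=1))    # average over time, then classify


if __name__ == "__main__":
    # e.g. a batch of windows with 8 EMG + 6 IMU channels (assumed layout)
    dummy = torch.randn(4, 14, 100)
    logits = CNNResBiGRU()(dummy)
    print(logits.shape)  # torch.Size([4, 30]) -> one score per gym activity class
```

The linear projection on the skip path is one common way to make the residual addition dimension-compatible with the doubled width of a bidirectional GRU; the paper may realize the residual connection differently.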