{"title":"Precise Identification of Rehabilitation Actions using AI based Strategy","authors":"Mingjuan Lei, Peng Liu, Qingshan Wang, Qi Wang","doi":"10.1109/ICCCN49398.2020.9209617","DOIUrl":null,"url":null,"abstract":"With the development of microelectronics and sensor technologies, there are more and more researchers applying them to human action recognition, most of which are professional motion and rely on specific-designed sensors and wearable de-vices. Meanwhile, the need of rehabilitation training is increasing due to occupational diseases, bad life-style and incorrect exercise habit. However, it is costly and inconvenient to train in clinics and hospitals. To buy or borrow a set of medical training equipment is also unpractical. In this paper, we propose to use smart phones, which have larger computing power and are equipped with richer sensors ever than before, to run artificial intelligence based models and algorithms for identification of rehabilitation actions. Beyond doubt, it will be more convenient to use smart phones instead of professional equipments. Nevertheless, there are still some challenges which prevent it from being put into practice, such as phone deployment, data collection, and model training. We initially conceptualize and implement a smart phone-based accuracy judgment system for rehabilitation action. According to the characteristics of the system, e.g., sensor difference, position variation, and computing power limitation, a supervised and data-sharing learning algorithm is proposed, the operation framework, loss function and regular expression function are carefully selected The experiment on a prototype of the system verifies that the proposed method precisely identifies the rehabilitation actions of testees.","PeriodicalId":137835,"journal":{"name":"2020 29th International Conference on Computer Communications and Networks (ICCCN)","volume":"35 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 29th International Conference on Computer Communications and Networks (ICCCN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCCN49398.2020.9209617","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
With the development of microelectronics and sensor technologies, more and more researchers are applying them to human action recognition; most existing work targets professional motion and relies on purpose-built sensors and wearable devices. Meanwhile, the need for rehabilitation training is increasing due to occupational diseases, unhealthy lifestyles, and incorrect exercise habits. However, training in clinics and hospitals is costly and inconvenient, and buying or borrowing a set of medical training equipment is also impractical. In this paper, we propose to use smartphones, which have greater computing power and richer sensors than ever before, to run artificial-intelligence-based models and algorithms for the identification of rehabilitation actions. Using smartphones instead of professional equipment is undoubtedly more convenient. Nevertheless, several challenges still prevent this approach from being put into practice, such as phone placement, data collection, and model training. We conceptualize and implement a smartphone-based accuracy-judgment system for rehabilitation actions. According to the characteristics of the system, e.g., sensor differences, position variation, and limited computing power, we propose a supervised, data-sharing learning algorithm and carefully select the operation framework, loss function, and regularization function. Experiments on a prototype of the system verify that the proposed method precisely identifies the rehabilitation actions of test subjects.
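The abstract does not disclose the concrete model architecture, loss function, or regularizer, so the following is only a minimal sketch of what a supervised smartphone-sensor action classifier with an explicitly chosen loss and regularization term might look like. The 6-channel accelerometer-plus-gyroscope input window, the small 1-D CNN, the cross-entropy loss, and the L2 weight decay are all illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: architecture, window size, class count, and
# hyperparameters are assumptions for demonstration, not the paper's method.
import torch
import torch.nn as nn

NUM_CLASSES = 5   # hypothetical number of rehabilitation actions
WINDOW = 128      # hypothetical samples per sensor window (~2.5 s at 50 Hz)
CHANNELS = 6      # 3-axis accelerometer + 3-axis gyroscope

class ActionClassifier(nn.Module):
    """Small 1-D CNN over raw smartphone inertial-sensor windows."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, NUM_CLASSES)

    def forward(self, x):  # x: (batch, CHANNELS, WINDOW)
        return self.classifier(self.features(x).squeeze(-1))

model = ActionClassifier()
criterion = nn.CrossEntropyLoss()                          # supervised loss
optimizer = torch.optim.Adam(model.parameters(),
                             lr=1e-3, weight_decay=1e-4)   # L2 regularization

# Synthetic stand-in for labelled sensor windows collected from the phone.
x = torch.randn(32, CHANNELS, WINDOW)
y = torch.randint(0, NUM_CLASSES, (32,))

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
print(f"one training step, loss = {loss.item():.4f}")
```

In such a setup, the weight-decay term plays the role of the regularization function mentioned in the abstract, penalizing large weights so the model generalizes across sensor and position differences between phones; the actual choices in the paper may differ.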