{"title":"An Ensemble Model using Face and Pose Tracking for Engagement Detection in Game-based Rehabilitation","authors":"Xujie Lin, Siqi Cai, P. Chan, Longhan Xie","doi":"10.1145/3590003.3590085","DOIUrl":null,"url":null,"abstract":"Highly engaging rehabilitation promotes functional reorganization of the brain in stroke patients. Engagement detection in game-based rehabilitation can help rehabilitation practitioners get real-time feedback, and then provide patients with appropriate training programs. Previous research on engagement detection has focused on wearable devices, and the complicated laboratory setup makes them unsuitable for use in clinics and homes. In this work, we propose a method to automatically extract facial and posture features from camera-captured videos. Then we design an automatic engagement detection model using the facial and posture features as the input. In the dataset of engagement in virtual game rehabilitation scenarios, our model detects engagement levels with an average accuracy of 96.85%, achieving remarkable performance. This study sheds new light on engagement detection for stroke patients in clinical applications.","PeriodicalId":340225,"journal":{"name":"Proceedings of the 2023 2nd Asia Conference on Algorithms, Computing and Machine Learning","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 2nd Asia Conference on Algorithms, Computing and Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3590003.3590085","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Highly engaging rehabilitation promotes functional reorganization of the brain in stroke patients. Engagement detection in game-based rehabilitation gives rehabilitation practitioners real-time feedback, allowing them to provide patients with appropriate training programs. Previous research on engagement detection has relied on wearable devices, whose complicated laboratory setup makes them unsuitable for use in clinics and homes. In this work, we propose a method that automatically extracts facial and posture features from camera-captured videos, and we design an automatic engagement detection model that takes these facial and posture features as input. On a dataset of engagement in virtual game rehabilitation scenarios, our model detects engagement levels with an average accuracy of 96.85%, a remarkable performance. This study sheds new light on engagement detection for stroke patients in clinical applications.
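To make the described pipeline concrete, the sketch below shows one way to extract facial and posture features from camera-captured video and feed them to an ensemble classifier. This is a minimal illustration under assumptions: MediaPipe is used for face and pose landmark extraction, a scikit-learn soft-voting ensemble stands in for the classifier, and the feature definitions, video path, labels, and ensemble members are hypothetical placeholders, not the paper's actual design.

```python
# Minimal sketch: per-frame face + pose features from video, classified by an ensemble.
# Assumptions: MediaPipe for landmarks, scikit-learn VotingClassifier as the model.
import cv2
import numpy as np
import mediapipe as mp
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.svm import SVC


def extract_features(video_path, max_frames=300):
    """Return one feature vector per frame: flattened face + pose landmark coordinates."""
    face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False, max_num_faces=1)
    pose = mp.solutions.pose.Pose(static_image_mode=False)
    cap = cv2.VideoCapture(video_path)
    features = []
    while len(features) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        face_res = face_mesh.process(rgb)
        pose_res = pose.process(rgb)
        if not face_res.multi_face_landmarks or not pose_res.pose_landmarks:
            continue  # skip frames where either detector fails
        face = np.array([[p.x, p.y, p.z] for p in face_res.multi_face_landmarks[0].landmark]).ravel()
        body = np.array([[p.x, p.y, p.z] for p in pose_res.pose_landmarks.landmark]).ravel()
        features.append(np.concatenate([face, body]))
    cap.release()
    return np.array(features)


def build_ensemble():
    """Soft-voting ensemble of two standard classifiers (placeholder member choice)."""
    return VotingClassifier(
        estimators=[
            ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
            ("svm", SVC(probability=True, random_state=0)),
        ],
        voting="soft",
    )


if __name__ == "__main__":
    # Hypothetical usage: X holds per-frame features from one session video,
    # y holds engagement labels (e.g., 0 = low, 1 = medium, 2 = high).
    X = extract_features("rehab_session.mp4")      # placeholder video path
    y = np.random.randint(0, 3, size=len(X))       # placeholder labels
    clf = build_ensemble().fit(X, y)
    print("Predicted engagement levels:", clf.predict(X[:5]))
```

In practice, per-frame landmarks would typically be aggregated over time windows (e.g., statistics of head pose, gaze, and joint motion) before classification; the raw-coordinate features above are only a stand-in to show the data flow from video to ensemble prediction.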