G. Stavropoulos, Dimitrios Giakoumis, K. Moustakas, D. Tzovaras
Title: Automatic action recognition for assistive robots to support MCI patients at home
DOI: 10.1145/3056540.3076185
Published: 2017-06-21, Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments
Citations: 16
Abstract
This paper presents a novel approach to automatic human action recognition, focusing on the user behaviour monitoring needs of assistive robots that aim to support Mild Cognitive Impairment (MCI) patients at home. Our action recognition method uses the human's skeleton joint information, extracted from a low-cost depth sensor mounted on a service robot. Herein, we extend the state-of-the-art EigenJoints descriptor to improve recognition robustness for a series of actions involved in common daily activities. Specifically, we introduce novel features that capture action specificities such as the joints' travelled distance and their evolution trend in frames subsequent to the reference one. In addition, we use information about the objects the user manipulates, taking into account that several actions may be similar yet performed with different objects, as well as the fact that real, practical applications involve continuous input video streams rather than pre-segmented action sequences. Through experimental evaluation on the MSR Action3D dataset, our approach has been found to outperform the state of the art in action recognition performance. Evaluation has also been performed on a custom dataset, providing further promising results for future practical applications of our overall action recognition framework.
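To make the descriptor concrete: the baseline EigenJoints representation (Yang & Tian) concatenates pairwise joint-position differences within the current frame, between the current and preceding frame, and between the current and a reference frame. The sketch below illustrates those three feature groups, plus a per-joint travelled-distance feature in the spirit of the extension the abstract describes. This is a minimal illustration assuming skeleton input as a `(T, J, 3)` array of 3D joint positions; the function names and the exact feature layout are hypothetical, not the authors' implementation.

```python
import numpy as np

def eigenjoints_features(frames, ref_idx=0):
    """EigenJoints-style features (illustrative sketch).

    frames: (T, J, 3) array of 3D joint positions over T frames.
    Per frame t >= 1, concatenates:
      fcc - pairwise joint differences within frame t,
      fcp - differences between frame t and frame t-1,
      fci - differences between frame t and the reference frame.
    Returns a (T-1, D) feature matrix (PCA to "eigen" space omitted).
    """
    T, J, _ = frames.shape
    iu = np.triu_indices(J, k=1)  # all unique joint pairs
    feats = []
    for t in range(1, T):
        cur, prev, ref = frames[t], frames[t - 1], frames[ref_idx]
        fcc = (cur[iu[0]] - cur[iu[1]]).ravel()  # intra-frame pose
        fcp = (cur - prev).ravel()               # short-term motion
        fci = (cur - ref).ravel()                # offset from reference
        feats.append(np.concatenate([fcc, fcp, fci]))
    return np.asarray(feats)

def travelled_distance(frames):
    """Hypothetical extra feature: cumulative path length of each joint,
    i.e. the sum of per-frame displacement magnitudes. Returns shape (J,)."""
    steps = np.linalg.norm(np.diff(frames, axis=0), axis=2)  # (T-1, J)
    return steps.sum(axis=0)
```

For a 20-joint skeleton this yields 190 joint pairs, so each frame's feature vector has 190*3 + 2*20*3 = 690 dimensions before any dimensionality reduction; the travelled-distance vector adds one scalar per joint summarising how far it moved over the sequence.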