{"title":"Dense 3D Optical Flow Co-occurrence Matrices for Human Activity Recognition","authors":"Rawya Al-Akam, D. Paulus","doi":"10.1145/3266157.3266220","DOIUrl":null,"url":null,"abstract":"In this paper, a new activity recognition technique is introduced based on the gray level co-occurrence matrices (GLCM) from a 3D dense optical flow of the input RGB and Depth videos. These matrices are one of the earliest techniques used for image texture analysis which are representing the distribution of the intensities and information about relative positions of neighboring pixels of an image. In this work, we propose a new method to extract feature vector values using the well-known Haralick features from GLCM matrices to describe the flow pattern by measuring meaningful properties such as energy, contrast, homogeneity, entropy, correlation and sum average to capture local spatial and temporal characteristics of the motion through the neighboring optical flow orientation and magnitude. To evaluate the proposed method and improve the activity recognition problem, we apply a recognition pipeline that involves the bag of local spatial and temporal features and three types of machine learning classifiers are used for comparing the recognition accuracy rate of our method. These classifiers are random forest, support vector machine and K-nearest neighbor. The experimental results carried on two well-known datasets (Gaming datasets (G3D) and Cornell Activity Datasets (CAD-60)), which demonstrate that our method outperforms the results achieved by several widely employed spatial and temporal feature descriptors methods.","PeriodicalId":151070,"journal":{"name":"Proceedings of the 5th International Workshop on Sensor-based Activity Recognition and Interaction","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2018-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 5th International Workshop on Sensor-based Activity Recognition and Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3266157.3266220","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6
Abstract
In this paper, a new activity recognition technique is introduced based on gray-level co-occurrence matrices (GLCM) computed from the 3D dense optical flow of the input RGB and depth videos. GLCMs are among the earliest techniques used for image texture analysis; they represent the distribution of intensities and encode information about the relative positions of neighboring pixels in an image. In this work, we propose a new method that extracts feature vectors using the well-known Haralick features of the GLCM to describe the flow pattern: meaningful properties such as energy, contrast, homogeneity, entropy, correlation, and sum average capture local spatial and temporal characteristics of the motion through neighboring optical flow orientations and magnitudes. To evaluate the proposed method on the activity recognition problem, we apply a recognition pipeline built on a bag of local spatial and temporal features, and three machine learning classifiers (random forest, support vector machine, and k-nearest neighbor) are used to compare the recognition accuracy of our method. Experimental results on two well-known datasets, the Gaming dataset (G3D) and the Cornell Activity Dataset (CAD-60), demonstrate that our method outperforms several widely used spatial and temporal feature descriptor methods.
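As a rough illustration of the feature-extraction step described in the abstract, the sketch below computes dense optical flow between two grayscale frames, quantizes the flow magnitude and orientation, builds a GLCM for each channel, and derives Haralick-style properties (energy, contrast, homogeneity, and correlation from scikit-image, plus entropy and sum average computed directly from the GLCM). This is a minimal sketch under stated assumptions, not the authors' implementation: it assumes OpenCV and scikit-image >= 0.19 (graycomatrix/graycoprops; older releases spell these greycomatrix/greycoprops), and the quantization level count, GLCM distances, and angles are illustrative choices rather than values reported in the paper.

```python
# Minimal sketch: dense optical flow -> quantized magnitude/orientation ->
# GLCM -> Haralick-style descriptors. Not the authors' implementation.
import cv2
import numpy as np
from skimage.feature import graycomatrix, graycoprops

LEVELS = 16  # hypothetical number of quantization bins, not from the paper


def flow_glcm_features(prev_gray, next_gray):
    """Return a Haralick-style feature vector for one pair of grayscale frames."""
    # Farneback dense optical flow between two consecutive frames.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])  # magnitude, angle in [0, 2*pi]

    feats = []
    for channel, max_val in ((mag, mag.max() + 1e-6), (ang, 2 * np.pi)):
        # Quantize the flow channel to LEVELS gray levels for the co-occurrence matrix.
        quant = np.clip(channel / max_val * LEVELS, 0, LEVELS - 1).astype(np.uint8)
        glcm = graycomatrix(quant, distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=LEVELS, symmetric=True, normed=True)

        # Properties provided directly by scikit-image.
        for prop in ("energy", "contrast", "homogeneity", "correlation"):
            feats.extend(graycoprops(glcm, prop).ravel())

        # Entropy and sum average are not in graycoprops, so derive them from the GLCM.
        p = glcm.astype(np.float64)  # shape: (levels, levels, n_dist, n_ang)
        entropy = -np.sum(p * np.log2(p + 1e-12), axis=(0, 1))
        feats.extend(entropy.ravel())
        i = np.arange(LEVELS)
        # Sum average as the (i + j)-weighted mean of the normalized GLCM.
        sum_avg = np.array([[np.sum((i[:, None] + i[None, :]) * p[:, :, d, a])
                             for a in range(p.shape[3])] for d in range(p.shape[2])])
        feats.extend(sum_avg.ravel())
    return np.asarray(feats)
```

The per-frame-pair descriptors produced this way could then be aggregated with a bag-of-features encoding and passed to random forest, support vector machine, or k-nearest neighbor classifiers (for example, the implementations in scikit-learn), mirroring the evaluation pipeline the abstract describes.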