{"title":"Interactive body part contrast mining for human interaction recognition","authors":"Yanli Ji, Guo Ye, Hong Cheng","doi":"10.1109/ICMEW.2014.6890714","DOIUrl":null,"url":null,"abstract":"The recognition of multi-person interactions still remains a challenge because of the mutual occlusion and redundant poses. We propose an interactive body part contrast mining method based on joints for human interaction recognition. To efficiently describe interactions, we propose an interactive body part model which connects the interactive limbs of different participants to represent the relationship of interactive body parts. Then we calculate the spatial-temporal joint features for 8 interactive limb pairs in a short frame set for motion description (poselets). Employing contrast mining, we determine the essential interactive pairs and poselets for each interaction class to delete the redundant action information, and use these poselets to generate a poselet dictionary for interaction representation following bag-of-words. SVM with RBF kernel is adopted for recognition. We evaluate the proposed algorithm on two databases, the SBU interaction database and a newly collected RGBD-skeleton interaction database. Experiment results indicate the effectiveness of the proposed algorithm. The recognition accuracy reaches 85.4% on our interaction database, and 86.8% on SBU interaction database, 6% higher than the method in [1].","PeriodicalId":178700,"journal":{"name":"2014 IEEE International Conference on Multimedia and Expo Workshops (ICMEW)","volume":"86 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-07-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"125","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 IEEE International Conference on Multimedia and Expo Workshops (ICMEW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMEW.2014.6890714","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 125
Abstract
The recognition of multi-person interactions remains challenging because of mutual occlusion and redundant poses. We propose an interactive body part contrast mining method based on skeleton joints for human interaction recognition. To describe interactions efficiently, we propose an interactive body part model that connects the interacting limbs of different participants to represent the relationship between interactive body parts. We then compute spatio-temporal joint features for 8 interactive limb pairs over a short frame set to describe motion (poselets). Using contrast mining, we determine the essential interactive pairs and poselets for each interaction class, discarding redundant action information, and use these poselets to build a poselet dictionary for interaction representation following the bag-of-words paradigm. An SVM with an RBF kernel is adopted for recognition. We evaluate the proposed algorithm on two databases, the SBU interaction database and a newly collected RGBD-skeleton interaction database. Experimental results demonstrate the effectiveness of the proposed algorithm: recognition accuracy reaches 85.4% on our interaction database and 86.8% on the SBU interaction database, 6% higher than the method in [1].
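To make the final recognition stage concrete, below is a minimal sketch of the bag-of-poselets representation followed by an SVM with an RBF kernel, as described in the abstract. It is not the authors' implementation: the dictionary size, feature dimension, toy data, and the use of scikit-learn/KMeans for codebook construction are all illustrative assumptions.

```python
# Sketch (assumptions, not the paper's code): quantize per-clip poselet
# features against a learned codebook, form a bag-of-words histogram,
# and classify interactions with an RBF-kernel SVM.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

DICT_SIZE = 64   # assumed poselet-dictionary size
FEAT_DIM = 48    # assumed dimension of one spatio-temporal poselet feature

def bag_of_poselets(poselet_feats, codebook):
    """Quantize a clip's poselet features and return a normalized histogram."""
    words = codebook.predict(poselet_feats)
    hist, _ = np.histogram(words, bins=np.arange(DICT_SIZE + 1))
    return hist / max(hist.sum(), 1)

# Toy data standing in for mined poselet features of training clips.
train_clips = [rng.normal(size=(rng.integers(20, 40), FEAT_DIM)) for _ in range(30)]
train_labels = rng.integers(0, 8, size=30)   # e.g. 8 interaction classes

# Build the poselet dictionary by clustering all training poselet features.
codebook = KMeans(n_clusters=DICT_SIZE, n_init=10, random_state=0)
codebook.fit(np.vstack(train_clips))

X_train = np.stack([bag_of_poselets(c, codebook) for c in train_clips])

# SVM with RBF kernel, as stated in the abstract (hyperparameters assumed).
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
clf.fit(X_train, train_labels)

# Classify a new (toy) clip.
test_clip = rng.normal(size=(25, FEAT_DIM))
print("predicted interaction class:", clf.predict([bag_of_poselets(test_clip, codebook)])[0])
```

In the paper's pipeline, the histogram would be built only over the essential poselets selected by contrast mining for each class, rather than over all quantized features as in this simplified sketch.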