Real-time 3D hand gesture interaction with a robot for understanding directions from humans
M. Bergh, Daniel Carton, R. D. Nijs, N. Mitsou, Christian Landsiedel, K. Kühnlenz, D. Wollherr, L. Gool, M. Buss
2011 RO-MAN, published 2011-08-30. DOI: 10.1109/ROMAN.2011.6005195
Citations: 205
Abstract
This paper implements a real-time hand gesture recognition algorithm based on the inexpensive Kinect sensor. The use of a depth sensor allows for complex 3D gestures and makes the system robust to distracting objects or persons in the background. A Haarlet-based hand gesture recognition system is implemented to detect hand gestures in any orientation, and in particular pointing gestures, for which the 3D pointing direction is extracted. The system is integrated on an interactive robot (based on ROS), allowing for real-time hand gesture interaction with the robot. Pointing gestures are translated into goals for the robot, telling it where to go. A demo scenario is presented in which the robot looks for persons to interact with, asks for directions, and then detects a 3D pointing direction. The robot then explores its vicinity in the given direction and looks for a new person to interact with.
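To illustrate how a detected 3D pointing direction might be turned into a navigation goal on a ROS-based robot, the sketch below intersects the pointing ray with the ground plane and sends the resulting point to the standard move_base action. This is a minimal assumption-laden sketch, not the authors' implementation: the ground-plane intersection, the "map" frame, and the use of move_base are all choices made here for illustration, as the paper does not describe its ROS interface.

```python
#!/usr/bin/env python
# Hypothetical sketch: convert a 3D pointing ray (hand origin + direction)
# into a navigation goal by intersecting the ray with the ground plane z = 0.
# Frame name, action name, and the example ray are assumptions.

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal


def pointing_ray_to_goal(origin, direction, frame_id="map"):
    """Intersect the ray origin + t * direction with the plane z = 0
    and build a MoveBaseGoal at the intersection point."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz >= 0.0:
        return None  # ray points upward or parallel to the floor: no intersection
    t = -oz / dz
    gx, gy = ox + t * dx, oy + t * dy

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = frame_id
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = gx
    goal.target_pose.pose.position.y = gy
    goal.target_pose.pose.orientation.w = 1.0  # heading left unconstrained
    return goal


if __name__ == "__main__":
    rospy.init_node("pointing_to_goal")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    # Example ray: hand at 1.2 m height, pointing forward and slightly down.
    goal = pointing_ray_to_goal(origin=(0.0, 0.0, 1.2),
                                direction=(1.0, 0.2, -0.4))
    if goal is not None:
        client.send_goal(goal)
        client.wait_for_result()
```

In practice the hand origin and direction would come from the gesture recognizer (e.g. expressed in the robot's base or map frame via tf), and the goal could be capped to a maximum exploration distance rather than placed exactly at the ray-ground intersection.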