3-D real-time gesture recognition using proximity spaces
E. Huber
DOI: 10.1109/ACV.1996.572020
Published in: Proceedings Third IEEE Workshop on Applications of Computer Vision (WACV '96), 2 December 1996
Citations: 20
Abstract
A 3-D segment-tracking approach to recognizing human pose and gestures is presented. The author has previously developed and refined a stereo-based method, called the proximity space method, for acquiring and maintaining the track of object surfaces in 3-space. This method uses LoG-filtered images and relies solely on stereo measurement to spatially distinguish between objects in 3-D. The objective of the work is to obtain useful state information about the shape, size, and pose of natural (unadorned) objects in their naturally cluttered environments; thus, the system neither requires nor benefits from special markers, colors, or other tailored artifacts. The method has recently been extended to track multiple regions and segments of complex objects. The paper describes techniques for applying the proximity space method to a particularly interesting system: the human. Specifically, the author discusses the use of simple models to constrain proximity-space behavior so that gestures can be tracked as a person moves through a cluttered environment. It is demonstrated that, by observing the behavior of the model used to track the human's pose through time, different gestures can be easily recognized. The approach is illustrated through a discussion of gestures used to provide logical and spatial commands to a mobile robot.
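As background to the LoG (Laplacian-of-Gaussian) filtering step the abstract mentions, the sketch below shows how such a filter responds to image structure: intensity edges produce strong responses (with zero-crossings), while flat regions produce near-zero output, which is what makes the filtered images useful for stereo matching. The synthetic frame and the `sigma` value are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Synthetic stand-in for a camera frame: a bright square on a dark
# background (sizes and sigma are arbitrary illustrative choices).
frame = np.zeros((64, 64), dtype=float)
frame[24:40, 24:40] = 1.0

# Laplacian-of-Gaussian filtering: smooth with a Gaussian of scale sigma,
# then apply the Laplacian. The response is large in magnitude near
# intensity edges and near zero in uniform regions.
log_response = gaussian_laplace(frame, sigma=2.0)

# Compare the response at an edge pixel of the square with the response
# deep in its uniform interior.
edge_strength = abs(log_response[32, 24])      # on the square's boundary
interior_strength = abs(log_response[32, 32])  # well inside the square
print(log_response.shape)
print(edge_strength > interior_strength)
```

In a stereo pipeline, matching is then done on these filtered images rather than on raw intensities, so that correspondence is driven by edge structure instead of absolute brightness.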