Towards Perceptual Interface for Visualization Navigation of Large Data Sets
M. Shin, L. Tsap, Dmitry Goldgof
2003 Conference on Computer Vision and Pattern Recognition Workshop, published 2003-03-20
DOI: 10.1109/CVPRW.2003.10045
Citations: 4
Abstract
This paper presents a perceptual interface for visualization navigation based on gesture recognition. Scientists are interested in interactive settings for exploring large data sets in an intuitive environment. The input consists of registered 3-D data, and Bézier curves are used for trajectory analysis and classification of gestures. The method is robust and reliable: the correct hand identification rate is 99.9% (over 1641 frames), the mode of hand movement is identified correctly 95.6% of the time, and the recognition rate (given the correct mode) is 97.9%. An application to gesture-controlled visualization is also presented. The paper advances the state of the art in human-computer interaction with robust, attachment- and marker-free gestural information processing for visualization.
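The abstract does not detail how the Bézier-curve trajectory analysis is performed. As an illustration only, the sketch below shows one common way such an analysis could be set up: evaluating a cubic Bézier curve and least-squares fitting one to an ordered 3-D hand trajectory using chord-length parameterization. All function names are hypothetical and not taken from the paper.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]
    using the Bernstein polynomial form."""
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    u = 1.0 - t
    return (u**3 * p0 + 3 * u**2 * t * p1
            + 3 * u * t**2 * p2 + t**3 * p3)

def fit_cubic_bezier(points):
    """Least-squares fit of a cubic Bezier curve to an ordered 3-D
    trajectory (e.g., tracked hand positions), with chord-length
    parameterization. Returns the 4 control points, shape (4, 3)."""
    pts = np.asarray(points, dtype=float)
    # Parameter value for each sample, proportional to arc length so far.
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(seg)]) / seg.sum()
    u = 1.0 - t
    # Bernstein basis matrix: one row per sample, one column per control point.
    B = np.stack([u**3, 3 * u**2 * t, 3 * u * t**2, t**3], axis=1)
    ctrl, *_ = np.linalg.lstsq(B, pts, rcond=None)
    return ctrl
```

The fitted control points give a compact, smooth descriptor of the hand trajectory, which a classifier could then compare against gesture templates; the paper's actual classification procedure may differ.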