Multi-scale gestural interaction for augmented reality
Authors: Barrett Ens, A. Quigley, H. Yeo, Pourang Irani, M. Billinghurst
Published in: SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, 27 November 2017
DOI: 10.1145/3132787.3132808 (https://doi.org/10.1145/3132787.3132808)
Citations: 1
Abstract
We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect both arm and hand motions (macro-scale) and finger gestures (micro-scale). Micro-gestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with micro-gestures for precise interaction, beyond the capabilities of direct manipulation alone.
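To make the combination described above concrete, the sketch below illustrates one way coarse macro-scale placement and accumulated micro-gesture offsets might be fused into a single object position. It is a minimal illustration only: the class names, sensor sample formats, and the `micro_gain` value are assumptions for this sketch, not the authors' implementation or sensor API.

```python
# Illustrative sketch (hypothetical names and values): coarse placement from
# macro-scale arm/hand tracking, refined by micro-scale finger gestures from a
# second, belt-worn sensor, as the abstract describes at a high level.

from dataclasses import dataclass


@dataclass
class MacroSample:
    """Hypothetical hand-position sample from the macro-scale tracker (metres)."""
    x: float
    y: float
    z: float


@dataclass
class MicroSample:
    """Hypothetical micro-gesture sample, e.g. a small thumb slide (arbitrary units)."""
    thumb_delta: float


class MultiScaleManipulator:
    """Combines coarse direct manipulation with fine micro-gesture offsets."""

    def __init__(self, micro_gain: float = 0.005):
        self.micro_gain = micro_gain      # assumed scale: micro units -> metres
        self.coarse = (0.0, 0.0, 0.0)     # object position set by grasping/moving
        self.fine_offset = 0.0            # accumulated precise adjustment

    def on_macro(self, s: MacroSample) -> None:
        # Macro-scale input: grab-and-move sets the object's coarse position.
        self.coarse = (s.x, s.y, s.z)

    def on_micro(self, s: MicroSample) -> None:
        # Micro-scale input: accumulate small, precise adjustments along one axis.
        self.fine_offset += s.thumb_delta * self.micro_gain

    def object_position(self):
        x, y, z = self.coarse
        return (x + self.fine_offset, y, z)


if __name__ == "__main__":
    m = MultiScaleManipulator()
    m.on_macro(MacroSample(0.40, 1.10, -0.30))    # rough placement by grasping
    for _ in range(10):
        m.on_micro(MicroSample(thumb_delta=1.0))  # ten small thumb slides
    print(m.object_position())                    # approx (0.45, 1.1, -0.3)
```

The point of the split is that the macro channel covers large, fast movements while the micro channel contributes small increments (scaled down by a gain), so precise adjustment is possible with the hand in a relaxed posture.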