A multi-touch enabled human-transporter metaphor for virtual 3D traveling
Dimitar Valkov, Frank Steinicke, G. Bruder, K. Hinrichs
2010 IEEE Symposium on 3D User Interfaces (3DUI), 2010-03-20
DOI: 10.1109/3DUI.2010.5444715
Citations: 14
Abstract
In this tech-note we demonstrate how multi-touch hand gestures, combined with foot gestures, can be used to perform navigation tasks in interactive 3D environments. Geographic Information Systems (GIS) are well suited as a complex testbed for evaluating user interfaces based on multi-modal input. Recent developments in the area of interactive surfaces enable the construction of low-cost multi-touch displays, and relatively inexpensive sensor technology can detect foot gestures, which makes it possible to explore these input modalities for virtual reality environments. In this tech-note, we describe an intuitive 3D user interface metaphor and corresponding hardware, which combine multi-touch hand and foot gestures for interaction with spatial data.
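To illustrate the general idea of a "human-transporter" style travel technique, the following is a minimal sketch, not the authors' implementation: it assumes a hypothetical `update_camera` loop in which a horizontal touch drag steers the view (like handlebars) and a normalized foot-lean value sets forward speed (like leaning on a Segway). The parameter names (`turn_gain`, `foot_lean`, `max_speed`) and their values are illustrative assumptions, not taken from the paper.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch (not the paper's implementation): fusing two input
# modalities for virtual travel. Touch drag steers the heading; a foot
# lean/pressure value in [-1, 1] controls forward or backward speed.

@dataclass
class CameraState:
    x: float = 0.0
    y: float = 0.0
    heading: float = 0.0  # radians, 0 = facing +y


def update_camera(cam: CameraState,
                  touch_dx: float,          # horizontal touch drag (pixels/frame)
                  foot_lean: float,         # normalized foot input in [-1, 1] (assumed)
                  dt: float,
                  turn_gain: float = 0.005,  # radians per pixel (assumed gain)
                  max_speed: float = 2.0     # scene units per second (assumed)
                  ) -> CameraState:
    """Advance the virtual 'human transporter' by one frame."""
    # Horizontal touch drag rotates the view, like steering handlebars.
    cam.heading += turn_gain * touch_dx

    # Foot lean sets the travel speed along the current heading.
    speed = max_speed * max(-1.0, min(1.0, foot_lean))
    cam.x += speed * dt * math.sin(cam.heading)
    cam.y += speed * dt * math.cos(cam.heading)
    return cam


if __name__ == "__main__":
    cam = CameraState()
    # Simulate one second at 60 Hz: slight rightward drag, leaning forward.
    for _ in range(60):
        cam = update_camera(cam, touch_dx=2.0, foot_lean=0.5, dt=1 / 60)
    print(f"pos=({cam.x:.2f}, {cam.y:.2f}), "
          f"heading={math.degrees(cam.heading):.1f} deg")
```

The separation of concerns shown here, hands for steering and view control, feet for locomotion speed, mirrors the multi-modal split described in the abstract, but the specific mapping and constants are only an assumed illustration.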