Manuel Meier, Paul Streli, A. Fender, Christian Holz
Demonstrating the Use of Rapid Touch Interaction in Virtual Reality for Prolonged Interaction in Productivity Scenarios
2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), March 2021
DOI: 10.1109/VRW52623.2021.00263
Citations: 2
Abstract
Current camera-based VR headsets support free-hand mid-air interaction or physical hand-held controllers for input, which can lead to fatigue during use, as users lack support for their arms and hands between interactions. In our demonstration, we showcase a novel approach to bring quick touch interaction to Virtual Reality, illustrating the beneficial use of rapid tapping, typing, and surface gestures for ongoing interaction in Virtual Reality, particularly in the context of content creation and productivity scenarios. The productivity scenarios that become possible using our approach are therefore reminiscent of apps that exist on today's phones and tablets. To reliably make touch interaction work in VR, we use a wrist-worn prototype to complement the optical hand tracking from VR headsets with inertial sensing to detect touch events on surfaces. Our prototype band TapID integrates a pair of inertial sensors in a flexible strap, from whose signals TapID reliably detects surface touch events and identifies the finger used for touch. This event detection is then fused with the optically tracked hand poses to trigger input in VR. Our demonstration comprises a series of VR applications, including UI control in word processors, web browsers, and document editors.
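The fusion described above pairs two signals: the wrist-worn band's inertial data supplies the precise moment of a surface touch, while the headset's optical tracking supplies which fingertip was where. The sketch below illustrates this idea in Python under simplifying assumptions; the function names, the acceleration threshold, and the nearest-to-surface heuristic are hypothetical stand-ins and do not reflect TapID's actual detection or finger-identification pipeline.

```python
import numpy as np

# Hypothetical threshold: a tap shows up as a sharp spike in acceleration
# magnitude at the wrist (units of g); the real system uses learned models.
ACCEL_THRESHOLD = 3.0


def detect_tap(accel_samples, threshold=ACCEL_THRESHOLD):
    """Return the index of the first sample whose acceleration magnitude
    exceeds the threshold, or None if no spike occurs.

    accel_samples: (N, 3) array of wrist accelerometer readings.
    """
    mags = np.linalg.norm(accel_samples, axis=1)
    peaks = np.nonzero(mags > threshold)[0]
    return int(peaks[0]) if peaks.size else None


def fuse_with_hand_pose(tap_idx, fingertip_positions, surface_z=0.0, max_dist=0.01):
    """Attribute a detected inertial tap to the optically tracked fingertip
    closest to the surface plane, yielding a 2-D touch event.

    fingertip_positions: dict mapping finger name -> (x, y, z) in metres,
    as reported by the headset's hand tracker at the tap instant.
    """
    if tap_idx is None:
        return None
    finger, pos = min(fingertip_positions.items(),
                      key=lambda kv: abs(kv[1][2] - surface_z))
    if abs(pos[2] - surface_z) > max_dist:
        return None  # no fingertip near the surface: reject the inertial event
    return (finger, pos[:2])  # finger identity plus 2-D touch location
```

A tap spike at sample 2 combined with an index fingertip 2 mm above the surface would yield the event `("index", (0.10, 0.05))`, which a VR application could then route as a touch at that surface position.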