{"title":"KHAKI: A Hemispherical, Multi-Function Input Device for 3D operation","authors":"Takumi Ohshima, T. Serizawa, Y. Yanagida","doi":"10.1109/3DUI.2010.5444697","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444697","url":null,"abstract":"We have developed KHAKI, a three-dimensional, hemispherical, multi-function input device. KHAKI can measure the position and orientation of the entire hand, together with the position, orientation, pressure, and shear force of each finger. KHAKI provides seamless switching among character input, pointing, and force input.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130818187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Piivert: Percussion-based interaction for immersive virtual environments","authors":"Florent Berthaut, M. Hachet, M. Desainte-Catherine","doi":"10.1109/3DUI.2010.5444728","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444728","url":null,"abstract":"3D graphical interaction offers many possibilities for musical applications. However, it also carries several limitations that prevent it from being used as an efficient musical instrument. For example, input devices for 3D interaction or new gaming devices are usually based on 3- or 6-degrees-of-freedom tracking combined with push-buttons or joysticks. While buttons and joysticks do not provide good resolution for musical gestures, graphical interaction using tracking may bring enough expressivity but is weakened by accuracy and haptic-feedback problems. Moreover, interaction based solely on tracking limits the possibilities brought by graphical interfaces in terms of musical gestures. We propose a new approach that separates the input modalities according to traditional musical gestures. This allows us to combine the possibilities of graphical interaction for selection and modulation gestures with the accuracy and expressivity of musical interaction for excitation gestures. We implement this approach with a new input device, Piivert, which combines 6DOF tracking and pressure detection. We describe associated interaction techniques and show how this new device can be valuable for immersive musical applications.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122003332","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Passive hand pose recognition in virtual reality","authors":"Victor Zappi, A. Brogni, D. Caldwell","doi":"10.1109/3DUI.2010.5444698","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444698","url":null,"abstract":"In this poster we present a hand pose recognition interface based on passive tracking: currently, four poses can be detected on both hands using a small marker set attached to the fingers and the backs of the hands. This user interface is designed to support a natural way of interacting with objects in virtual reality, permitting users to quickly switch among selection, translation, and rotation commands. The interface described here is part of a multimodal platform where sound parameters can be changed without the use of musical instruments or controllers, but through free-hand mesh manipulation.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"85 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126244305","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"3-D facial expressions modulate perception of emotive voices","authors":"Kota Arai, M. Kitazaki","doi":"10.1109/3DUI.2010.5444704","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444704","url":null,"abstract":"Facial expressions and emotive voices are not processed independently, but interact implicitly. We investigated the effects of facial expressions presented by 2-dimensional (2D) and 3-dimensional (3D) faces on the perceived emotions of nonsense voices using psychophysical methods. We found that the perceived emotion of voices was facilitated by 3D faces, and was modulated in the direction of the presented facial expressions. These results suggest that 3D facial expressions may be more effective for communicating cross-modal emotions than 2D facial expressions.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"8 6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125001973","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating natural interaction techniques in video games","authors":"Ryan P. McMahan, Alexander Joel D. Alon, Shaimaa Y. Lazem, R. Beaton, D. Machaj, Michael Schaefer, Mara G. Silva, Anamary Leal, R. Hagan, D. Bowman","doi":"10.1109/3DUI.2010.5444727","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444727","url":null,"abstract":"Despite the gaming industry's recent trend for using “natural” interaction techniques, which mimic real world actions with a high level of fidelity, it is not clear how natural interaction techniques affect the player experience. In order to obtain a better understanding, we designed and conducted a study using Mario Kart Wii, a commercial racing game for the Nintendo Wii. We chose this platform due to its seemingly balanced design of both natural and non-natural interaction techniques. Our empirical study of these techniques found that the non-natural interaction techniques significantly outperform their more natural counterparts. We offer three hypotheses to explain our finding and suggest them as important interaction design considerations.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134084844","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Redirected touching: Warping space to remap passive haptics","authors":"Luv Kohli","doi":"10.1109/3DUI.2010.5444703","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444703","url":null,"abstract":"There is an increasing interest in deployable virtual military training systems. Haptic feedback for these training systems can enable users to interact more naturally with the training environment, but is difficult to deploy. Passive haptic feedback is very compelling, but it is also inflexible. Changes made to virtual objects can require time-consuming changes to their physical passive-haptic counterparts. This poster explores the possibility of mapping many differently shaped virtual objects onto one physical object by warping virtual space and exploiting the dominance of the visual system. A first implementation that maps different virtual objects onto dynamically captured physical geometry is presented, and potential applications to deployable military trainers are discussed.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"126 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133291139","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Comparison of multimodal interactions in perspective-corrected multi-display environment","authors":"Ryo Fukazawa, Kazuki Takashima, Garth B. D. Shoemaker, Y. Kitamura, Yuichi Itoh, F. Kishino","doi":"10.1109/3DUI.2010.5444711","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444711","url":null,"abstract":"This paper compares multi-modal interaction techniques in a perspective-corrected multi-display environment (MDE). The performance of multimodal interactions using gestures, eye gaze, and head direction are experimentally examined in an object manipulation task in MDEs and compared with a mouse operated perspective cursor. Experimental results showed that gesture-based multimodal interactions provide performance equivalent in task completion time to mouse-based perspective cursors. A technique utilizing user head direction received positive comments from subjects even though it was not as fast. Based on the experimental results and observations, we discuss the potential of multimodal interaction techniques in MDEs.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134053476","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An empirical evaluation of virtual hand techniques for 3D object manipulation in a tangible augmented reality environment","authors":"Taejin Ha, Woontack Woo","doi":"10.1109/3DUI.2010.5444713","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444713","url":null,"abstract":"In this paper, we present a Fitts' law-based formal evaluation process and the corresponding results for 3D object manipulation techniques based on a virtual hand metaphor in a tangible augmented reality (TAR) environment. Specifically, we extend the design parameters of the 1D Fitts' law to 3D and then refine an evaluation model in order to bring generality and ease of adaptation to various TAR applications. Next, we implement and compare standard TAR manipulation techniques using a cup, a paddle, a cube, and a proposed extended paddle prop. Most manipulation techniques were well modeled by linear regression according to Fitts' law, with a correlation coefficient of over 0.9. Notably, the ISO 9241-9 throughput of the extended paddle technique was around 1.39 to 2 times higher than that of the other techniques, due to the instant 3D positioning of the 3D objects. In the discussion, we subsequently examine the characteristics of the TAR manipulation techniques in terms of stability, speed, comfort, and understanding. As a result, our evaluation process, results, and analysis can be useful in guiding the design and implementation of future TAR interfaces.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"490 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124152675","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Immersive virtual studio for architectural exploration","authors":"G. Bruder, Frank Steinicke, Dimitar Valkov, K. Hinrichs","doi":"10.1109/3DUI.2010.5444705","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444705","url":null,"abstract":"Architects use a variety of analog and digital tools and media to plan and design constructions. Immersive virtual reality (VR) technologies have shown great potential for architectural design, especially for exploration and review of design proposals. In this work we propose a virtual studio system, which allows architects and clients to use arbitrary real-world tools such as maps or rulers during immersive exploration of virtual 3D models. The user interface allows architects and clients to review designs and compose 3D architectural scenes, combining benefits of mixed-reality environments with immersive head-mounted display setups.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125346307","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robust vision-based hand tracking using single camera for ubiquitous 3D gesture interaction","authors":"S. Rodríguez-Vaamonde, A. Picón, Aritz Villodas","doi":"10.1109/3DUI.2010.5444702","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444702","url":null,"abstract":"Throughout history, gestures have been one of the most natural interaction methods of human beings. This work proposes a novel 3D interaction technique based on computer vision for human-computer gesture interaction. Its main contribution to the interaction field is that it implements and improves cutting-edge computer vision algorithms, offering a low-cost solution that is robust against scenario and user diversity. Thanks to these characteristics, this system allows robust, ubiquitous human-computer interaction.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"50 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124351649","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}