{"title":"A projection-based mixed-reality display for exterior and interior of a building diorama","authors":"Ming Zhang, I. Kitahara, Y. Kameda, Y. Ohta","doi":"10.1145/2671015.2671132","DOIUrl":"https://doi.org/10.1145/2671015.2671132","url":null,"abstract":"This paper proposes an interactive display system that displays both of the exterior and interior construction of a building diorama by using a projection-based Mixed-Reality (MR) technique, which is useful for understanding the complex construction and the spatial relationships between outside and inside. The users can hold and move the diorama model using their hands/body motion, so that they can observe the model from their favorite viewpoint. Our system obtains both of the user's information (the viewpoint and the gesture) and the diorama model's information (the pose) in 3D space by using two RGB-D cameras. The CG image corresponding to the user's viewpoint, gesture and the pose of the diorama is rendered by Dual Rendering algorithm in real time. As the result, the generated CG image is projected onto the diorama to realize MR display. We confirm the effectiveness of our proposed method by developing a pilot system.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"23 1","pages":"211-212"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79638746","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A usability scale for handheld augmented reality","authors":"M. Santos, Takafumi Taketomi, C. Sandor, Jarkko Polvi, Goshiro Yamamoto, H. Kato","doi":"10.1145/2671015.2671019","DOIUrl":"https://doi.org/10.1145/2671015.2671019","url":null,"abstract":"Handheld augmented reality (HAR) applications must be carefully designed and improved based on user feedback to sustain commercial use. However, no standard questionnaire considers perceptual and ergonomic issues found in HAR. We address this issue by creating a HAR Usability Scale (HARUS).\u0000 To create HARUS, we performed a systematic literature review to enumerate user-reported issues in HAR applications. Based on these issues, we created a questionnaire measuring manipulability -- the ease of handling the HAR system, and comprehensibility -- the ease of understanding the information presented by HAR. We then provide evidences of validity and reliability of the HARUS questionnaire by applying it to three experiments. The results show that HARUS consistently correlates with other subjective and objective measures of usability, thereby supporting its concurrent validity. Moreover, HARUS obtained a good Cronbach's alpha in all three experiments, thereby demonstrating internally consistency.\u0000 HARUS, as well as its decomposition into individual manipulability and comprehensibility scores, are evaluation tools that researchers and professionals can use to analyze their HAR applications. By providing such a tool, they can gain quality feedback from users to improve their HAR applications towards commercial success.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"22 1","pages":"167-176"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83259112","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Braiding hair by braid theory","authors":"Gaoxiang Zeng, T. Komura","doi":"10.1145/2671015.2674962","DOIUrl":"https://doi.org/10.1145/2671015.2674962","url":null,"abstract":"In this paper, we propose a system based on braid theory that help users to generate customized hair braiding, which is a function that is lacking in most existing hair design software. Our user interface for braid design is built upon braid theory, which is a subarea of knot theory in mathematics. The user designs braid patterns using braid index, and specifies the amount of hair for each braid as well as the area over the head where the braid is to be made. Then, the system automatically braids the hair of the character and generates a realistic image of the designed hair style. Theoretically, our system can produce arbitrary kinds of braids. Our system can also judge if two braids are equivalent or not by making use of the transition rules of braid index, which helps to register designed braids to the database. The system is implemented as a Maya plugin, and can be combinedly used with various functions including physical simulation, hair rendering and hair texturing. Our user study shows that our toolkit is easy-to-use for novice users as well as experienced users.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"15 1","pages":"215-216"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83580246","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"DigiTap: an eyes-free VR/AR symbolic input device","authors":"Manuel Prätorius, Dimitar Valkov, U. Burgbacher, K. Hinrichs","doi":"10.1145/2671015.2671029","DOIUrl":"https://doi.org/10.1145/2671015.2671029","url":null,"abstract":"In this paper we present DigiTap---a wrist-worn device specially designed for symbolic input in virtual and augmented reality (VR/AR) environments. DigiTap is able to robustly sense thumb-to-finger taps on the four fingertips and the eight minor knuckles. These taps are detected by an accelerometer, which triggers capturing of an image sequence with a small wrist-mounted camera. The tap position is then extracted with low computational effort from the images by an image processing pipeline. Thus, the device is very energy efficient and may potentially be integrated in a smartwatch-like device, allowing an unobtrusive, always available, eyes-free input. To demonstrate the feasibility of our approach an initial user study with our prototype device was conducted. In this study the suitability of the twelve tapping locations was evaluated, and the most prominent sources of error were identified. Our prototype system was able to correctly classify 92% of the input locations.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"477 1","pages":"9-18"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79943939","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Posture reconstruction using Kinect with a probabilistic model","authors":"Liuyang Zhou, Zhiguang Liu, Howard Leung, Hubert P. H. Shum","doi":"10.1145/2671015.2671021","DOIUrl":"https://doi.org/10.1145/2671015.2671021","url":null,"abstract":"Recent work has shown that depth image based 3D posture estimation hardware such as Kinect has made interactive applications more popular. However, it is still challenging to accurately recognize postures from a single depth camera due to the inherently noisy data derived from depth images and self-occluding action performed by the user. While previous research has shown that data-driven methods can be used to reconstruct the correct postures, they usually require a large posture database, which greatly limit the usability for systems with constrained hardware such as game console. To solve this problem, we present a new probabilistic framework to enhance the accuracy of the postures live captured by Kinect. We adopt the Gaussian Process model as a prior to leverage position data obtained from Kinect and marker-based motion capture system. We also incorporate a temporal consistency term into the optimization framework to constrain the velocity variations between successive frames. To ensure that the reconstructed posture resembles the observed input data from Kinect when its tracking result is good, we embed joint reliability into the optimization framework. Experimental results demonstrate that our system can generate high quality postures even under severe self-occlusion situations, which is beneficial for real-time posture based applications such as motion-based gaming and sport training.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"1 1","pages":"117-125"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91146888","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The influence of step frequency on the range of perceptually natural visual walking speeds during walking-in-place and treadmill locomotion","authors":"N. C. Nilsson, S. Serafin, R. Nordahl","doi":"10.1145/2671015.2671113","DOIUrl":"https://doi.org/10.1145/2671015.2671113","url":null,"abstract":"Walking-In-Place (WIP) techniques make relatively natural walking experiences within immersive virtual environments possible when the physical interaction space is limited in size. In order to facilitate such experiences it is necessary to establish a natural connection between steps in place and virtual walking speeds. This paper details a study investigating the effects of movement type (treadmill walking and WIP) and step frequency (1.4, 1.8 and 2.2 steps per second) on the range of perceptually natural visual walking speeds. The results suggests statistically significant main effects of both movement type and step frequency but no significant interaction between the two variables.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"29 1","pages":"187-190"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88645154","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Simulator sickness and presence using HMDs: comparing use of a game controller and a position estimation system","authors":"G. Llorach, A. Evans, J. Blat","doi":"10.1145/2671015.2671120","DOIUrl":"https://doi.org/10.1145/2671015.2671120","url":null,"abstract":"Consumer-grade head-mounted displays (HMD) such as the Oculus Rift have become increasingly available for Virtual Reality recently. Their high degree of immersion and presence provokes usually amazement when first used. Nevertheless, HMDs also have been reported to cause adverse reactions such as simulator sickness. As their impact is growing, it is important to understand such side effects. This paper presents the results of a relatively large scale user experiment which compares using a conventional game controller versus positioning in the virtual world based upon the signal of the internal Inertial Measurement Unit (IMU) using Oculus Rift DK1. We show that simulator sickness is significantly reduced when using a position estimation system rather than using the more traditional game controller for navigation. However the sense of presence was not enhanced by the possibility of 'real walking'. We also show the impact of other factors, such as prior experience or motion history, and discuss the results.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"121 ","pages":"137-140"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91457013","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"In touch with the remote world: remote collaboration with augmented reality drawings and virtual navigation","authors":"Steffen Gauglitz, B. Nuernberger, M. Turk, Tobias Höllerer","doi":"10.1145/2671015.2671016","DOIUrl":"https://doi.org/10.1145/2671015.2671016","url":null,"abstract":"Augmented reality annotations and virtual scene navigation add new dimensions to remote collaboration. In this paper, we present a touchscreen interface for creating freehand drawings as world-stabilized annotations and for virtually navigating a scene reconstructed live in 3D, all in the context of live remote collaboration. Two main focuses of this work are (1) automatically inferring depth for 2D drawings in 3D space, for which we evaluate four possible alternatives, and (2) gesture-based virtual navigation designed specifically to incorporate constraints arising from partially modeled remote scenes. We evaluate these elements via qualitative user studies, which in addition provide insights regarding the design of individual visual feedback elements and the need to visualize the direction of drawings.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"172 1","pages":"197-205"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79517363","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A portable interface for tangible exploration of volumetric data","authors":"Paul Issartel, F. Guéniat, M. Ammi","doi":"10.1145/2671015.2671130","DOIUrl":"https://doi.org/10.1145/2671015.2671130","url":null,"abstract":"Exploration of volumetric data is an essential task in many scientific fields. However, the use of standard devices, such as the 2D mouse, leads to suboptimal interaction mappings. Several VR systems provide better interaction capabilities, but they remain dedicated and expensive solutions. In this work, we propose an interface that combines tangible tools and a handheld device. This configuration allows natural and full 6 DOF interaction in a convenient, fully portable and affordable system. This paper presents our design choices for this interface and associated tangible exploration techniques.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"92 1","pages":"209-210"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79424675","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"FingerOscillation: clutch-free techniques for 3D object translation, rotation and scale","authors":"Siju Wu, A. Chellali, S. Otmane","doi":"10.1145/2671015.2671117","DOIUrl":"https://doi.org/10.1145/2671015.2671117","url":null,"abstract":"In this paper, we present three freehand interaction techniques for 3D content manipulation, the FingerShake, for object translation, the FingerRotate, for object rotation and the FingerSwing for object scale. These three techniques refer to a more generic concept which we call FingerOscillation. The main contribution is to interact with the machine by using finger oscillation movements. We introduce the design and the implementation of these techniques.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"26 1","pages":"225-226"},"PeriodicalIF":0.0,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80049287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}