{"title":"Fluid Sketching―Immersive Sketching Based on Fluid Flow","authors":"Sevinc Eroglu, Sascha Gebhardt, Patric Schmitz, Dominik Rausch, T. Kuhlen","doi":"10.1109/VR.2018.8446595","DOIUrl":"https://doi.org/10.1109/VR.2018.8446595","url":null,"abstract":"Fluid artwork refers to works of art based on the aesthetics of fluid motion, such as smoke photography, ink injection into water, and paper marbling. Inspired by such types of art, we created Fluid Sketching as a novel medium for creating 3D fluid artwork in immersive virtual environments. It allows artists to draw 3D fluid-like sketches and manipulate them via six-degrees-of-freedom input devices. Different brush stroke settings are available, varying the characteristics of the fluid. True to the nature of fluids, the drawn sketch diffuses as an animation, and artists can alter the fluid properties and stop the diffusion process whenever they are satisfied with the current result. Furthermore, they can shape the drawn sketch by interacting with it directly, either with their hand or by blowing into the fluid. We rely on particle advection via curl-noise as a fast procedural method for animating the fluid flow.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114751658","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
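The curl-noise particle advection this abstract mentions can be sketched in a few lines. This is an illustrative sketch, not the authors' code: a divergence-free 2D velocity field is obtained as the curl of a scalar potential (here a stand-in sum of sinusoids; production systems typically use Perlin or simplex noise), and particles are advected through it with forward Euler steps.

```python
# Sketch of 2D curl-noise advection. The velocity field
# v = (d psi/dy, -d psi/dx) is divergence-free by construction,
# so advected "fluid" particles neither compress nor expand.
import numpy as np

def psi(x, y):
    # Smooth scalar potential standing in for a noise function.
    return np.sin(1.3 * x) * np.cos(1.7 * y) + 0.5 * np.sin(2.9 * x + 0.8 * y)

def curl_velocity(x, y, eps=1e-4):
    # Curl of psi in 2D via central differences.
    dpsi_dx = (psi(x + eps, y) - psi(x - eps, y)) / (2 * eps)
    dpsi_dy = (psi(x, y + eps) - psi(x, y - eps)) / (2 * eps)
    return dpsi_dy, -dpsi_dx

def advect(px, py, dt=0.01, steps=100):
    # Forward-Euler advection of particle positions through the field.
    for _ in range(steps):
        vx, vy = curl_velocity(px, py)
        px = px + dt * vx
        py = py + dt * vy
    return px, py

px, py = advect(np.array([0.1, 0.5]), np.array([0.2, 0.4]))
```

Because the field is a curl, its divergence vanishes analytically, which is what makes this a cheap, pressure-solve-free way to animate fluid-like motion.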
{"title":"Virtual Buzzwire: Assessment of a Prototype VR Game for Stroke Rehabilitation","authors":"Chris G. Christou, Despina Michael-Grigoriou, Dimitris Sokratous","doi":"10.1109/VR.2018.8446535","DOIUrl":"https://doi.org/10.1109/VR.2018.8446535","url":null,"abstract":"We created a VR version of the Buzzwire children's toy as part of a project to develop tools for the assessment and rehabilitation of upper-body motor skills in people with dexterity impairment after stroke. In two pilot studies, participants wearing an HMD used a hand-held wand with precision tracking to traverse virtual ‘wires'. In the first study, we compared able-bodied participants' performance with and without binocular viewing to establish a connection with previous experiments using physical versions of the game. Furthermore, we show that our extended measures could also discern differences between subjects' dominant and non-dominant hands. In a second study, we assessed the usability of the system on a small sample of subjects with post-stroke hemiparesis. There was positive acceptance of the technology, with no fatigue or nausea, and the measurements highlighted the differences between the hemiparetic and unaffected hand.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116970236","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Model Retrieval by 3D Sketching in Immersive Virtual Reality","authors":"D. Giunchi, Stuart James, A. Steed","doi":"10.1109/VR.2018.8446609","DOIUrl":"https://doi.org/10.1109/VR.2018.8446609","url":null,"abstract":"We describe a novel method for searching 3D model collections using free-form sketches within a virtual environment as queries. As opposed to traditional sketch retrieval, our queries are drawn directly onto an example model. Using immersive virtual reality, the user can express their query through a sketch that demonstrates the desired structure, color, and texture. Unlike previous sketch-based retrieval methods, users remain immersed within the environment, without relying on textual queries or 2D projections that can disconnect the user from the environment. We show how a convolutional neural network (CNN) can create multi-view representations of colored 3D sketches. Using such a descriptor representation, our system can rapidly retrieve models, providing the user with an interactive method of navigating large object datasets. Through a preliminary user study we demonstrate that, using our VR 3D model retrieval system, users can perform quick and intuitive searches. Using our system, users can rapidly populate a virtual environment with specific models from a very large database, so the technique has the potential to be broadly applicable in immersive editing systems.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123179208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
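The retrieval step described above can be illustrated with a minimal sketch. This is hypothetical, not the authors' implementation: assume the CNN has already reduced the multi-view renders of a query sketch to one descriptor vector, and database models are then ranked by cosine similarity.

```python
# Descriptor-based nearest-neighbor retrieval: rank stored model
# descriptors by cosine similarity to a query descriptor.
import numpy as np

def cosine_rank(query, database):
    # query: (D,), database: (N, D) -> model indices, best match first.
    q = query / np.linalg.norm(query)
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    return np.argsort(-(db @ q))

rng = np.random.default_rng(1)
db = rng.normal(size=(1000, 128))             # 1000 stored model descriptors
query = db[42] + 0.01 * rng.normal(size=128)  # slightly perturbed copy of model 42
ranking = cosine_rank(query, db)              # model 42 should rank first
```

Ranking by a dot product over pre-normalized descriptors is what makes such systems fast enough for interactive, in-headset search.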
{"title":"Real-Time Marker-Based Finger Tracking with Neural Networks","authors":"Dario Pavllo, Thibault Porssut, B. Herbelin, R. Boulic","doi":"10.1109/VR.2018.8446173","DOIUrl":"https://doi.org/10.1109/VR.2018.8446173","url":null,"abstract":"Hands in virtual reality applications represent our primary means of interacting with the environment. Although marker-based motion capture with inverse kinematics (IK) works well for body tracking, it is less reliable for fingers, which are often occluded from the cameras' view. Many computer vision and virtual reality applications circumvent the problem by using an additional system (e.g. inertial trackers). We explore an alternative solution that tracks hands and fingers using solely a camera-based motion capture system with active markers, combined with machine learning techniques. Finger animation is performed by a predictive model based on neural networks, trained on a movement dataset acquired from several subjects with a complementary (inertial) capture system. The system is as efficient as a traditional IK algorithm, provides a natural reconstruction of postures, and handles occlusions.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125142246","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
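The predictive model named in this abstract can be pictured as a regression from marker positions to finger pose. Everything below is a hypothetical sketch, not the paper's trained network: the marker count, layer sizes, and the zero-plus-visibility-flag handling of occluded markers are all assumptions, shown only to make the input/output shapes of such a predictor concrete.

```python
# Toy marker-to-pose regressor: an untrained two-layer network maps
# hand-marker 3D positions (with a per-marker visibility flag) to
# finger joint angles. Occluded markers are zeroed and flagged.
import numpy as np

rng = np.random.default_rng(3)

N_MARKERS, N_JOINTS, HIDDEN = 8, 15, 32

# Random weights standing in for a trained network.
w1 = rng.normal(scale=0.1, size=(N_MARKERS * 4, HIDDEN))  # xyz + visibility
b1 = np.zeros(HIDDEN)
w2 = rng.normal(scale=0.1, size=(HIDDEN, N_JOINTS))
b2 = np.zeros(N_JOINTS)

def predict_joint_angles(markers, visible):
    # markers: (N_MARKERS, 3) positions; visible: (N_MARKERS,) 0/1 mask.
    x = np.concatenate([(markers * visible[:, None]).ravel(), visible])
    h = np.tanh(x @ w1 + b1)
    return np.tanh(h @ w2 + b2) * np.pi  # bounded angles in (-pi, pi)

markers = rng.normal(size=(N_MARKERS, 3))
visible = np.ones(N_MARKERS)
visible[2] = 0.0  # marker 2 occluded this frame
angles = predict_joint_angles(markers, visible)
```

The appeal over per-frame IK is that the network can emit a plausible full posture even when some inputs are missing, because occlusion patterns were present in its training data.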
{"title":"Simulated Reference Frame: A Cost-Effective Solution to Improve Spatial Orientation in VR","authors":"Thinh Nguyen-Vo, B. Riecke, W. Stuerzlinger","doi":"10.1109/VR.2018.8446383","DOIUrl":"https://doi.org/10.1109/VR.2018.8446383","url":null,"abstract":"Virtual Reality (VR) is increasingly used in spatial cognition research, as it offers high experimental control in naturalistic multimodal environments, which is hard to achieve in real-world settings. Although recent technological advances offer a high level of photorealism, locomotion in VR is still restricted because people might not perceive their self-motion as they would in the real world. This might be related to the inability to use embodied spatial orientation processes, which support automatic and obligatory updating of our spatial awareness. Previous research has identified the roles reference frames play in retaining spatial orientation. Here, we propose using visually overlaid rectangular boxes that simulate reference frames in VR to give users a better sense of spatial direction in landmark-free virtual environments. This mixed-method study investigated how different variations of the visually simulated reference frames might support people in a navigational search task. Performance results showed that the presence of a simulated reference frame had significant effects on participants' completion time and travel distance. Although a simulated CAVE translating with the navigator (one of the simulated reference frames) did not provide significant benefits, the simulated room (another simulated reference frame, depicting a rest frame) significantly boosted user performance in the task and improved participants' preference ratings in the post-experiment evaluation. Results suggest that adding a visually simulated reference frame to VR applications might be a cost-effective solution to the spatial disorientation problem in VR.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130605496","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Batmen Forever: Unified Virtual Hand Metaphor for Consumer VR Setups","authors":"A. Rodrigues, Mario Nagamura, Luis Gustavo Freire da Costa, M. Zuffo","doi":"10.1109/VR.2018.8446277","DOIUrl":"https://doi.org/10.1109/VR.2018.8446277","url":null,"abstract":"In this work, we present a hand-based natural interaction technique that allows users to perform fundamental actions such as moving or controlling objects and climbing ladders. The setup was restricted to available consumer VR technology, aiming to advance towards a practical unified framework for 3D interaction. The strategy was to sync the closest natural movement the device allows with primary task actions, either directly or indirectly, creating hypernatural UIs. The prototype allowed successful completion of the three challenges proposed by the 2018 3DUI Contest, as validated by a preliminary user study with participants from both the target audience and the general public.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128668825","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Human Identification Using Neural Network-Based Classification of Periodic Behaviors in Virtual Reality","authors":"Duc-Minh Pham","doi":"10.1109/VR.2018.8446529","DOIUrl":"https://doi.org/10.1109/VR.2018.8446529","url":null,"abstract":"Many techniques help computer systems and devices identify their users, both to protect privacy, personal information, and sensitive data, and to provide appropriate services, advertisements, or benefits. With a passcode, password, fingerprint, or iris scan, people must explicitly perform a required action, such as typing a code, presenting their eyes, or placing a finger on a scanner. Such solutions suit high-security scenarios such as executing banking transactions or unlocking personal phones. In other systems, such as gaming machines and collaborative frameworks, which prioritize user experience and convenience, it would be preferable to collect and build the user profile implicitly. Among such systems, virtual reality (VR) is an emerging platform that supports not only fully immersive experiences for gamers but also collaborative environments for students, researchers, and others. Because VR systems already track users' physical activity via devices such as HMDs and VR controllers, we aim to use virtual reality itself as identification equipment. In VR, we can easily reproduce an invariant condition at any time, making it more likely that people replicate their behaviors without external influences. We therefore investigate whether VR users can be classified based on their periodic interactions with virtual objects. We collect the position and direction of a user's head or hands while they perform a task and build a classification model on those data using a convolutional neural network. An initial experiment explores the capability of the proposed technique. The results were encouraging, with a highest accuracy of 90.92%; identification in VR is hence potentially applicable. In future work, we plan a large-scale experiment with more participants to examine the strength of our method.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"56 13","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120935836","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
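The classification idea in this abstract can be sketched as follows. This is a hypothetical illustration, not the paper's model: a user's motion is a time series of 6-D samples (head or hand position plus direction), and a 1-D convolution over time followed by global average pooling and a softmax yields per-user class scores. All dimensions and the (untrained, random) weights are assumptions.

```python
# Toy 1-D CNN forward pass classifying a motion track into one of
# several user identities.
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    # x: (T, C_in), w: (K, C_in, C_out), b: (C_out,) -> (T-K+1, C_out)
    T, _ = x.shape
    K, _, C_out = w.shape
    out = np.empty((T - K + 1, C_out))
    for t in range(T - K + 1):
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return out

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def classify(track, w, b, w_fc, b_fc):
    h = np.maximum(conv1d(track, w, b), 0.0)  # ReLU feature maps
    pooled = h.mean(axis=0)                   # global average pool over time
    return softmax(pooled @ w_fc + b_fc)      # per-user probabilities

# Toy dimensions: 120 time steps, 6 channels, 8 filters of width 9, 4 users.
track = rng.normal(size=(120, 6))
w = rng.normal(scale=0.1, size=(9, 6, 8))
b = np.zeros(8)
w_fc = rng.normal(scale=0.1, size=(8, 4))
b_fc = np.zeros(4)
probs = classify(track, w, b, w_fc, b_fc)
```

Pooling over the time axis is what lets one model handle tracks of varying length, which matters when users repeat a periodic task at different speeds.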
{"title":"Evaluation of Hand-Based Interaction for Near-Field Mixed Reality with Optical See-Through Head-Mounted Displays","authors":"Zhenliang Zhang, Benyang Cao, Dongdong Weng, Yue Liu, Yongtian Wang, Hua Huang","doi":"10.1109/VR.2018.8446129","DOIUrl":"https://doi.org/10.1109/VR.2018.8446129","url":null,"abstract":"Hand-based interaction is one of the most widely used interaction modes in applications based on optical see-through head-mounted displays (OST-HMDs). In this paper, two such interaction modes, gesture-based interaction (GBI) and physics-based interaction (PBI), are developed within a mixed reality system to evaluate the advantages and disadvantages of each for near-field mixed reality. The experimental results show that PBI leads to better user performance in terms of work efficiency on the proposed tasks. A t-test confirmed that the difference in efficiency between the interaction modes is statistically significant.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127159840","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
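The t-test comparison this abstract reports can be sketched concretely. The data below are made up (the paper's measurements are not given) and the statistic is computed as Welch's two-sample t, one common variant when group variances may differ.

```python
# Welch's two-sample t statistic on hypothetical per-condition task times.
import math
import numpy as np

def welch_t(a, b):
    # Returns Welch's t statistic and approximate degrees of freedom.
    ma, mb = a.mean(), b.mean()
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    t = (ma - mb) / math.sqrt(va + vb)
    dof = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, dof

rng = np.random.default_rng(2)
gbi_times = rng.normal(14.0, 2.0, size=20)  # hypothetical GBI task times (s)
pbi_times = rng.normal(11.5, 2.0, size=20)  # hypothetical PBI task times (s)
t, dof = welch_t(gbi_times, pbi_times)      # t > 0: GBI slower on average
```

The resulting t and degrees of freedom would then be compared against the t distribution to obtain the p-value reported in such an analysis.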
{"title":"Mediated Physicality: Inducing Illusory Physicality of a Virtual Human via Environmental Objects","authors":"Myungho Lee","doi":"10.1109/VR.2018.8446159","DOIUrl":"https://doi.org/10.1109/VR.2018.8446159","url":null,"abstract":"A physical embodiment of a virtual human has shown benefits in applications that involve social interaction with virtual humans. However, it often requires cumbersome haptic devices or robotic bodies. In this position paper, we first discuss our motivation for utilizing the surrounding environment in human-virtual human interaction and present our preliminary studies and results. Building on these studies and related literature, we define the concept of Mediated Physicality for virtual humans, which utilizes environmental objects to increase the perceived physicality of a virtual human, and we discuss its fundamental aspects as well as future research plans.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124194609","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Auto-Scaled Full Body Avatars for Virtual Reality: Facilitating Interactive Virtual Body Modification","authors":"Tuukka M. Takala, Heikki Heiskanen","doi":"10.1109/VR.2018.8446477","DOIUrl":"https://doi.org/10.1109/VR.2018.8446477","url":null,"abstract":"Virtual reality avatars and the illusion of virtual body ownership are increasingly attracting attention from researchers [1] [2]. As a continuation to our previous work with avatars [3], we updated our existing RUIS for Unity toolkit [4] with new capabilities that facilitate the creation of virtual reality applications with adaptive and customizable avatars.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"17 2","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134363377","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}