The effects of cybersickness on persons with multiple sclerosis
I. Arafat, Sharif Mohammad Shahnewaz Ferdous, J. Quarles
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2993383

Abstract: Cybersickness is commonly experienced by users of immersive Virtual Environments (VEs). Its symptoms are similar to those of motion sickness, such as dizziness and nausea. Although many cybersickness experiments have been conducted with persons without disabilities, persons with disabilities, such as Multiple Sclerosis (MS), have been minimally studied. This is an important area of research because cybersickness could reduce the effectiveness of virtual rehabilitation and the accessibility of VEs. For this experiment, we recruited 16 participants: 8 persons with MS and 8 persons without MS from similar demographics (e.g., age, race). Two participants from the group without MS could not complete the experiment due to severe cybersickness. We asked each participant to experience a VE. We collected Galvanic Skin Response (GSR) data before and during VR exposure; GSR is commonly used as an objective measure of cybersickness. Simulator Sickness Questionnaire (SSQ) feedback was also recorded before and after the experiment. SSQ results show that the VE induced cybersickness in the participants. The GSR data suggest that cybersickness may have induced similar physiological changes in participants with MS as in participants without MS, albeit with greater variability in participants without MS. However, participants with MS had significantly lower GSR during VR exposure. In this paper, we compare the effects of cybersickness between people with MS and people without MS with respect to SSQ scores and GSR data.
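
The group difference reported above (significantly lower GSR during exposure for participants with MS) is the kind of result typically checked with an independent-samples t-test. As a rough, invented illustration, not the authors' analysis code, such a comparison could look like this in Python (all GSR values below are made up; group sizes follow the abstract: 8 participants with MS, 6 completers without MS):

```python
# Illustrative sketch (not the paper's analysis): Welch's t-test comparing
# per-participant mean GSR during VR exposure between two groups.
import numpy as np
from scipy import stats

# Hypothetical per-participant mean GSR (microsiemens) during VR exposure.
gsr_ms = np.array([2.1, 1.8, 2.4, 1.9, 2.0, 2.2, 1.7, 2.3])  # 8 participants with MS
gsr_no_ms = np.array([3.5, 4.1, 2.9, 3.8, 4.4, 3.2])         # 6 completers without MS

# Welch's variant (equal_var=False) is the safer default when group
# variances may differ, as the abstract's variability remark suggests.
t_stat, p_value = stats.ttest_ind(gsr_ms, gsr_no_ms, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```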

Interactive gamified 3D-training of affine transformations
S. Oberdörfer, Marc Erich Latoschik
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2996314

Abstract: This article presents the Gamified Training Environment for Affine Transformations (GEtiT). GEtiT uses a 3D environment to visualize the effects of object rotation, translation, scaling, reflection, and shearing in 3D space. It encodes abstract knowledge about homogeneous transformations and their order of application using game mechanics that express 3D movements at different levels of abstraction. Progressing in the game requires mastering the game mechanics of a given abstraction level in order to bring objects in 3D space to a desired goal position and/or shape. Each level raises the abstraction of the representation toward a final 4 × 4 homogeneous matrix representation. Executing the game mechanics during gameplay yields effective knowledge training through constant repetition. An evaluation showed a learning effect equal to that of a traditional training method while achieving higher enjoyment of use, indicating superior learning quality.
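
The final abstraction level mentioned above is the standard 4 × 4 homogeneous matrix. The following minimal Python sketch (an invented example, not code from GEtiT) demonstrates the core fact the game trains: the order in which transformations are applied changes the result.

```python
# Minimal sketch of the math GEtiT trains: 4x4 homogeneous transformation
# matrices, where the order of application changes the outcome.
import numpy as np

def translation(tx, ty, tz):
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

def rotation_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    m = np.eye(4)
    m[:2, :2] = [[c, -s], [s, c]]
    return m

p = np.array([1.0, 0.0, 0.0, 1.0])          # point in homogeneous coordinates
T, R = translation(2, 0, 0), rotation_z(np.pi / 2)

print(R @ T @ p)   # translate first, then rotate -> [0, 3, 0, 1]
print(T @ R @ p)   # rotate first, then translate -> [2, 1, 0, 1]
```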

Robot gardens: an augmented reality prototype for plant-robot biohybrid systems
S. Mammen, Heiko Hamann, Michael Heider
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2993400

Abstract: Robot Gardens are an augmented reality concept that allows a human user to design a biohybrid plant-robot system. Plants growing from deliberately placed seeds are directed by robotic units that the user can position, configure, and activate. For example, the robotic units may serve as physical shields or frames, but they may also guide the plants' growth through the emission of light. The biohybrid system evolves over time to redefine architectural spaces. This raises the particular challenge of designing a biohybrid system before its actual implementation, and potentially long before its developmental processes unfold. Here, an augmented reality interface featuring corresponding simulation models of plants and robotic units allows one to explore the design space a priori. In this work, we present our first functional augmented reality prototype for designing biohybrid systems. We provide details about its workings and report first empirical studies of its usability.

Acoustic selfies for extraction of external ear features in mobile audio augmented reality
M. Geronazzo, Jacopo Fantin, Giacomo Sorato, Guido Baldovino, F. Avanzini
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2993376

Abstract: Virtual and augmented realities are expected to become more and more important in everyday life in the near future; spatial audio technologies over headphones will be pivotal for application scenarios that involve mobility. This paper introduces the SelfEar project, aimed at low-cost acquisition and personalization of Head-Related Transfer Functions (HRTFs) on mobile devices. This first version focuses on capturing the individual spectral features that characterize external ear acoustics, through a self-adjustable procedure that guides users in collecting this information: the mobile device must be held with an outstretched arm and positioned at several specific elevation points; acoustic data are acquired by an audio augmented reality headset that embeds a pair of microphones at the listener's ear canals. A preliminary measurement session assesses the system's ability to capture the spectral features that are crucial for elevation perception. Moreover, a virtual experiment using a computational auditory model predicts clear vertical localization cues in the measured features.
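
The spectral features that characterize external ear acoustics are typically pinna notches in the magnitude spectrum, whose center frequencies shift with source elevation. A hedged sketch of how such a feature could be located in a measured ear-canal response (placeholder data and an assumed sampling rate; not the SelfEar implementation):

```python
# Illustrative sketch: find the deepest spectral notch in the band where
# pinna cues typically live (roughly 4-16 kHz). Not the SelfEar code.
import numpy as np

fs = 48_000                                   # assumed sampling rate (Hz)
ir = np.random.randn(1024)                    # placeholder for a measured impulse response

spectrum = np.fft.rfft(ir)
freqs = np.fft.rfftfreq(len(ir), d=1.0 / fs)
mag_db = 20 * np.log10(np.abs(spectrum) + 1e-12)

band = (freqs >= 4_000) & (freqs <= 16_000)   # elevation-relevant band
notch_freq = freqs[band][np.argmin(mag_db[band])]
print(f"Deepest notch in 4-16 kHz band: {notch_freq:.0f} Hz")
```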

Breaking bad behavior: immersive training of class room management
Marc Erich Latoschik, Jean-Luc Lugrin, Michael Habel, D. Roth, Christian Seufert, Silke Grafe
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2996308

Abstract: This article presents a fully immersive, portable, low-cost Virtual Reality system for training classroom management skills. An instructor controls the simulation of a virtual classroom populated with 24 semi-autonomous virtual agents via a desktop-based graphical user interface (GUI). The GUI provides behavior-control and trainee-evaluation widgets alongside a non-immersive view of the class and the trainee. The trainee's interface uses a Head-Mounted Display (HMD) and earphones for output. A depth camera and the HMD's built-in motion sensors are used to track the trainee and animate the avatar. An initial evaluation of both interfaces confirms the system's usefulness, specifically its ability to simulate critical aspects of classroom management.

TickTockRay: smartwatch-based 3D pointing for smartphone-based virtual reality
D. Kharlamov, Brandon Woodard, Liudmila Tahai, Krzysztof Pietroszek
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2996311

Abstract: TickTockRay is a smartwatch-based raycasting technique designed for smartphone-based head-mounted displays. It demonstrates that smartwatch-based raycasting can be reliably implemented on an off-the-shelf smartphone and may provide a feasible alternative to specialized input devices. We release TickTockRay to the research community as an open-source plugin for Unity, along with an example application, a Minecraft VR game clone, that shows the utility of the technique for placing and destroying Minecraft blocks.
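
At its core, smartwatch raycasting turns the watch's IMU orientation into a pointing ray. A minimal sketch of that idea, invented for illustration rather than taken from the TickTockRay Unity plugin:

```python
# Hedged sketch: rotate a forward vector by the watch's orientation
# quaternion and intersect the resulting ray with a target plane.
import numpy as np

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return 2 * np.dot(u, v) * u + (w * w - np.dot(u, u)) * v + 2 * w * np.cross(u, v)

def ray_plane(origin, direction, plane_point, plane_normal):
    """Return the ray/plane intersection point, or None if there is none."""
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None                            # ray parallel to plane
    t = np.dot(plane_point - origin, plane_normal) / denom
    return origin + t * direction if t >= 0 else None

watch_q = np.array([0.966, 0.0, 0.259, 0.0])   # hypothetical IMU orientation (~30 deg about y)
ray_dir = rotate(watch_q, np.array([0.0, 0.0, 1.0]))
hit = ray_plane(np.zeros(3), ray_dir, np.array([0.0, 0.0, 5.0]), np.array([0.0, 0.0, 1.0]))
print("cursor position:", hit)
```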

VR360HD: a VR360° player with enhanced haptic feedback
A. Israr, Zachary Schwemler, John Mars, B. Krainer
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2993404

Abstract: We present a VR360° video player with haptic effect playback. The VR360HD application enhances the VR viewing experience by triggering customized haptic effects associated with the user's activities, biofeedback, network messages, and customizable timeline triggers embedded in the VR media. The app is developed in the Unity3D game engine and tested with a GearVR headset, allowing users to add animations to VR gameplay and to VR360° streams. A custom haptic plugin lets users author animated haptic effects, associate them with triggers, and play them back on custom haptic hardware, the Haptic Chair. We show that the VR360HD app creates rich tactile effects and can easily be adapted to other media types.
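
Timeline triggers of the kind described above can be modeled as a list of (time, effect) records checked on every playback tick. A hypothetical sketch (all names and fields invented; the actual plugin is Unity3D-based and not published here):

```python
# Invented illustration of timeline-triggered haptic playback.
from dataclasses import dataclass

@dataclass
class HapticTrigger:
    time_s: float        # playback time at which the effect fires
    effect: str          # name of an authored haptic effect
    intensity: float     # 0.0 - 1.0

timeline = [
    HapticTrigger(3.0, "rumble", 0.6),
    HapticTrigger(7.5, "heartbeat", 0.9),
]

def on_playback_tick(t_prev: float, t_now: float) -> None:
    """Fire every trigger whose timestamp falls inside this playback step."""
    for trig in timeline:
        if t_prev < trig.time_s <= t_now:
            print(f"play '{trig.effect}' at intensity {trig.intensity}")

on_playback_tick(2.9, 3.1)   # -> play 'rumble' at intensity 0.6
```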

A hologram based tele-existence platform for emotional exchange among a group of users in both real and virtual environments
Hojun Lee, Gyutae Ha, Sangho Lee, J. Cha, Shiho Kim
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2996312

Abstract: We have proposed and implemented a hologram-based tele-existence platform that enables emotional exchange among a group of participants in both real and virtual environments. The system not only delivers an immersive live scene to a remote user wearing a VR HMD, but also shares and exchanges emotional expressions with users at other locations, using a 360° camera and a cloud server combined with a holographic display technique. In our experiments, six basic emotional expressions of the user wearing the VR HMD were transmitted and replayed on the hologram display at the location where a group of participants was watching a sports broadcast on a large-screen TV.

Medical imaging VR: can immersive 3D aid in diagnosis?
José Eduardo Venson, J. Berni, C. S. Maia, A. M. D. Silva, Marcos d'Ornelas, Anderson Maciel
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2996333

Abstract: In the radiology diagnosis process, medical images are most often visualized slice by slice on 2D screens or printed. At the same time, visualization based on 3D volumetric rendering of the data is considered useful, and its field of application has grown. In this work, we present a user study with medical specialists to assess the diagnostic effectiveness of using VR for fracture identification on 3D volumetric reconstructions. We performed user experiments to validate the approach in medical practice. In addition, we assessed the subjects' perception of the 3D reconstruction quality and ease of interaction. Among other results, we found a very high level of effectiveness of the VR interface in identifying superficial fractures on head CTs.

Perceptual enhancement for stereoscopic videos based on horopter consistency
Zeyu Wang, Xiaohan Jin, Fei Xue, R. Li, H. Zha, K. Ikeuchi
Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, November 2, 2016. DOI: https://doi.org/10.1145/2993369.2993393

Abstract: Audience discomfort, such as eye strain and dizziness, is one of the urgent issues that virtual reality and 3D movie technologies must tackle. Beyond inappropriate horizontal and vertical disparity, one major problem is that viewers' binocular vergence and focal distance in the cinema are inconsistent with normal visual habits. Psychologists discovered the horopter and Panum's fusional area, which describe zero-disparity points projected onto the retinas under accommodation-convergence consistency. In this paper, inspired by these concepts, we propose a stereoscopic effect correction system for perceptual enhancement based on the fixated region and scene information. As a preprocessing step, tracking and stereo matching algorithms prepare cues for subsequent transformation in 3D space. Then, to achieve particular visual effects, we describe a geometric framework for disparity refinement and image warping based on parameter adjustment of a virtual stereoscopic rig. For evaluation, subjective experiments were conducted to demonstrate the effectiveness of our method. Our work thus offers a way to improve the audience experience from a previously underexplored perspective.
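
For context, the relation between on-screen parallax and perceived depth that underlies such disparity refinement is textbook stereo geometry (not a formula quoted from the paper):

```latex
% Textbook stereoscopic geometry: perceived depth Z of a point shown with
% screen parallax p, for interocular distance e and viewing distance D.
% Points with p = 0 lie on the screen plane (zero disparity); the
% vergence-accommodation conflict grows as Z departs from D.
\[
  Z = \frac{e \, D}{e - p}, \qquad
  \text{equivalently } p = e \left(1 - \frac{D}{Z}\right).
\]
```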