Dealing with a Panic Attack: a Virtual Reality Training Module for Postgraduate Psychology Students
A. Bogdanovych, K. Moses, Bethany M. Wootton, Tomas Trescak. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489926
Abstract: In this paper we present a virtual reality training simulator for postgraduate psychology students. The simulator features an interaction between a clinical psychologist (the student) and a patient (a virtual agent) suffering from Obsessive Compulsive Disorder (OCD). Our simulation focuses on the form of OCD treatment called “Exposure Therapy”. Learning how to perform Exposure Therapy (ET) traditionally involves watching video recordings and discussing them in class. In our simulation we conduct an immersive exposure therapy session in VR, involving a live interaction with a patient who at one stage experiences a panic attack. Our hypothesis is that the immersive nature of the training session will affect the students’ decision making, so that they are more likely to cease the exposure task than students participating in a less immersive form of learning (watching a video recording). We also hypothesise that participating in an immersive VR training session leads to better information retention than watching videos.

Double-Layered Cup-Shaped Device to Amplify Taste Sensation of Carbonation by the Electrical Stimulation on the Human Tongue
Ibuki Nomura, Noriki Mochizuki, Sousuke Nakamura, Takafumi Koike. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489904
Abstract: We show that electrical stimulation on the human tongue amplifies the taste sensation of carbonated beverages. We have developed a novel electric taste system with two components: a cup-shaped device and its stimulation circuit. The cup-shaped device has a double-layer structure. The circuit has a constant current control circuit and a signal generator, which allow adjustment of the electrical parameters. The double-layer structure keeps the device hygienic during electric taste demonstrations, because the inner layer that touches the user’s mouth can be replaced. The device is also inexpensive and easy to manufacture, so that many people can experience it.

HoloKeys: Interactive Piano Education Using Augmented Reality and IoT
A. J. Stanbury, Ines Said, H. Kang. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489921
Abstract: The rise of online learning poses unique challenges in music education, where live demonstration and musical synchronization are critical for student success. We present HoloKeys, a music education interface which allows instructors to play remotely located pianos using an augmented reality headset and wifi-enabled microcontrollers. This approach allows students to receive distance education which is more direct, immersive, and comprehensive than conventional video conferencing allows. HoloKeys enables remote students to observe live instructional demonstration on a physical keyboard in their immediate environment, just as they would in traditional settings. HoloKeys consists of two separate components: an augmented reality user interface and a piano-playing apparatus. Our system aims to extend online music education beyond desktop platforms into the physical world, thereby addressing crucial obstacles encountered by educators and students transitioning into online education.

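The abstract does not describe the wire protocol between the AR interface and the wifi-enabled microcontrollers. As a rough sketch of the idea only, a key-press event in the instructor's AR view could be forwarded to the remote keyboard as a small datagram; the message format, field names, host, and port below are entirely hypothetical, not the HoloKeys API:

```python
import json
import socket

def send_note_event(note: int, velocity: int, on: bool,
                    host: str = "127.0.0.1", port: int = 9000) -> bytes:
    """Encode a hypothetical note event as JSON and send it over UDP
    to the keyboard microcontroller; returns the encoded payload."""
    payload = json.dumps({"note": note, "velocity": velocity, "on": on}).encode()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(payload, (host, port))
    sock.close()
    return payload
```

In a real deployment the microcontroller would parse each datagram and actuate the corresponding key; UDP is chosen here only because per-note latency matters more than delivery guarantees.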
Visualisation methods for patient monitoring in anaesthetic procedures using augmented reality
Lucas Plabst, S. Oberdörfer, O. Happel, Florian Niebling. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489908
Abstract: In health care, there are still many devices with poorly designed user interfaces that can lead to user errors. Especially in acute care, an error can lead to critical conditions in patients. Previous research has shown that the use of augmented reality can help to better monitor the condition of patients and to better detect unforeseen events. The system created in this work is intended to aid in the detection of changes in patient and equipment data, in order to improve the detection of critical conditions or errors.

Content-rich and Expansive Virtual Environments Using Passive Props As World Anchors
S. G. Wheeler, S. Hoermann, R. Lindeman, G. Ghinea, A. Covaci. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489947
Abstract: In this paper, we present a system that allows developers to add passive haptic feedback into their virtual reality applications by making use of existing physical objects in the user’s real environment. Our approach has minimal dependence on procedural generation and does not limit the virtual space to the dimensions of the physical play-area.

Natural walking speed prediction in Virtual Reality while using target selection-based locomotion
Nilotpal Biswas, Debangshu Banerjee, S. Bhattacharya. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489944
Abstract: Travelling speed plays an essential role in the overall user experience while navigating inside a virtual environment. Researchers have used various travelling speeds that match users’ speed profiles in order to give a natural walking experience. However, predicting a user’s instantaneous walking speed can be challenging when there is no continuous input from the user. Target selection-based techniques are those in which the user selects a target and is then moved there automatically; these techniques lack naturalness due to their low interaction fidelity. In this work, we propose a mathematical model that dynamically computes an instantaneous natural walking speed while moving from one point to another in a virtual environment. We formulated our model with the help of user studies.

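The abstract does not reproduce the authors' model, so the following is only a stand-in: minimum-jerk profiles are a classic description of natural point-to-point movement and illustrate the kind of bell-shaped speed curve such a model would output (this is not the paper's actual formulation):

```python
def min_jerk_speed(t: float, distance: float, duration: float) -> float:
    """Instantaneous speed of a minimum-jerk movement covering
    `distance` metres in `duration` seconds: zero at both endpoints,
    peaking at 1.875 * distance / duration at the midpoint."""
    tau = max(0.0, min(1.0, t / duration))  # normalised time in [0, 1]
    return (distance / duration) * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)
```

A locomotion controller could sample such a profile each frame to move the user toward the selected target at a speed that ramps up and down naturally instead of teleporting or translating at a constant rate.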
Fishtank Sandbox: A Software Framework for Collaborative Usability Testing of Fish Tank Virtual Reality Interaction Techniques
Vishal Jangid, Sirisilp Kongsilp. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489915
Abstract: Human-computer interaction researchers have long studied how we can interact efficiently with virtual objects in a virtual environment. However, many usability experiments do not share the same control parameters, and this lack of consistency makes comparing different interaction techniques difficult. In this article, we present a software framework for usability studies of fish tank virtual reality (FTVR) interaction techniques. The framework provides fixed control parameters (e.g., task, graphic settings, and measured parameters), allows other researchers to incorporate their interaction techniques as add-ons, and enables individuals to participate in experiments over the internet. The article explores a new way for VR/AR researchers to approach usability experiments using the framework and discusses the challenges that it brings.

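The abstract does not show what the add-on interface looks like, but an add-on architecture of this kind typically means each interaction technique implements a small common contract that the framework calls with identical control parameters. A minimal sketch, with entirely hypothetical names and a toy ray-cast technique as an example add-on:

```python
from abc import ABC, abstractmethod
from typing import Optional, Sequence, Tuple

Vec3 = Tuple[float, float, float]

class InteractionTechnique(ABC):
    """Hypothetical plug-in contract: the framework supplies the same
    inputs to every technique so results stay comparable."""

    @abstractmethod
    def select(self, origin: Vec3, direction: Vec3,
               targets: Sequence[Vec3]) -> Optional[int]:
        """Return the index of the selected target, or None."""

class NearestToRay(InteractionTechnique):
    """Example add-on: pick the target closest to a pointing ray
    (direction is assumed to be a unit vector)."""

    def select(self, origin, direction, targets):
        best, best_d2 = None, float("inf")
        for i, t in enumerate(targets):
            v = tuple(t[k] - origin[k] for k in range(3))
            proj = sum(v[k] * direction[k] for k in range(3))
            if proj <= 0:  # target is behind the viewer
                continue
            # squared perpendicular distance from target to the ray
            d2 = sum(v[k] * v[k] for k in range(3)) - proj * proj
            if d2 < best_d2:
                best, best_d2 = i, d2
        return best
```

Because every technique answers the same `select` call, the framework can log identical measures (selection time, errors) across add-ons contributed by different researchers.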
VRGaitAnalytics: Visualizing Dual Task Cost for VR Gait Assessment
Zhu Wang, Liraz Arie, Anat V. Lubetzky, K. Perlin. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489874
Abstract: Among its many promising applications, Virtual Reality (VR) can simulate diverse real-life scenarios and therefore help experimenters assess individuals’ gait performance (i.e., walking) under controlled functional contexts. VR-based gait assessment may provide low-risk, reproducible, and controlled virtual environments, enabling experimenters to investigate underlying causes of imbalance by manipulating experimental conditions such as multi-sensory loads, mental processing loads (cognitive load), and/or motor tasks. We present a low-cost novel VR gait assessment system that simulates virtual obstacles and visual, auditory, and cognitive loads while using motion tracking to assess participants’ walking performance. The system utilizes in-situ spatial visualization for trial playback and instantaneous outcome measures, which enable experimenters and participants to observe and interpret performance. The trial playback can visualize any moment in the trial with embodied graphic segments including the head, waist, and feet. It can also replay two trials on the same time frame for trial-to-trial comparison, which helps visualize the impact of different experimental conditions. The outcome measures, i.e., the metrics related to walking performance, are calculated in real-time and displayed as data graphs in VR. The system can help experimenters get specific gait information on balance performance beyond a typical clinical gait test, making it clinically relevant and potentially applicable to gait rehabilitation. We conducted a feasibility study with physical therapy students, research graduate students, and licensed physical therapists. They evaluated the system and provided feedback on the outcome measures, the spatial visualizations, and the potential use of the system in the clinic. The study results indicate that the system was feasible for gait assessment, and the immediate spatial visualization features were seen as clinically relevant and useful. Limitations and considerations for future work are discussed.

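The dual task cost named in the title is conventionally reported in the gait literature as the percentage decline from single-task performance; the abstract does not list the paper's exact outcome measures, so this is the standard convention rather than the authors' specific computation:

```python
def dual_task_cost(single_task: float, dual_task: float) -> float:
    """Dual-task cost as the percentage decline from single-task
    performance, for measures where higher is better (e.g., gait
    speed). Positive values mean the added cognitive or motor task
    degraded walking performance."""
    if single_task == 0:
        raise ValueError("single-task baseline must be non-zero")
    return (single_task - dual_task) / single_task * 100.0
```

For example, a participant who walks at 1.2 m/s alone but 0.9 m/s while performing a cognitive task shows a 25% dual-task cost; visualizing this per trial is what lets clinicians compare conditions at a glance.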
Fluid3DGuides: A Technique for Structured 3D Drawing in VR
Jingjing Kang, Shouxia Wang, Shuxia Wang, Weiping He. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489955
Abstract: We propose Fluid3DGuides, a drawing guide technique to help users draw structured sketches more accurately in VR. The prototype system continuously infers visual guide lines for the user based on the user’s instant stroke drawing intention and its potential constraint relationships with the existing strokes. We evaluated our prototype through a pilot user study with six participants, comparing the proposed guide technique against a non-guide drawing condition. Participants gave positive comments on ease of use and drawing accuracy. They found that the technique could reduce the time and effort required to find the correct drawing perspective and obtain more accurate 3D structured sketches.

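The abstract does not detail how guide lines are inferred. One common building block for this kind of inference is angular snapping: if the user's instantaneous stroke direction is close enough to a principal axis, a guide along that axis is suggested. The sketch below is a toy stand-in under that assumption; the real system also reasons about constraint relationships with existing strokes:

```python
import math
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

_AXES: Tuple[Vec3, ...] = ((1, 0, 0), (0, 1, 0), (0, 0, 1))

def infer_axis_guide(direction: Vec3,
                     threshold_deg: float = 15.0) -> Optional[Vec3]:
    """Return the principal axis the stroke direction should snap to,
    or None if no axis lies within the angular threshold."""
    norm = math.sqrt(sum(c * c for c in direction))
    if norm == 0:
        return None
    best_axis, best_cos = None, math.cos(math.radians(threshold_deg))
    for axis in _AXES:
        # |cos| so that strokes in either direction along an axis match
        cos = abs(sum(a * d for a, d in zip(axis, direction))) / norm
        if cos >= best_cos:
            best_axis, best_cos = axis, cos
    return best_axis
```

A stroke heading almost along x would snap to the x-axis guide, while a 45-degree diagonal stays free, which is the basic trade-off any such guide heuristic has to tune.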
Using Gaze Behavior and Head Orientation for Implicit Identification in Virtual Reality
Jonathan Liebers, Patrick Horn, Christian Burschik, Uwe Gruenefeld, Stefan Schneegass. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 2021-12-08. DOI: https://doi.org/10.1145/3489849.3489880
Abstract: Identifying users of a Virtual Reality (VR) headset provides designers of VR content with the opportunity to adapt the user interface, set user-specific preferences, or adjust the level of difficulty either for games or training applications. While most identification methods currently rely on explicit input, implicit user identification is less disruptive and does not impact the immersion of the users. In this work, we introduce a biometric identification system that employs the user’s gaze behavior as a unique, individual characteristic. In particular, we focus on the user’s gaze behavior and head orientation while following a moving stimulus. We verify our approach in a user study. A hybrid post-hoc analysis results in an identification accuracy of up to 75% for an explainable machine learning algorithm and up to 100% for a deep learning approach. We conclude with discussing application scenarios in which our approach can be used to implicitly identify users.

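The abstract does not reveal the features or classifiers used, so the following is only an illustrative skeleton of the general pipeline behind gaze-based biometric identification: summarize a gaze trace recorded while the user follows a stimulus into a feature vector, then match it against enrolled users. The feature choice and nearest-centroid matcher here are toy stand-ins, not the paper's explainable-ML or deep-learning models:

```python
import math
from statistics import mean, stdev
from typing import Dict, List, Sequence

def gaze_features(yaw: Sequence[float], pitch: Sequence[float]) -> List[float]:
    """Toy feature vector: mean and spread of gaze yaw/pitch angles
    (in degrees) sampled while the user tracks a moving stimulus."""
    return [mean(yaw), stdev(yaw), mean(pitch), stdev(pitch)]

def identify(features: List[float],
             enrolled: Dict[str, List[float]]) -> str:
    """Match a fresh feature vector to the closest enrolled user's
    vector by Euclidean distance (nearest-centroid classification)."""
    def dist(a: Sequence[float], b: Sequence[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(enrolled, key=lambda user: dist(features, enrolled[user]))
```

In a real system the feature extraction would be far richer (fixation timing, smooth-pursuit lag, head-gaze coordination), which is where the individual differences that make gaze a usable biometric actually live.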