Flyables: Haptic Input Devices for Virtual Reality using Quadcopters
Jonas Auda, Nils Verheyen, Sven Mayer, Stefan Schneegass
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489855
Abstract: Virtual Reality (VR) has made its way into everyday life. While VR delivers an ever-increasing level of immersion, controls and their haptics are still limited. Current VR headsets come with dedicated controllers that are used to control every virtual interface element. However, the controller input mostly differs from the virtual interface. This reduces immersion. To provide a more realistic input, we present Flyables, a toolkit that provides matching haptics for virtual user interface elements using quadcopters. We took five common virtual UI elements and built their physical counterparts. We attached them to quadcopters to deliver on-demand haptic feedback. In a user study, we compared Flyables to controller-based VR input. While controllers still outperform Flyables in terms of precision and task completion time, we found that Flyables present a more natural and playful way to interact with VR environments. Based on the results from the study, we outline research challenges that could improve interaction with Flyables in the future.

RotoWrist: Continuous Infrared Wrist Angle Tracking using a Wristband
Farshid Salemi Parizi, W. Kienzle, Eric Whitmire, Aakar Gupta, Hrvoje Benko
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489886
Abstract: We introduce RotoWrist, an infrared (IR) light based solution for continuously and reliably tracking the 2-degree-of-freedom (DoF) relative angle of the wrist with respect to the forearm using a wristband. The tracking system consists of eight time-of-flight (ToF) IR light modules distributed around a wristband. We developed a computationally simple tracking approach to reconstruct the orientation of the wrist without any runtime training, ensuring user independence. An evaluation study demonstrated that RotoWrist achieves a cross-user median tracking error of 5.9° in flexion/extension and 6.8° in radial and ulnar deviation with no calibration required, as measured against optical ground truth. We further demonstrate the performance of RotoWrist for a pointing task and compare it against ground-truth tracking.

Effect of Visual Feedback on Understanding Timbre with Shapes Based on Crossmodal Correspondences
Kota Arai, Mone Konno, Yutaro Hirao, Shigeo Yoshida, Takuji Narumi
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489912
Abstract: Timbre is a crucial element in playing musical instruments, and it is difficult for beginners to learn independently. Therefore, external feedback (FB) is required. However, conventional FB methods lack intuitive visualization. In this study, we propose a novel FB method that adopts crossmodal correspondence to visualize timbre intuitively with visual shapes. Our experiments indicate that FB based on crossmodal correspondence prevents over-reliance on FB and promotes learning.

Multi-Componential Analysis of Emotions Using Virtual Reality
R. Somarathna, T. Bednarz, Gelareh Mohammadi
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489958
Abstract: In this study, we propose a data-driven approach to investigate the emotional experience triggered by Virtual Reality (VR) games. We considered the full Component Process Model (CPM), which theorises emotional experience as a multi-process phenomenon. We validated the feasibility of the proposed approach through a pilot experiment and confirmed that VR games can be used to trigger a diverse range of emotions. Using hierarchical clustering, we showed a clear distinction between positive and negative emotions in the CPM space.

Interactive Visualization of Deep Learning Models in an Immersive Environment
Hikaru Nagasaka, Motoya Izuhara
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489956
Abstract: The development of deep learning (DL) models has become prevalent among software engineers. However, it is difficult for non-experts to analyze and understand their behavior. Hence, we propose an interactive visualization system for DL models in an immersive environment. Because an immersive environment offers unlimited display space and visualization of high-dimensional data, it enables comprehensive analysis of data propagation through the layers and comparison of multiple performance metrics. In this research, we implemented a prototype system, demonstrated it to machine learning engineers, and discussed the future benefits of visualizing DL models in an immersive environment. Our concept received positive feedback; however, we found that most engineers viewed the visualization technology primarily as a novel introduction to the immersive environment.

Using Hand Tracking and Voice Commands to Physically Align Virtual Surfaces in AR for Handwriting and Sketching with HoloLens 2
Florian Kern, Thore Keser, Florian Niebling, Marc Erich Latoschik
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489940
Abstract: In this paper, we adapt an existing VR framework for handwriting and sketching on physically aligned virtual surfaces to AR environments using the Microsoft HoloLens 2. We demonstrate a multimodal input metaphor to control the framework’s calibration features using hand tracking and voice commands. Our technical evaluation of fingertip/surface accuracy and precision on physical tables and walls is in line with existing measurements on comparable hardware, albeit considerably lower compared to previous work using controller-based VR devices. We discuss design considerations and the benefits of our unified input metaphor suitable for controller tracking and hand tracking systems. We encourage extensions and replication by providing a publicly available reference implementation (https://go.uniwue.de/hci-otss-hololens).

Object Manipulations in VR Show Task- and Object-Dependent Modulation of Motor Patterns
Jaime Maldonado, C. Zetzsche
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489858
Abstract: Humans can perform object manipulations in VR in spite of missing haptic and acoustic information. Whether their movements under these artificial conditions still rely on motor programs based on natural experience, or are impoverished by the restrictions imposed by VR, is unclear. We investigated whether reach-to-place and reach-to-grasp movements in VR can still be adapted to the task and to the specific properties of the objects being handled, or whether they reflect a stereotypic, task- and object-independent motor program. We analyzed reach-to-grasp and reach-to-place movements from participants performing an unconstrained "set-the-table" task involving a variety of different objects in virtual reality. These actions were compared based on their kinematic features. We found significant differences in peak speed and in the duration of the deceleration phase, which are modulated depending on the action and on the manipulated object. The flexibility of natural human sensorimotor control is thus at least partially transferred to and exploited in impoverished VR conditions. We discuss possible explanations of this behavior and the implications for the design of object manipulations in VR.

The Effect of 2D Stylized Visualization of the Real World for Obstacle Avoidance and Safety in Virtual Reality System Usage
Jaeeun Kim, Heeyoon Jeong, G. Kim
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489943
Abstract: Using virtual reality systems with a head-mounted display can incur interaction difficulties and safety problems because the user’s view is isolated from the real-world operating space. One possible solution is to superimpose real-world objects or environment information onto the virtual scene. A variety of such visualization methods have been proposed, all in hopes of minimizing the negative effects of introducing foreign elements into the original virtual scene. In this poster, we propose applying the neural style transfer technique to blend the real-world operating environment into the style of the given virtual space, making the superimposed image as natural as possible and maintaining the sense of immersion with the least distraction. Our pilot experimental study showed that the stylization obscured the clear presentation of the environment, worsened or did not improve safe user performance, and was not considered sufficiently natural.

Estimate the Difference Threshold for Curvature Gain of Redirected Walking
Chang-Gyu Lee, O. Kwon, D. Kang
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489942
Abstract: Redirected walking (RDW) allows users to navigate a large virtual world in a small physical space. If the applied redirection is below the detection threshold, users hardly notice it. However, some papers have reported that users perceived changes in curvature gain even when redirections smaller than the detection threshold were applied. This suggests that changes in curvature gain are themselves perceptible. Therefore, in this paper, we identified a threshold for the change in curvature gain, which was found to be 3.06°/m. Further experiments using different variation methods for curvature gain will follow.

Avatar Tracking Control with Generations of Physically Natural Responses on Contact to Reduce Performers’ Loads
Kento Sugimori, Hironori Mitake, Hirohito Sato, Kensho Oguri, S. Hasegawa
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 8 December 2021. DOI: 10.1145/3489849.3489859
Abstract: The real-time performance of motion-captured avatars in virtual space is becoming increasingly popular, especially within applications such as social virtual realities (VRs), virtual performers (e.g., virtual YouTubers), and VR games. Such applications often involve contact between multiple avatars, or between avatars and objects, as part of communication or gameplay. However, most current applications do not resolve the effects of contact on avatars, so penetration or unnatural behavior occurs. In reality, no contact with the player’s body occurs; nevertheless, the player must perform as if it did. While physics simulation can solve the contact issue, the naive use of physics simulation causes tracking delay. We propose a novel avatar tracking controller with feedforward control. Our method enables quick, accurate tracking and flexible motion in response to contacts. Furthermore, the technique frees avatar performers from the load of performing as if contact occurred. We implemented our method and experimentally evaluated the naturalness of the resulting motions and our approach’s effectiveness in reducing performers’ loads.