{"title":"Global Physical Prior Based Fluid Reconstruction for VR/AR","authors":"Qifan Zhang, Shibang Xiao, Yunchi Cen, Jingxuan Han, Xiaohui Liang","doi":"10.1109/VRW58643.2023.00254","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00254","url":null,"abstract":"Fluid is a common natural phenomenon and often appears in various VR/AR applications. Several works use sparse view images and integrate physical priors to improve reconstruction results. However, existing works only consider physical priors between adjacent frames. In our work, we propose a differentiable fluid simulator combined with a differentiable renderer for fluid reconstruction, which can make full use of global physical priors among long series. Furthermore, we introduce divergence-free Laplacian eigenfunctions as velocity bases to improve efficiency and save memory. We demonstrate our method on both synthetic and real data and show that it can produce better results.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114606051","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Put your glasses on: A voxel-based 3D authentication system in VR using eye-gaze","authors":"Rumeysa Türkmen, Chukwuemeka Nwagu, Prashant Rawat, Poppy Riddle, Kissinger Sunday, Mayra Donaji Barrera Machuca","doi":"10.1109/VRW58643.2023.00316","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00316","url":null,"abstract":"Due to the current push of social Virtual Reality (VR) apps and mobile VR headsets, users are surrounded by people in real life and virtually. Users need a private method to authenticate payments or login into apps. In this paper, we propose VoxAuth, a novel voxel-based 3D authentication system, allowing users to input their password in a private way. By using eye-gaze as a secure, input method, people outside VR are prevented from observing the pass-word. Sunglasses on the avatar appear during the authentication process both as a gaze observation prevention and as a signal that the user is still connected.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122044837","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"High Levels of Visibility of Virtual Agents Increase the Social Presence of Users","authors":"Lucie Kruse, Fariba Mostajeran, Frank Steinicke","doi":"10.1109/VRW58643.2023.00264","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00264","url":null,"abstract":"Virtual humanoid agents can perform a variety of tasks, and their representation has an influence on several psychological aspects. We compared the effects of agent visibility during an anagram solving task in virtual reality on social presence and cognitive task performance. We increased the visibility from (i) voice-only, (ii) mouth-only, (iii) full head, (iv) upper body, to (v) full body representations of the same virtual agent in a within-subject design study. The re-sults revealed significant differences in the perceived social presence, especially for the two least visible representations, i.e., voice- and mount -only, but no significant effects on task performance could be found.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"309 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122126208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gender differences in cybersickness: Clarifying confusion and identifying paths forward","authors":"Jonathan W. Kelly, Stephen B Gilbert, M. Dorneich, Kristi A. Costabile","doi":"10.1109/VRW58643.2023.00067","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00067","url":null,"abstract":"Cybersickness is a barrier to widespread adoption of virtual reality (VR). We summarize the literature and conclude that women experience more cybersickness than do men, but that the size of the gender effect is modest. We present a mediation and moderation framework for organizing existing research and proposing new questions about gender and cybersickness. A mediator causally connects gender and cybersickness, and a moderator changes the magnitude of the gender difference in cybersickness.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124572701","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Generalized Model for Non-Contact Gesture Interaction with Function Application Independence","authors":"Song Wang, Hao Hu, Hao Long, Liang Liu, Yonghui Chen, Yadong Wu","doi":"10.1109/VRW58643.2023.00077","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00077","url":null,"abstract":"Immersive Analytics(IA), has been broadly used as an emerging technology in fields including Flow Field, Network Visualization, Volume Visualization, etc. The interaction methods are categorized into handle device and non-contact gestures, etc. In non-contact gestures, the peculiar attributes of the domain data and the subjective factors of the developers cause some variation in gesture actions across data domains. Through our investigation, we believe that there exist generic features of gesture actions across domains, including rotate and move gestures bound by interaction paradigms. Therefore, we proposed a generic model of non-contact gesture interaction to ensure data domain independence; we anticipate that the model can be efficiently adapted to multiple types of data domains in IA. We presented a gesture comfort evaluation model with micro-expressions to develop a gesture action set, introduced a gesture interaction design based on WYTIWYG (What You Touch Is What You Get) concept, defined a generic gesture interaction mechanism to prevent functional dependency, and implemented a non-contact generic gesture action paradigm for immersive environments. Subsequently, we recruited 30 participants to evaluate the three cases and recorded their micro-expressions. Finally, the micro-expressions were analyzed and a questionnaire survey was performed; the outcomes of the micro-expressions assessment were further cross-validated with the gesture evaluation model to confirm the validity of the proposed general model.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129482661","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-Time Data Monitoring of an Industry 4.0 Assembly Line using Pervasive Augmented Reality: First Impressions","authors":"Rafael Maio, Bernardo Marques, Andreia Santos, Pedro Ramalho, Duarte Almeida, Paulo Dias, B. Santos","doi":"10.1109/VRW58643.2023.00090","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00090","url":null,"abstract":"Industry 4.0 is the latest industrial revolution, tend to seamlessly integrate digital and physical worlds into industrial procedures in an effort to better support different stakeholders' needs. This work proposes a Pervasive Augmented Reality (AR) first prototype, developed in collaboration with industrial partners, to support real-time data monitoring and problem detection of an industrial assembly line. A Human-Centered Design (HCD) methodology was used to identify stakeholders' difficulties, needs, and challenges, as well as define requirements to guide the development of the prototype. The first impressions of these individuals are described, after an initial user study on the industrial shop floor, conducted to evaluate the proposed prototype, and collect feedback on how it can be improved to better support such contexts.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129798765","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"EGG Objective characterization of cybersickness symptoms towards navigation axis","authors":"Nana Tian, Khalil Haroun Achache, Ali Raed Ben Mustapha, R. Boulic","doi":"10.1109/VRW58643.2023.00068","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00068","url":null,"abstract":"The current study sought to objectively evaluate cybersickness by utilizing Electrogastrogram (EGG) physiological data in relation to three different navigation axes: Translational movement along the longitudinal and lateral axes, and rotation along the vertical Yaw axis. EGG was employed as it has been clinically identified as a valuable tool for capturing dysfunction in the stomach. This resulted in a 2x2x2 factorial design. Results from both subjective and objective measurements $(mathrm{N}=26, mathrm{F}=10)$ indicate that rotation along the Yaw axis is the primary factor influencing cybersickness. Additionally, it was found that individuals who are not susceptible to cybersickness do not exhibit any dominant factors. In summary, through the analysis of EGG data, several key physi-ological indicators of individual susceptibility to cybersickness were identified. The findings suggest that there is a positive correlation between the mean dominant frequency and the tachygastria ratio with individual susceptibility, while the normalgastria and normal-tachy ratio were found to be negatively correlated with individual susceptibility","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129897115","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"DC-“Born Again”: En/De-Roling, Character Identification, & VR Effects","authors":"Shane L Burrell","doi":"10.1109/VRW58643.2023.00329","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00329","url":null,"abstract":"With the rise of the metaverse, humans are increasingly spending time in mediated spaces, emphasizing the importance to understand how immersive experiences shape individual and collective behavior in various contexts (e.g., enterprise, entertainment). Two particularly important factors emerging in metaverse technologies relate to how users of virtual reality (VR) adapt and relinquish their character roles (en/de-roling), and the extent to which they identify with said roles.. Despite the prominent reliance of character roles in most VR experiences, and its noted influence on aspects of user experience (UX), little is understood about (a) how users adapt and relinquish their roles, and (b) how this process may shape affective and cognitive responses through altered identification. To address this gap, the current work employs a mixed-methods approach to understand how VR effects are influenced by these two factors.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126855124","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Preliminary Interview: Understanding XR Developers' Needs towards Open-Source Accessibility Support","authors":"Tiger F. Ji, Yaxin Hu, Yu Huang, Ruofei Du, Yuhang Zhao","doi":"10.1109/VRW58643.2023.00107","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00107","url":null,"abstract":"While extended reality (XR) technology is seeing increasing mainstream utilization, it is not accessible to users with disabilities and lacks support for XR developers to create accessibility features. In this study, we investigated XR developers' practices, challenges, needs when integrating accessibility in their projects. Our findings revealed developers' needs for open-source accessibility support, such as code examples of particular accessibility features alongside accessibility guidelines.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123341986","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mixed Reality Human Teleoperation","authors":"David G. Black, S. Salcudean","doi":"10.1109/VRW58643.2023.00083","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00083","url":null,"abstract":"For many applications, remote guidance and telerobotics provide great advantages. For example, tele-ultrasound can bring much-needed expert healthcare to isolated communities. However, existing tele-guidance methods have serious limitations. A new concept called human teleoperation leverages mixed reality, haptics, and high-speed communication to provide tele-guidance that is more tightly coupled than existing methods yet more accessible than telerobotics. This paper provides an overview of the human teleoperation concept and its application to tele-ultrasound. The concept and its impact are discussed, the graphics, communications, controls, and haptics subsystems are explained, and results are presented that show the system's efficacy. These include tests of the communication architecture, of human performance in tracking mixed reality signals, and of human teleoperation in a limited clinical use-case. The results show good potential for teleultrasound, as well as possible other applications of human teleoperation including remote maintenance, inspection, and training.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123492697","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}