Vuthea Chheang, Vikram Apilla, P. Saalfeld, C. Boedecker, T. Huber, F. Huettl, H. Lang, B. Preim, C. Hansen
{"title":"Collaborative VR for Liver Surgery Planning using Wearable Data Gloves: An Interactive Demonstration","authors":"Vuthea Chheang, Vikram Apilla, P. Saalfeld, C. Boedecker, T. Huber, F. Huettl, H. Lang, B. Preim, C. Hansen","doi":"10.1109/VRW52623.2021.00268","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00268","url":null,"abstract":"Preoperative planning for liver surgery is a critical procedure to assess a potential resection, and it supports surgeons in defining the affected vessels and the resection volume. Traditional surgical planning systems are widely used to support planning with desktop-based 2D and 3D visualizations. However, desktop-based systems offer limited interactions and visualizations compared to virtual reality (VR) [3]. A suitable technique to support collaboration among surgeons is required. Our previous works [1], [2] illustrate that collaborative VR is essential to enhance communication, teamwork, and over-distance collaboration. In this work, we present a collaborative VR prototype to support liver surgery planning with intuitive interactions using wearable data gloves and VR controllers. Users can explore the patient data in both 2D and 3D representations. Thereafter, a virtual resection is specified by drawing lines on the 3D model representation. The virtual resection is further refined to keep safety margins around the tumors. Moreover, real-time risk maps are visualized to support the surgeons during the modification. Finally, the results of the virtual resection are visualized as resection volumes with their indicated amounts and colors.
Future work aims to conduct an extensive clinical study to compare the suitability of these input devices.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"17 5","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120985274","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Soumyajit Chakraborty, Jeanine K. Stefanucci, Sarah H. Creem-Regehr, Bobby Bodenheimer
{"title":"Distance Estimation with Mobile Augmented Reality in Action Space: Effects of Animated Cues","authors":"Soumyajit Chakraborty, Jeanine K. Stefanucci, Sarah H. Creem-Regehr, Bobby Bodenheimer","doi":"10.1109/VRW52623.2021.00034","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00034","url":null,"abstract":"Augmented reality is standard on many modern smartphone platforms. The distance of virtual objects seen through the smartphone display must be perceived accurately to enable easy, high-fidelity interaction with them. We investigate whether distance perception through mobile augmented reality devices is affected by an animated avatar that walks and is positioned at distances ranging from near to far action space. We conduct a distributed experiment \"in the wild\" to investigate these effects.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115024709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Christian Eichhorn, Martin Lurz, D. A. Plecher, Sandro Weber, Monika Wintergerst, B. Kaiser, S. Holzmann, C. Holzapfel, H. Hauner, K. Gedrich, Georg Groh, M. Böhm, H. Krcmar, G. Klinker
{"title":"Inspiring healthy Food Choices in a Virtual Reality Supermarket by adding a tangible Dimension in the Form of an Augmented Virtuality Smartphone","authors":"Christian Eichhorn, Martin Lurz, D. A. Plecher, Sandro Weber, Monika Wintergerst, B. Kaiser, S. Holzmann, C. Holzapfel, H. Hauner, K. Gedrich, Georg Groh, M. Böhm, H. Krcmar, G. Klinker","doi":"10.1109/VRW52623.2021.00156","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00156","url":null,"abstract":"We want to understand changing shopping behaviour as influenced by health-targeting nutrition apps on mobile devices. To achieve that, we have built a realistic Virtual Reality (VR) supermarket simulation and addressed core aspects such as handling virtual products and preventing cybersickness, rounded off with an initial pilot study. On top of that, we built a virtual replica smartphone in VR with nutrition-related functionality. This has been extended with an Augmented Virtuality (AV) feature that enables us to track the screen of a participant’s own smartphone, allowing us to integrate real-world apps and letting the user interact with them during the simulation. To achieve high-quality tracking, we propose a hybrid approach utilizing an add-on RGB camera on the VR headset, fused with data provided through a WLAN connection in the case of self-developed apps.
This enables users to manipulate the simulation from within the smartphone app, introducing a versatile, usability-centered controller, since users can handle their own phone naturally.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"57 6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116384076","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Ashu Adhikari, Daniel Zielasko, Alexander Bretin, Markus von der Heyde, E. Kruijff, B. Riecke
{"title":"Integrating Continuous and Teleporting VR Locomotion into a Seamless \"HyperJump\" Paradigm","authors":"Ashu Adhikari, Daniel Zielasko, Alexander Bretin, Markus von der Heyde, E. Kruijff, B. Riecke","doi":"10.1109/VRW52623.2021.00074","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00074","url":null,"abstract":"Virtual reality comes with many promises. It allows us to explore the world in ways that were not possible before, like flying untethered or teleporting, experiencing a dream-like state while fully awake, or having a world that responds to our every whim through the blink of our eyes or a gesture of our hands. To fulfill the many potentials it bears, VR should allow users to navigate freely through the virtual worlds they create.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"15 4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122411253","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sahil Athrij, A. Santhosh, R. Jayashankar, Arun Padmanabhan, S. Mathew
{"title":"Protocol for Dynamic Load Distribution in Web-Based AR","authors":"Sahil Athrij, A. Santhosh, R. Jayashankar, Arun Padmanabhan, S. Mathew","doi":"10.1109/VRW52623.2021.00093","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00093","url":null,"abstract":"In a Web-based Augmented Reality (AR) application, an immersive experience requires precise object detection, realistic model rendering, and smooth occlusion in real time. Achieving these objectives on-device demands heavy computational capabilities unavailable on most mobile devices; cloud computing can provide them, but it introduces network latency issues. In this work, we propose a new network protocol named DLDAR (Dynamic Load Distribution in Web-based AR) that facilitates and standardizes methods for dynamically dividing computation between client and server, based on device and network conditions, to balance latency and quality.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122780018","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Aurélie Congès, Peipei Yang, F. Bénaben, Jacob Graham
{"title":"Using Virtual Reality to Facilitate Common Operational Pictures’ Representation","authors":"Aurélie Congès, Peipei Yang, F. Bénaben, Jacob Graham","doi":"10.1109/VRW52623.2021.00067","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00067","url":null,"abstract":"During a crisis, different organizations are involved, each with its own jargon and communication devices. Information about the situation is needed by all, and it is vital to be able to share the data. To centralize that data, Common Operational Pictures (COPs) are implemented, but they do not remove the risk of information overload. We propose to use virtual reality to create a virtual environment that improves COP representations, thus improving situational awareness and collaborative decision making.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114580378","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Danhua Zhang, M. Khadar, B. Schumacher, Madhava Raveendra, Sam Adeniyi, Fei Wu, Sahar A. Aseeri, Evan Suma Rosenberg
{"title":"COVID-Vision: A Virtual Reality Experience to Encourage Mindfulness of Social Distancing in Public Spaces","authors":"Danhua Zhang, M. Khadar, B. Schumacher, Madhava Raveendra, Sam Adeniyi, Fei Wu, Sahar A. Aseeri, Evan Suma Rosenberg","doi":"10.1109/VRW52623.2021.00231","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00231","url":null,"abstract":"Social distancing is currently the most effective known countermeasure against the rapid proliferation of the virus that causes COVID-19. This project aims to encourage mindfulness about maintaining interpersonal distance in shared public spaces through a multi-user virtual reality experience that simulates shopping in a grocery store. The virtual environment is populated with non-player characters that navigate through the store and also supports up to 20 concurrent live users represented as avatars. Real-time feedback is implemented using a dynamic visual effect that reacts to physical proximity, and comparative performance metrics are also provided for users to reflect on after the task is completed.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"1836 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129808202","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Yuan Li, Sang Won Lee, D. Bowman, David Hicks, W. Lages, Akshay Sharma
{"title":"ARCritique: Supporting Remote Design Critique of Physical Artifacts through Collaborative Augmented Reality","authors":"Yuan Li, Sang Won Lee, D. Bowman, David Hicks, W. Lages, Akshay Sharma","doi":"10.1145/3565970.3567700","DOIUrl":"https://doi.org/10.1145/3565970.3567700","url":null,"abstract":"Design critique sessions require students and instructors to jointly view and discuss physical artifacts. However, in remote learning scenarios, available tools (such as videoconferencing) are insufficient due to ineffective, inefficient communication of spatial information. This paper presents ARCritique, a mobile augmented reality application that combines KinectFusion and ARKit to allow users to 1) scan artifacts and share the resulting 3D models, 2) view the model simultaneously in a shared virtual environment from remote physical locations, and 3) point to and draw on the model to aid communication. A preliminary evaluation of ARCritique revealed great potential for supporting remote design education.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130144571","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Tatsuya Kure, Shunichi Kasahara
{"title":"LighterBody: RNN based Anticipated Virtual Body Makes You Feel Lighter","authors":"Tatsuya Kure, Shunichi Kasahara","doi":"10.1109/VRW52623.2021.00163","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00163","url":null,"abstract":"Virtual body representation has shown the potential of intervening in the sense of body. To investigate how a temporal shift of body representation affects the user’s kinetic sensation, we developed a system that anticipates body movement with an RNN. We then conducted a user study to compare the effect of the anticipated body movement against the system baseline. Results revealed that the transition from the baseline to the anticipated body induced a feeling of a lighter body weight, and the opposite transition induced a heavier feeling. Our work highlights the potential of interactive manipulation of the full-body kinetic sensation using virtual body representation.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"131 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129142476","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Yuki Kato, M. Sugimoto, M. Inami, M. Kitazaki
{"title":"Communications in Virtual Environment Improve Interpersonal Impression","authors":"Yuki Kato, M. Sugimoto, M. Inami, M. Kitazaki","doi":"10.1109/VRW52623.2021.00189","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00189","url":null,"abstract":"Pseudo physical touch is used for communication in virtual environments such as VRChat. We aimed to test whether pseudo-touch communication affects social impression in a virtual environment. Nineteen participants performed a controlled experiment with a partner, who was an experimenter, under three types of communication: no touch, pseudo touch, and actual touch. Subjective ratings of the partner's attractiveness and the ease of communication with the partner increased in all conditions, suggesting that communication in virtual environments improves interpersonal attraction and communicability with or without physical or pseudo touch.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114212669","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}