"Towards Cross-Reality Interaction and Collaboration: A Comparative Study of Object Selection and Manipulation in Reality and Virtuality"
Shuhao Zhang, Yue Li, K. Man, Yong Yue, Jeremy S. Smith
2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), March 2023. DOI: https://doi.org/10.1109/VRW58643.2023.00075

Abstract: Cross-Reality (CR) is an important topic in the research of multiuser collaborative systems. It allows users to participate anywhere along the reality-virtuality continuum and to select appropriate interactive systems to work with, such as Virtual Reality Head-Mounted Displays (VR HMDs). However, there is limited work showing how interaction in VR differs from the more commonly used Personal Computers (PCs) and tablet devices in terms of object selection and manipulation. In this paper, we present a comparative study that investigated how users perform, and perceive workload on, 3D object selection and manipulation tasks using different devices (PC, tablet, and VR). We recorded time and accuracy as objective task performance measures, and users' self-reported workload as a subjective measure. Our results revealed that, unlike the PC and tablet, whose performances were biased, VR showed balanced performance and great potential for complex tasks.

{"title":"Applications of Interactive Style Transfer to Virtual Gallery","authors":"Xin-Han Wu, Hsin-Ju Chien, Y. Hung, Y. Huang","doi":"10.1109/VRW58643.2023.00280","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00280","url":null,"abstract":"Since the outbreak of the epidemic in recent years, many events have been held online to avoid spreading illness, such as the virtual gallery. Therefore, we propose an interactive style transfer virtual gallery system. Users can perform style transfer on 2D objects and 3D objects, where 2D objects include pictures and paper jams, and 3D objects include frames and 3D artworks. The results show that it not only makes the viewing process more interesting but also deepens the impression of the artwork through the process of choosing different styles.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"114 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123367230","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Designing Navigation Tool for Immersive Analytics in AR","authors":"Xiaoyan Zhou","doi":"10.1109/VRW58643.2023.00330","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00330","url":null,"abstract":"As augmented and virtual reality technologies continue to mature, many studies have looked into navigation in 3D visualizations in recent years. However, most studies that examine navigation in 3D visualizations were conducted using a virtual reality environment (VR). Navigation in augmented reality (AR) may have different needs and challenges which have remained undiscovered. My research aims to investigate the elements that influence navigation performance in AR and how to improve the efficiency and user experience of navigation in AR by design, development, and user studies of a navigation tool EaseNav that supports users' navigation procedure.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126167416","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"FluidPlaying: Efficient Adaptive Simulation for Highly Dynamic Fluid","authors":"Sinuo Liu, Xiaojuan Ban, Sheng Li, Haokai Zeng, Xiaokun Wang, Yanrui Xu, Fei Zhu, Guoping Wang","doi":"10.1109/VRW58643.2023.00258","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00258","url":null,"abstract":"We present FliudPlaying, a novel dynamic level-based spatially adaptive simulation method that can handle highly dynamic fluid efficiently. To capture the subtle detail of the fluid surface, the high-resolution simulation is performed not only at the free surface but also at those regions with high vorticity levels and velocity difference levels. To minimize the density error, an online optimization scheme is used when increasing the resolution by particle splitting. We also proposed a neighbor-based splash enhancement to compensate for the loss of dynamic details. Compared with the high-resolution simulation baseline, our method can achieve over 3× speedups while consuming only less than 10% computational resources. Furthermore, our method can make up for the loss of high-frequency details caused by the spatial adaptation, and provide more realistic dynamics in particle-based fluid simulation.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128251125","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Vitalizing cultural memory with immersive data storytelling","authors":"Yongning Zhu, Mengyue Liu, Zeru Lou, Rongyu Li, ZhongFu Tie, Wei Huang, Qingyun Diao","doi":"10.1109/VRW58643.2023.00101","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00101","url":null,"abstract":"Through decades of cultural heritage and cultural memory digitization, a massive amount of data has been collected and organized. There is an increasing interest in and demand for making the data approachable, explorable, and usable to the public audience, vi-sualizing the data with multiple scales, and expressing humanity knowledge with aesthetically appealing images.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128483064","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
"Does interpupillary distance (IPD) relate to immediate cybersickness?"
Taylor A. Doty, Jonathan W. Kelly, M. Dorneich, Stephen B. Gilbert
2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), March 2023. DOI: https://doi.org/10.1109/VRW58643.2023.00173

Abstract: Widespread adoption of virtual reality (VR) will likely be limited by the common occurrence of cybersickness. Cybersickness susceptibility varies across individuals, and previous research reported that interpupillary distance (IPD) may be a factor. However, that work emphasized cybersickness recovery rather than cybersickness immediately after exposure. The current study (N=178) examined whether the mismatch between the user's IPD and the VR headset's IPD setting contributes to immediate cybersickness. Multiple linear regression indicated that gender and prior sickness due to screens were significant predictors of immediate cybersickness. However, no significant relationship between IPD mismatch and immediate cybersickness was observed.

"Investigation of Thermal Perception and Emotional Response in Augmented Reality using Digital Biomarkers: A Pilot Study"
Sangjun Eom, S. Kim, Yihang Jiang, Ryan Jay Chen, Ali R. Roghanizad, M. Rosenthal, J. Dunn, M. Gorlatova
2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), March 2023. DOI: https://doi.org/10.1109/VRW58643.2023.00042

Abstract: Dialectical behavior therapy (DBT) is an evidence-based psychotherapy that helps patients learn skills to regulate emotions as a central strategy to improve life functioning. However, DBT skills require a long-term and consistent commitment, typically to group therapy over the course of months. Patients who might benefit may find this approach undesirable; it can be challenging to transfer learning from therapy sessions to daily life, and there is no way to personalize skills learning based on individualized needs. In this paper, we propose the use of Augmented Reality (AR) and digital biomarkers to make DBT skill exercises more immersive and personalized by using physiological data as real-time feedback. To explore the feasibility of AR-based DBT skill implementation, we developed AR-based DBT skill exercises that manipulate the user's thermal perception by visualizing different thermal information in holograms. We conducted a user study to evaluate the impact of AR on changing the thermal perception and emotional states of the user, with an analysis of physiological data collected from wearable devices.

"Inducing joint attention between users by visual guidance with blur effects"
Nikolaos Chatziantoniou, Akimi Oyanagi, Kenichiro Ito, K. Aoyama, H. Kuzuoka, Tomohiro Amemiya
2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), March 2023. DOI: https://doi.org/10.1109/VRW58643.2023.00237

Abstract: Attracting a learner's attention towards desired regions is imperative for immersive training applications. However, direct view manipulation can induce cybersickness, and overt visual cues decrease immersion and distract from the training process. We propose and evaluate a technique to subtly and effortlessly guide the follower's visual attention by blurring their field of view based on the leader's head orientation. We compared the performance of our technique against a cross-shaped head-gaze indicator in moving-viewport alignment and object-searching tasks. Results suggest that our technique can prompt joint attention between users in shared-perspective virtual reality systems, with less induced cybersickness, workload, and distraction.

{"title":"Multimodal Activity Detection for Natural Interaction with Virtual Human","authors":"Kai Wang, Shiguo Lian, Haiyan Sang, Wen Liu, Zhaoxiang Liu, Fuyuan Shi, Hui Deng, Zeming Sun, Zezhou Chen","doi":"10.1109/VRW58643.2023.00178","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00178","url":null,"abstract":"Natural face-to-face human-robot conversation is one of the most important features for virtual human in virtual reality and metaverse. However, the unintended wake-up of robot is often activated with only Voice Activity Detection (VAD). To address this issue, we propose a Multimodal Activity Detection (MAD) scheme, which considers not only voice but also gaze, lip-movement and talking content to decide whether the person is activating the robot. A dataset for large screen-based virtual human conversation is collected from various challenging cases. The experimental results show that the proposed MAD greatly outperforms VAD-only method.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130051018","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
"Shrink or grow the kids? Scale cognition in an immersive virtual environment for K-12 summer camp"
Linfeng Wu, Karen B. Chen, Brian Sekelsky, Matthew Peterson, Tyler Harper-Gampp, Cesar Delgado
2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), March 2023. DOI: https://doi.org/10.1109/VRW58643.2023.00203

Abstract: Virtual reality (VR) has been widely used for education and affords embodied learning experiences. Here we describe Scale Worlds (SW), an immersive virtual environment that allows users to shrink or grow by powers of ten (10×) and experience entities from molecular to astronomical scales, and report students' impressions and outcomes from experiencing SW in a CAVE (Figure 1) during experiential summer outreach sessions. Data collected from post-visit surveys of 69 students, together with field observations, revealed that VR technologies enabled interactive learning experiences; encouraged active engagement and discussions among participating students; enhanced the understanding of size and scale; and increased interest in STEM careers.
