Virtual Reality | Pub Date: 2024-07-19 | DOI: 10.1007/s10055-024-01027-7
Mantaj Singh, Peter Smitham, Suyash Jain, Christopher Day, Thomas Nijman, Dan George, David Neilly, Justin de Blasio, Michael Gilmore, Tiffany K. Gill, Susanna Proudman, Gavin Nimon
{"title":"Exploring the viability of Virtual Reality as a teaching method for knee aspiration","authors":"Mantaj Singh, Peter Smitham, Suyash Jain, Christopher Day, Thomas Nijman, Dan George, David Neilly, Justin de Blasio, Michael Gilmore, Tiffany K. Gill, Susanna Proudman, Gavin Nimon","doi":"10.1007/s10055-024-01027-7","DOIUrl":"https://doi.org/10.1007/s10055-024-01027-7","url":null,"abstract":"<p>Knee arthrocentesis is a simple procedure commonly performed by general practitioners and junior doctors. As such, doctors should be competent and comfortable in performing the technique by themselves; however, they need to be adequately trained. The best method to ensure practitioner proficiency is by optimizing teaching at an institutional level, thus educating all future doctors in the procedure. However, the Coronavirus Disease 19 (COVID-19) pandemic caused significant disruption to hospital teaching for medical students, which necessitated investigating the effectiveness of virtual reality (VR) as a platform to emulate hospital teaching of knee arthrocentesis. A workshop was conducted with 100 fourth-year medical students divided into three groups (A, B, and C), each receiving a pre-reading online lecture. Group A was placed in an Objective Structured Clinical Examination (OSCE) station where they were assessed by a blinded orthopaedic surgeon using the OSCE assessment rubric. Group B undertook a hands-on practice station prior to assessment, while Group C viewed a VR video (courtesy of the University of Adelaide’s Health Simulation), delivered via either a VR headset or a 360° surround immersion room, in addition to the hands-on station, followed by the OSCE. Upon completion of the workshop, students completed a questionnaire on their confidence with the procedure and the practicality of the VR station. OSCE scores were compared between Groups B and C to investigate the educational value of VR teaching.
On average, students with VR headsets reported higher confidence with the procedure and were more inclined to undertake it on their own. Students in Group C, who used the VR station prior to assessment, scored higher than the non-VR groups (Group A, 56%; Group B, 67%; Group C, 83%). The difference in mean scores between Group A and Group B was statistically significant (t(69) = 3.003, <i>p</i> = 0.003), as was the difference between Group B and Group C (t(62) = 5.400, <i>p</i> < 0.001). Within Group C, students who were given VR headsets scored higher than immersion-room students. The VR headset was beneficial in providing students with a representation of how knee arthrocentesis may be conducted in the hospital setting. While VR will not replace conventional in-hospital teaching, given current technological limitations, it serves as an effective teaching aid for arthrocentesis and has many other potential applications for a wide scope of medicine and surgical training.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"12 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141742636","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
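The between-group comparisons above are independent two-sample t-tests. As a rough illustration of the arithmetic only (the scores below are invented for the example, not the study's data), a pooled-variance t statistic can be computed in plain Python:

```python
import math
from statistics import mean, variance

def pooled_t(a, b):
    """Independent two-sample t statistic with pooled variance.

    Returns (t, degrees_of_freedom); assumes equal variances,
    as in a classic Student's t-test.
    """
    na, nb = len(a), len(b)
    df = na + nb - 2
    # Pooled variance combines the two sample variances,
    # weighted by their degrees of freedom.
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / df
    se = math.sqrt(sp2 * (1 / na + 1 / nb))
    return (mean(a) - mean(b)) / se, df

# Hypothetical OSCE scores (percent) -- NOT the study's data:
group_b = [60, 65, 70, 68, 72]
group_c = [80, 85, 82, 88, 79]
t, df = pooled_t(group_b, group_c)
```

With the study's actual per-student scores, the same computation would yield the reported t(69) and t(62) values; in practice a statistics library's independent-samples t-test routine would normally be used instead.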
Virtual Reality | Pub Date: 2024-07-18 | DOI: 10.1007/s10055-024-01038-4
Dominik Spinczyk, Grzegorz Rosiak, Krzysztof Milczarek, Dariusz Konecki, Jarosław Żyłkowski, Jakub Franke, Maciej Pech, Karl Rohmer, Karol Zaczkowski, Ania Wolińska-Sołtys, Piotr Sperka, Dawid Hajda, Ewa Piętka
{"title":"Towards overcoming barriers to the clinical deployment of mixed reality image-guided navigation systems supporting percutaneous ablation of liver focal lesions","authors":"Dominik Spinczyk, Grzegorz Rosiak, Krzysztof Milczarek, Dariusz Konecki, Jarosław Żyłkowski, Jakub Franke, Maciej Pech, Karl Rohmer, Karol Zaczkowski, Ania Wolińska-Sołtys, Piotr Sperka, Dawid Hajda, Ewa Piętka","doi":"10.1007/s10055-024-01038-4","DOIUrl":"https://doi.org/10.1007/s10055-024-01038-4","url":null,"abstract":"<p>In recent years, minimally invasive procedures for treating liver tumours have risen in popularity; among them is percutaneous thermoablation, conducted using image-guided navigation systems with mixed reality technology. However, the application of this method requires adequate training in using the employed system. In our study, we assessed which skills pose the greatest challenges in performing such procedures. The article proposes a training module with an innovative approach: trainees can practice the diagnosis, planning, and execution stages, and can physically perform the execution stage on a radiological phantom of the abdominal cavity. The proposed approach was evaluated with a set of four exercises corresponding to the three stages mentioned. The research group included 10 radiologists and 5 residents. Based on 20 clinical cases of liver tumors subjected to percutaneous thermoablation, we developed assessment tasks evaluating four skill categories: head-mounted display (HMD) use, ultrasound (US)/computed tomography (CT) image fusion interpretation, tracking system use, and the ability to insert a needle. The results were presented using a Likert scale.
The results of our study indicate that the most challenging aspect for radiology specialists is adapting to HMD gesture control, while residents point to intraoperative fusion images and respiratory movements of the liver as the most problematic. To support transfer of the skills to new patients, the module also allows a new hologram to be created for a different clinical case.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"70 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141742471","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-07-10 | DOI: 10.1007/s10055-024-01031-x
Aleksandra Zheleva, Lieven De Marez, Durk Talsma, Klaas Bombeke
{"title":"Intersecting realms: a cross-disciplinary examination of VR quality of experience research","authors":"Aleksandra Zheleva, Lieven De Marez, Durk Talsma, Klaas Bombeke","doi":"10.1007/s10055-024-01031-x","DOIUrl":"https://doi.org/10.1007/s10055-024-01031-x","url":null,"abstract":"<p>The advent of virtual reality (VR) technology has necessitated a reevaluation of quality of experience (QoE) models. While numerous recent efforts have been dedicated to creating comprehensive QoE frameworks, most of the factors studied as potential influencers of QoE are limited to single disciplinary viewpoints or specific user-related aspects. Furthermore, the majority of literature reviews in this domain have predominantly focused on academic sources, overlooking industry insights. To address these points, the current research took an interdisciplinary literature review approach to examine QoE literature covering both academic and industry sources from diverse fields (i.e., psychology, ergonomics, user experience, communication science, and engineering). Based on this rich dataset, we created a QoE model that illustrated 252 factors grouped into four branches: user, system, context, and content. The main finding of this review emphasized the substantial gap in the current research landscape, where complex interactions among user, system, context, and content factors in VR are overlooked. The current research not only identified this crucial disparity in existing QoE studies but also provided a substantial online repository of over 200 QoE-related factors.
The repository serves as an indispensable tool for future researchers aiming to construct a more holistic understanding of QoE.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"36 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141586942","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-07-09 | DOI: 10.1007/s10055-024-01032-w
Ashlee Gronowski, David Caelum Arness, Jing Ng, Zhonglin Qu, Chng Wei Lau, Daniel Catchpoole, Quang Vinh Nguyen
{"title":"The impact of virtual and augmented reality on presence, user experience and performance of Information Visualisation","authors":"Ashlee Gronowski, David Caelum Arness, Jing Ng, Zhonglin Qu, Chng Wei Lau, Daniel Catchpoole, Quang Vinh Nguyen","doi":"10.1007/s10055-024-01032-w","DOIUrl":"https://doi.org/10.1007/s10055-024-01032-w","url":null,"abstract":"<p>The fast growth of virtual reality (VR) and augmented reality (AR) head-mounted displays provides a new medium for interactive visualisations and visual analytics. Presence is the experience of consciousness within extended reality, and it has the potential to increase task performance. This project studies the impact that a sense of presence has on data visualisation performance and user experience under AR and VR conditions. A within-subjects study recruited 38 participants to complete interactive visualisation tasks in a novel immersive data analytics system for genomic data, under both AR and VR conditions, and measured speed, accuracy, preference, presence, and user satisfaction. Open-ended user experience responses were also collected. The results suggested that VR was more conducive to efficiency, effectiveness, and user experience, as well as offering insight into possible cognitive-load benefits for VR users.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"21 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141573938","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-07-09 | DOI: 10.1007/s10055-024-01030-y
Xiaotian Zhang, Weiping He, Mark Billinghurst, Yunfei Qin, Lingxiao Yang, Daisong Liu, Zenglei Wang
{"title":"Usability of visualizing position and orientation deviations for manual precise manipulation of objects in augmented reality","authors":"Xiaotian Zhang, Weiping He, Mark Billinghurst, Yunfei Qin, Lingxiao Yang, Daisong Liu, Zenglei Wang","doi":"10.1007/s10055-024-01030-y","DOIUrl":"https://doi.org/10.1007/s10055-024-01030-y","url":null,"abstract":"<p>Manual precise manipulation of objects is an essential skill in everyday life, and Augmented Reality (AR) is increasingly being used to support such operations. In this study, we investigate whether detailed visualizations of position and orientation deviations are helpful for AR-assisted manual precise manipulation of objects. We developed three AR instructions with different visualizations of deviations: the logical deviation baseline instruction, the precise numerical deviations-based instruction, and the intuitive color-mapped deviations-based instruction. All three instructions visualized the required directions for manipulation and the logical values of whether the object met the accuracy requirements. Additionally, the latter two instructions provided detailed visualizations of deviations through numerical text and color-mapping respectively. A user study was conducted with 18 participants to compare the three AR instructions. The results showed no significant differences in speed, accuracy, perceived ease of use, or perceived workload between the three AR instructions. We found that visualizing the required directions for manipulation and the logical values of whether the object met the accuracy requirements was sufficient to guide manual precise manipulation. The detailed visualizations of the real-time deviations could not improve the speed and accuracy of manual precise manipulation, and although they could improve the perceived ease of use and user experience, the effects were not significant.
Based on the results, several recommendations were provided for designing AR instructions to support precise manual manipulation.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"46 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141573937","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
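The abstract does not spell out how the deviations are computed, but a common formulation for this kind of AR guidance (a sketch under that assumption; the tolerance values below are hypothetical, not from the study) measures position error as Euclidean distance and orientation error as the angle of the relative rotation between unit quaternions:

```python
import math

def position_deviation(p, target):
    """Euclidean distance between current and target positions (metres)."""
    return math.dist(p, target)

def orientation_deviation(q, q_target):
    """Angle (radians) of the relative rotation between two unit
    quaternions (w, x, y, z); abs() handles the double cover q ~ -q."""
    dot = abs(sum(a * b for a, b in zip(q, q_target)))
    return 2.0 * math.acos(min(1.0, dot))

# Hypothetical accuracy requirements, for illustration only:
POS_TOL_M = 0.002                 # 2 mm position tolerance
ANG_TOL_RAD = math.radians(1.0)   # 1 degree orientation tolerance

def meets_requirements(p, target_p, q, target_q):
    """The kind of boolean 'met the accuracy requirements' value that
    all three AR instructions in the study displayed."""
    return (position_deviation(p, target_p) <= POS_TOL_M
            and orientation_deviation(q, target_q) <= ANG_TOL_RAD)
```

The numerical instruction would render these two deviation values as text, while the color-mapped instruction would map them onto a color scale; the baseline shows only the boolean.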
Virtual Reality | Pub Date: 2024-07-06 | DOI: 10.1007/s10055-024-01028-6
Artem S. Yashin, Daniil S. Lavrov, Eugeny V. Melnichuk, Valery V. Karpov, Darisy G. Zhao, Ignat A. Dubynin
{"title":"Robot remote control using virtual reality headset: studying sense of agency with subjective distance estimates","authors":"Artem S. Yashin, Daniil S. Lavrov, Eugeny V. Melnichuk, Valery V. Karpov, Darisy G. Zhao, Ignat A. Dubynin","doi":"10.1007/s10055-024-01028-6","DOIUrl":"https://doi.org/10.1007/s10055-024-01028-6","url":null,"abstract":"<p>Mobile robots have many applications in the modern world. The autonomy of robots is increasing, but critical cases like search and rescue missions must involve the possibility of human intervention for ethical reasons and safety. To achieve effective human–robot interaction, the operator needs to have a sense of agency (SoA) over the activities of the robot. One possible way to increase one's SoA in remote control could be the use of VR technology. The remote control situation has some important features, so indicators of SoA need to be reproduced there independently. In our study, participants controlled a mobile robot using either a monitor or a VR-headset as an output device. In both cases, active control was contrasted with passive observation of the robot's movement. In each trial, participants estimated the distance traveled by the robot—a putative implicit indicator of SoA. A significant difference between subjective distance estimates was found in the active and passive conditions with the monitor, but not in the active and passive conditions with VR. The effect obtained in the monitor conditions suggests that distance estimates can be used as an implicit indicator of SoA in robot remote control. 
We believe that the lack of difference between the active and passive conditions in VR was caused by motion sickness due to a mismatch of visual and vestibular sensory cues, leading to a weakened SoA.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"87 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141573939","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-07-02 | DOI: 10.1007/s10055-024-01026-8
Tae Hee Lee, Young Ju Jeong
{"title":"Spatial resolution measurement method for 3D displays from contrast modulation","authors":"Tae Hee Lee, Young Ju Jeong","doi":"10.1007/s10055-024-01026-8","DOIUrl":"https://doi.org/10.1007/s10055-024-01026-8","url":null,"abstract":"<p>Augmented Reality 3D head-up displays use an autostereoscopic 3D display as the panel. The 3D optical unit of autostereoscopic 3D displays controls the direction of the light rays at each pixel, allowing users to enjoy a 3D world without glasses. However, these 3D optics cause image quality degradation, and the resulting deterioration of resolution has a serious impact on 3D image quality. Therefore, it is important to properly measure the 3D resolution according to the 3D optics and analyze its impact. In this study, a method for measuring the spatial resolution of 3D displays using contrast modulation is proposed. We first describe the conventional, standardized 2D resolution measurement methods and, based on them, propose a 3D resolution method. The spatial and frequency signal responses of 3D displays were investigated. The first method is based on the predominant frequency series; the second is based on contrast modulation. In experiments with 3D displays, 3D resolution was measured using the proposed method, and the relationship between the parameters of the 3D optics and resolution was examined.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"154 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141530627","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
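Contrast modulation in display measurement is commonly defined as the Michelson contrast of a measured line-pair (grille) pattern, C = (Lmax - Lmin)/(Lmax + Lmin), and a resolution limit can then be taken as the finest grille whose modulation stays above a threshold. A minimal sketch of that idea (the 0.25 threshold and luminance values are illustrative, not taken from the paper):

```python
def contrast_modulation(luminances):
    """Michelson contrast of luminance samples measured across a
    black/white grille pattern: (Lmax - Lmin) / (Lmax + Lmin)."""
    lmax, lmin = max(luminances), min(luminances)
    return (lmax - lmin) / (lmax + lmin)

def resolvable(luminances, threshold=0.25):
    """A grille counts as resolved while its modulation exceeds the
    threshold (0.25 is illustrative; measurement standards vary)."""
    return contrast_modulation(luminances) > threshold

# A coarse grille keeps high modulation (luminances in cd/m^2)...
coarse = [200.0, 5.0, 198.0, 6.0]
# ...while optical blur in a finer grille collapses it.
fine = [110.0, 90.0, 112.0, 88.0]
```

For a 3D display, the same measurement would be repeated through the 3D optical unit, where crosstalk and ray redirection lower the modulation at a given grille pitch.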
Virtual Reality | Pub Date: 2024-07-02 | DOI: 10.1007/s10055-024-01021-z
Christyan Cruz Ulloa, David Domínguez, Jaime del Cerro, Antonio Barrientos
{"title":"Analysis of MR–VR tele-operation methods for legged-manipulator robots","authors":"Christyan Cruz Ulloa, David Domínguez, Jaime del Cerro, Antonio Barrientos","doi":"10.1007/s10055-024-01021-z","DOIUrl":"https://doi.org/10.1007/s10055-024-01021-z","url":null,"abstract":"<p>The development of immersive technologies in recent years has facilitated the control and execution of highly complex tasks in robotic systems. At the same time, exploration and manipulation tasks in unknown environments have been one of the main challenges in search and rescue (SAR) robotics. Due to the complexity and uncertainty involved in autonomous manipulation tasks in unstructured environments, these are usually tele-operated initially. This article presents a comparative study of Mixed Reality (MR, HoloLens) and Virtual Reality (VR, HTC Vive) methods for tele-operating legged-manipulator robots in the context of search and rescue. For this purpose, a tele-operation method was established for the comparison, developing VR–MR interfaces with the same contextualization and operational functionality for mission management and robot control, in which a user commands, via hand gestures, a robotic set composed of a quadrupedal robot equipped with a 6-degrees-of-freedom (6DoF) manipulator. A set of metrics is proposed for the comparative evaluation of the interfaces, considering parameters that characterize operability in the context of the mission (latencies, physical parameters of the equipment, etc.) as well as operator performance (required training, confidence levels, etc.).
The experimental phase was conducted using both on-site and remote operations to evaluate and categorize the advantages and disadvantages of each method.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"121 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141514373","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-07-01 | DOI: 10.1007/s10055-024-01019-7
Riham Alieldin, Sarah Peyre, Anne Nofziger, Raffaella Borasi
{"title":"Effectiveness of immersive virtual reality in teaching empathy to medical students: a mixed methods study","authors":"Riham Alieldin, Sarah Peyre, Anne Nofziger, Raffaella Borasi","doi":"10.1007/s10055-024-01019-7","DOIUrl":"https://doi.org/10.1007/s10055-024-01019-7","url":null,"abstract":"<p>Empathy in healthcare has been associated with positive outcomes such as increased patient satisfaction and reduced medical errors. However, research has indicated a decline in empathy among medical professionals. This study examined the effectiveness of Immersive Virtual Reality (IVR) for empathy training in medical education. A convergent mixed-methods pretest-posttest design was utilized. Participants were first-year medical students who engaged in an IVR educational intervention for empathy training, built around a scenario depicting older adults struggling with social isolation. The Jefferson Scale of Empathy (JSE) questionnaire was administered before and after the intervention to measure the change in empathy levels, and the pre-/post-test JSE scores were analyzed using a paired-sample t-test. Nineteen qualitative semi-structured interviews were conducted immediately after the IVR experience, and follow-up interviews were conducted six months later. Qualitative data from the interview transcripts were analyzed using a thematic and content analysis approach to capture individual experiences. Students (n = 19) scored 5.94 points higher on the posttest JSE questionnaire than on the pretest (p < 0.01), indicating an improvement in empathy levels. Qualitative analysis showed that the IVR training was well received by the students as a valuable empathy-teaching tool. Immersion, presence, and embodiment were identified as the main features of IVR technology that enhanced empathy and understanding of patients’ experiences. The debriefing sessions were identified as a key element of the training.
IVR-based training could be an effective teaching tool for empathy training in medical education and one that is well received by learners. Results from the study offer preliminary evidence that using IVR to evoke empathy is achievable.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"52 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141509036","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-06-26 | DOI: 10.1007/s10055-024-01020-0
Sergio Valmorisco, Laura Raya, Alberto Sanchez
{"title":"Enabling personalized VR experiences: a framework for real-time adaptation and recommendations in VR environments","authors":"Sergio Valmorisco, Laura Raya, Alberto Sanchez","doi":"10.1007/s10055-024-01020-0","DOIUrl":"https://doi.org/10.1007/s10055-024-01020-0","url":null,"abstract":"<p>The personalization of user experiences through recommendation systems has been extensively explored in Internet applications, but it has yet to be fully addressed in Virtual Reality (VR) environments. The complexity of managing geometric 3D data, computational load, and natural interactions poses significant challenges for real-time adaptation in these immersive experiences. However, tailoring VR environments to individual user needs and interests holds promise for enhancing user experiences. In this paper, we present Virtual Reality Environment Adaptation through Recommendations (<i>VR-EAR</i>), a framework designed to address this challenge. <i>VR-EAR</i> employs customizable object metadata and a hybrid recommendation system modeling implicit user feedback in VR environments. We utilize VR optimization techniques to ensure efficient performance. To evaluate our framework, we designed a virtual store where product locations dynamically adjust based on user interactions. Our results demonstrate the effectiveness of <i>VR-EAR</i> in adapting and personalizing VR environments in real time.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"5 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141509038","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
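The abstract does not publish VR-EAR's scoring logic; as a loose sketch of how implicit VR feedback (gaze, grabs, and similar interaction events) might be aggregated into per-object scores for a recommender, with all object names and event weights below being hypothetical:

```python
from collections import defaultdict

def implicit_scores(events, weights):
    """Aggregate implicit feedback events into per-object scores.

    `events` is a list of (object_id, event_kind, duration_seconds);
    `weights` maps event kinds to relative importance. Both the event
    kinds and the weighting are illustrative guesses, not VR-EAR's.
    """
    scores = defaultdict(float)
    for obj, kind, duration in events:
        scores[obj] += weights.get(kind, 0.0) * duration
    return dict(scores)

def rank_objects(events, weights):
    """Objects ordered from most to least implicitly preferred."""
    s = implicit_scores(events, weights)
    return sorted(s, key=s.get, reverse=True)

# Hypothetical session log from a virtual store:
events = [
    ("lamp", "gaze", 4.0),
    ("chair", "grab", 1.0),
    ("lamp", "grab", 2.0),
    ("rug", "gaze", 1.0),
]
weights = {"gaze": 0.5, "grab": 2.0}  # grabbing counts more than looking
ranking = rank_objects(events, weights)
```

A framework like the one described could then feed such a ranking into its hybrid recommender and move the highest-scoring products to more prominent locations in the store.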