Virtual Reality | Pub Date: 2024-02-22 | DOI: 10.1007/s10055-024-00957-6
Svetlana Wähnert, Ulrike Schäfer
{"title":"Sensorimotor adaptation in virtual reality: Do instructions and body representation influence aftereffects?","authors":"Svetlana Wähnert, Ulrike Schäfer","doi":"10.1007/s10055-024-00957-6","DOIUrl":"https://doi.org/10.1007/s10055-024-00957-6","url":null,"abstract":"<p>Perturbations in virtual reality (VR) lead to sensorimotor adaptation during exposure, but also to aftereffects once the perturbation is no longer present. An experiment was conducted to investigate the impact of different task instructions and body representation on the magnitude and the persistence of these aftereffects. Participants completed the paradigm of sensorimotor adaptation in VR. They were assigned to one of three groups: control group, misinformation group or arrow group. The misinformation group and the arrow group were each compared to the control group to examine the effects of instruction and body representation. The misinformation group was given the incorrect instruction that in addition to the perturbation, a random error component was also built into the movement. The arrow group was presented a virtual arrow instead of a virtual hand. It was hypothesised that both would lead to a lower magnitude and persistence of the aftereffect because the object identity between hand and virtual representation would be reduced, and errors would be more strongly attributed to external causes. Misinformation led to lower persistence, while the arrow group showed no significant differences compared to the control group. The results suggest that information about the accuracy of the VR system can influence the aftereffects, which should be considered when developing VR instructions. No effects of body representation were found. One possible explanation is that the manipulated difference between abstract and realistic body representation was too small in terms of object identity.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"43 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139954282","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-02-19 | DOI: 10.1007/s10055-023-00937-2
Iason Karakostas, Aikaterini Valakou, Despoina Gavgiotaki, Zinovia Stefanidi, Ioannis Pastaltzidis, Grigorios Tsipouridis, Nikolaos Kilis, Konstantinos C. Apostolakis, Stavroula Ntoa, Nikolaos Dimitriou, George Margetis, Dimitrios Tzovaras
{"title":"A real-time wearable AR system for egocentric vision on the edge","authors":"Iason Karakostas, Aikaterini Valakou, Despoina Gavgiotaki, Zinovia Stefanidi, Ioannis Pastaltzidis, Grigorios Tsipouridis, Nikolaos Kilis, Konstantinos C. Apostolakis, Stavroula Ntoa, Nikolaos Dimitriou, George Margetis, Dimitrios Tzovaras","doi":"10.1007/s10055-023-00937-2","DOIUrl":"https://doi.org/10.1007/s10055-023-00937-2","url":null,"abstract":"<p>Real-time performance is critical for Augmented Reality (AR) systems as it directly affects responsiveness and enables the timely rendering of virtual content superimposed on real scenes. In this context, we present the DARLENE wearable AR system, analysing its specifications, overall architecture and core algorithmic components. DARLENE comprises AR glasses and a wearable computing node responsible for several time-critical computation tasks. These include computer vision modules developed for the real-time analysis of dynamic scenes supporting functionalities for instance segmentation, tracking and pose estimation. To meet real-time requirements in limited resources, concrete algorithmic adaptations and design choices are introduced. The proposed system further supports real-time video streaming and interconnection with external IoT nodes. To improve user experience, a novel approach is proposed for the adaptive rendering of AR content by considering the user’s stress level, the context of use and the environmental conditions for adjusting the level of presented information towards enhancing their situational awareness. Through extensive experiments, we evaluate the performance of individual components and end-to-end pipelines. As the proposed system targets time-critical security applications where it can be used to enhance police officers’ situational awareness, further experimental results involving end users are reported with respect to overall user experience, workload and evaluation of situational awareness.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"105 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-02-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139926913","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-02-14 | DOI: 10.1007/s10055-023-00930-9
Logan Clark, Mohamad El Iskandarani, Sara Riggs
{"title":"Reaching interactions in virtual reality: the effect of movement direction, hand dominance, and hemispace on the kinematic properties of inward and outward reaches","authors":"Logan Clark, Mohamad El Iskandarani, Sara Riggs","doi":"10.1007/s10055-023-00930-9","DOIUrl":"https://doi.org/10.1007/s10055-023-00930-9","url":null,"abstract":"<p>Recent literature has revealed that when users reach to select objects in VR, they can adapt how they move (i.e., the kinematic properties of their reaches) depending on the: (1) direction they move, (2) hand they use, and (3) side of the body where the movement occurs. In the present work, we took a more detailed look at how kinematic properties of reaching movements performed in VR change as a function of movement direction for reaches performed on each side of the body using each hand. We focused on reaches in 12 different directions that either involved moving inward (toward the body midline) or outward (away from the body midline). Twenty users reached in each direction on both left and right sides of their body, using both their dominant and non-dominant hands. The results provided a fine-grained account of how kinematic properties of virtual hand reaches change as a function of <i>movement direction</i> when users reach on either side of their body using either hand. The findings provide practitioners insights on how to interpret the kinematic properties of reaching behaviors in VR, which has applicability in emerging contexts that include detecting VR usability issues and using VR for stroke rehabilitation.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"61 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139757598","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-02-13 | DOI: 10.1007/s10055-024-00952-x
Stefano Masneri, Ana Domínguez, Guillermo Pacho, Mikel Zorrilla, Mikel Larrañaga, Ana Arruarte
{"title":"A collaborative AR application for education: from architecture design to user evaluation","authors":"Stefano Masneri, Ana Domínguez, Guillermo Pacho, Mikel Zorrilla, Mikel Larrañaga, Ana Arruarte","doi":"10.1007/s10055-024-00952-x","DOIUrl":"https://doi.org/10.1007/s10055-024-00952-x","url":null,"abstract":"<p>Augmented reality applications can be used in an educational context to facilitate learning. In particular, augmented reality has been successfully used as a tool to boost students’ engagement and to improve their understanding of complex topics. Despite this, augmented reality usage is still not common in schools and it still offers mostly individual experiences, lacking collaboration capabilities which are of paramount importance in a learning environment. This work presents an application called <i>ARoundTheWorld</i>, a multiplatform augmented reality application for education. It is based on a software architecture, designed with the help of secondary school teachers, that provides interoperability, multi-user support, integration with learning management systems and data analytics capabilities, thus simplifying the development of collaborative augmented reality learning experiences. The application has been tested by 44 students and 3 teachers from 3 different educational institutions to evaluate the usability as well as the impact of collaboration functionalities in the students’ engagement. Qualitative and quantitative results show that the application fulfils all the design objectives identified by teachers as key elements for augmented reality educational applications. Furthermore, the application was positively evaluated by the students and it succeeded in promoting collaborative behaviour. These results show that <i>ARoundTheWorld</i>, and other applications built using the same architecture, could be easily developed and successfully integrated into existing schools curricula.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"5 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139757676","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-02-12 | DOI: 10.1007/s10055-024-00939-8
Juan C. Morales-Vega, Laura Raya, Manuel Rubio-Sánchez, Alberto Sanchez
{"title":"A virtual reality data visualization tool for dimensionality reduction methods","authors":"Juan C. Morales-Vega, Laura Raya, Manuel Rubio-Sánchez, Alberto Sanchez","doi":"10.1007/s10055-024-00939-8","DOIUrl":"https://doi.org/10.1007/s10055-024-00939-8","url":null,"abstract":"<p>In this paper, we present a virtual reality interactive tool for generating and manipulating visualizations for high-dimensional data in a natural and intuitive stereoscopic way. Our tool offers support for a diverse range of dimensionality reduction (DR) algorithms, enabling the transformation of complex data into insightful 2D or 3D representations within an immersive VR environment. The tool also allows users to include annotations with a virtual pen using hand tracking, to assign class labels to the data observations, and to perform simultaneous visualization with other users within the 3D environment to facilitate collaboration.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"38 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-02-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139757424","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-02-09 | DOI: 10.1007/s10055-023-00914-9
{"title":"HoloGCS: mixed reality-based ground control station for unmanned aerial vehicle","authors":"","doi":"10.1007/s10055-023-00914-9","DOIUrl":"https://doi.org/10.1007/s10055-023-00914-9","url":null,"abstract":"<h3>Abstract</h3> <p>Human–robot interaction (HRI), which studies the interaction between robots and humans, appears as a promising research idea for the future of smart factories. In this study, HoloLens as ground control station (HoloGCS) is implemented, and its performance is discussed. HoloGCS is a mixed reality-based system for controlling and monitoring unmanned aerial vehicles (UAV). The system incorporates HRI through speech commands and video streaming, enabling UAV teleoperation. HoloGCS provides a user interface that allows operators to monitor and control the UAV easily. To demonstrate the feasibility of the proposed systems, a user case study (user testing and SUS-based questionnaire) was performed to gather qualitative results. In addition, throughput, RTT, latency, and speech accuracy were also gathered and analyzed to evaluate quantitative results.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"18 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139757702","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-02-09 | DOI: 10.1007/s10055-024-00938-9
M.-Carmen Juan, Cora Hidaldo, Damian Mifsut
{"title":"A mixed reality application for total hip arthroplasty","authors":"M.-Carmen Juan, Cora Hidaldo, Damian Mifsut","doi":"10.1007/s10055-024-00938-9","DOIUrl":"https://doi.org/10.1007/s10055-024-00938-9","url":null,"abstract":"<p>Total hip arthroplasty (or total hip replacement) is the current surgical solution for the treatment of advanced coxarthrosis, with the objective of providing mobility and pain relief to patients. For this purpose, surgery can be planned using preoperative images acquired from the patient and navigation systems can also be used during the intervention. Robots have also been used to assist in interventions. In this work, we propose a new mixed reality application for total hip arthroplasty. The surgeon only has to wear HoloLens 2. The application does not require acquiring preoperative or intraoperative images of the patient and uses hand interaction. Interaction is natural and intuitive. The application helps the surgeon place a virtual acetabular cup onto the patient's acetabulum as well as define its diameter. Similarly, a guide for drilling and implant placement is defined, establishing the abduction and anteversion angles. The surgeon has a direct view of the operating field at all times. For validation, the values of the abduction and anteversion angles offered by the application in 20 acetabular cup placements have been compared with real values (ground-truth). From the results, the mean (standard deviation) is 0.375 (0.483) degrees for the error in the anteversion angle and 0.1 (0.308) degrees for the abduction angle, with maximum discrepancies of 1 degree. A study was also carried out on a cadaver, in which a surgeon verified that the application is suitable to be transferred to routine clinical practice, helping in the guidance process for the implantation of a total hip prosthesis.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"2020 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139757587","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-02-05 | DOI: 10.1007/s10055-023-00903-y
Jesús Moreno-Arjonilla, Alfonso López-Ruiz, J. Roberto Jiménez-Pérez, José E. Callejas-Aguilera, Juan M. Jurado
{"title":"Eye-tracking on virtual reality: a survey","authors":"Jesús Moreno-Arjonilla, Alfonso López-Ruiz, J. Roberto Jiménez-Pérez, José E. Callejas-Aguilera, Juan M. Jurado","doi":"10.1007/s10055-023-00903-y","DOIUrl":"https://doi.org/10.1007/s10055-023-00903-y","url":null,"abstract":"<p>Virtual reality (VR) has evolved substantially beyond its initial remit of gaming and entertainment, catalyzed by advancements such as improved screen resolutions and more accessible devices. Among various interaction techniques introduced to VR, eye-tracking stands out as a pivotal development. It not only augments immersion but offers a nuanced insight into user behavior and attention. This precision in capturing gaze direction has made eye-tracking instrumental for applications far beyond mere interaction, influencing areas like medical diagnostics, neuroscientific research, educational interventions, and architectural design, to name a few. Though eye-tracking’s integration into VR has been acknowledged in prior reviews, its true depth, spanning the intricacies of its deployment to its broader ramifications across diverse sectors, has been sparsely explored. This survey undertakes that endeavor, offering a comprehensive overview of eye-tracking’s state of the art within the VR landscape. We delve into its technological nuances, its pivotal role in modern VR applications, and its transformative impact on domains ranging from medicine and neuroscience to marketing and education. Through this exploration, we aim to present a cohesive understanding of the current capabilities, challenges, and future potential of eye-tracking in VR, underscoring its significance and the novelty of our contribution.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"214 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-02-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139757637","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-02-02 | DOI: 10.1007/s10055-023-00919-4
Dohui Lee, Sohyun Won, Jiwon Kim, Hyuk-Yoon Kwon
{"title":"ARGo: augmented reality-based mobile Go stone collision game","authors":"Dohui Lee, Sohyun Won, Jiwon Kim, Hyuk-Yoon Kwon","doi":"10.1007/s10055-023-00919-4","DOIUrl":"https://doi.org/10.1007/s10055-023-00919-4","url":null,"abstract":"<p>In this study, we present a mobile Go stone collision game based on augmented reality, which we call ARGo, inspired by the traditional Korean board game, Alkkagi. ARGo aims to resolve two main issues: (1) the portability and space constraints of the original Alkkagi and (2) the limited sense of reality due to the touchscreen-based interface of the existing mobile Alkkagi games. To improve a sense of the reality of the game, ARGo provides a gameplay interface similar to the original Alkkagi by recognizing the user‘s hand motion based on AR. Additionally, it provides a customization mechanism for each user to improve the recognition of the hand motion and the strength of the attack considering each user‘s characteristics. Finally, we make the following three main contributions. First, we employ the automata theory to design the game and collision scenarios between stones. Consequently, we can clearly define the complicated states incurred by AR-based motion recognition and collisions between virtual objects. Second, we propose a collision equation based on Continuous Collision Detection tailored to ARGo, i.e., Go stones and their collisions. Through experimental studies, we demonstrate that the collision equation enables the simulation of the exact collision effects. Third, through user experience studies, we verify the effectiveness of ARGo by showing the effects of the functions implemented in ARGo and its superiority over the existing mobile game Alkkagi Mania.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"14 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139668976","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality | Pub Date: 2024-01-30 | DOI: 10.1007/s10055-023-00894-w
Miguel García García, Yannick Sauer, Tamara Watson, Siegfried Wahl
{"title":"Virtual reality (VR) as a testing bench for consumer optical solutions: a machine learning approach (GBR) to visual comfort under simulated progressive addition lenses (PALs) distortions","authors":"Miguel García García, Yannick Sauer, Tamara Watson, Siegfried Wahl","doi":"10.1007/s10055-023-00894-w","DOIUrl":"https://doi.org/10.1007/s10055-023-00894-w","url":null,"abstract":"<p>For decades, manufacturers have attempted to reduce or eliminate the optical aberrations that appear on the progressive addition lens’ surfaces during manufacturing. Besides every effort made, some of these distortions are inevitable given how lenses are fabricated, where in fact, astigmatism appears on the surface and cannot be entirely removed, or where non-uniform magnification becomes inherent to the power change across the lens. Some presbyopes may refer to certain discomfort when wearing these lenses for the first time, and a subset of them might never adapt. Developing, prototyping, testing and purveying those lenses into the market come at a cost, which is usually reflected in the retail price. This study aims to test the feasibility of virtual reality (VR) for testing customers’ satisfaction with these lenses, even before getting them onto production. VR offers a controlled environment where different parameters affecting progressive lens comforts, such as distortions, image displacement or optical blurring, can be inspected separately. In this study, the focus was set on the distortions and image displacement, not taking blur into account. Behavioural changes (head and eye movements) were recorded using the built-in eye tracker. We found participants were significantly more displeased in the presence of highly distorted lens simulations. In addition, a gradient boosting regressor was fitted to the data, so predictors of discomfort could be unveiled, and ratings could be predicted without performing additional measurements.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"70 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-01-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139648400","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}