{"title":"Exploring sex differences in collaborative virtual environments for participation equality and user experience","authors":"Yifan Yang, Sheng Zhang, Xu Sun, Xingyi Zhang, Xiaotong Sun, Ying Jing, Canjun Yang","doi":"10.1007/s10055-024-01022-y","DOIUrl":"https://doi.org/10.1007/s10055-024-01022-y","url":null,"abstract":"<p>Communication technology plays a crucial role in facilitating remote collaborative work. This study investigated sex differences in Perceived Participation Equality and User Experience across different communication formats, i.e., face-to-face communication, conventional video conferences, and Virtual Reality (VR). An empirical study was conducted involving 15 groups, each comprising three participants, who engaged in a decision-making task. A research model was developed to evaluate the interplay between perceived participation equality, empathy, and immersion. This model was employed across three communication conditions and included both male and female participants. These findings on sex differences in user experience could help create a connected, cohesive, and productive remote collaborative work environment.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"24 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142192752","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality. Pub Date: 2024-08-24. DOI: 10.1007/s10055-024-01046-4
Alison O’Meara, Tadgh Connery, Jason Chan, Cleidi Hearn, Marica Cassarino, Annalisa Setti
{"title":"Phone-based virtual exploration of green space increases positive affect in students with test anxiety: a pre-post experimental study with qualitative insights","authors":"Alison O’Meara, Tadgh Connery, Jason Chan, Cleidi Hearn, Marica Cassarino, Annalisa Setti","doi":"10.1007/s10055-024-01046-4","DOIUrl":"https://doi.org/10.1007/s10055-024-01046-4","url":null,"abstract":"<p>Nature confers a host of benefits including recovering from stress, replenishing attentional resources, improving mood, and decreasing negative thinking. Virtual nature, i.e. exposure to natural environments through technological means, has proven to also be efficacious in producing benefits, although more limitedly. Previous studies with immersive virtual reality with university students have shown that one bout of virtual nature can reduce negative affect in students with high test anxiety and can reduce feeling of worry and panic after several weeks of daily exposure. The present study aimed at replicating the effect of one bout of virtual nature on affect and extend it to cognition in a sample of university students with different levels of test anxiety. An inexpensive goggle + phone apparatus was utilized and the one bout of virtual nature was self-administered. 48 university students took part in the study, randomized between viewing a 360 degrees video of nature or of an urban environment. They completed the Positive and Negative Affect Schedule and the Cognitive Reflection Test before and after the exposure to the virtual environments and responded to open-ended questions about their experience of the intervention. Results showed improvements in positive affect in students with higher anxiety were obtained in the nature condition, no other effects were found. Qualitative appraisal indicated that participants in the nature condition felt more relaxed and focused, however the technical issues were detrimental to the benefits. In conclusion one bout of virtual nature could support students with higher test anxiety when confronted with examinations.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"396 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-08-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142192708","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality. Pub Date: 2024-08-23. DOI: 10.1007/s10055-024-01043-7
Ting Qiu, Hong Li, Yongkang Chen, Hui Zeng, Shufang Qian
{"title":"Continuance intention toward VR games of intangible cultural heritage: A stimulus-organism-response perspective","authors":"Ting Qiu, Hong Li, Yongkang Chen, Hui Zeng, Shufang Qian","doi":"10.1007/s10055-024-01043-7","DOIUrl":"https://doi.org/10.1007/s10055-024-01043-7","url":null,"abstract":"<p>Virtual reality (VR) games have become a popular method to preserve and transmit intangible cultural heritage in recent years. However, empirical studies pertaining to motivations behind the continuance intention to play VR games featuring intangible cultural heritage have been limited. The objective of this study focuses on answering an essential question: what factors influence user’s continuance intention to play intangible cultural heritage VR games? Both Stimulus-Organism-Response Theory and Technology Acceptance Model (TAM) are considered to develop twelve hypotheses and build the research framework. A survey of 190 respondents was conducted, and the results were analyzed by using PLS-SEM. The results show that visual attractiveness, interactivity, and immersion are significant indicators in measuring users’ continuance intention to play. Additionally, perceived usefulness, perceived ease of use, and perceived enjoyment of VR games positively influence their continuance intention. This study enriches the research of intangible cultural heritage VR games. It also provides theoretical implications for scholars and design strategies for VR developers and designers.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"75 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142192709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality. Pub Date: 2024-08-12. DOI: 10.1007/s10055-024-01042-8
Walter Terkaj, Marcello Urgo, Péter Kovács, Erik Tóth, Marta Mondellini
{"title":"A framework for virtual learning in industrial engineering education: development of a reconfigurable virtual learning factory application","authors":"Walter Terkaj, Marcello Urgo, Péter Kovács, Erik Tóth, Marta Mondellini","doi":"10.1007/s10055-024-01042-8","DOIUrl":"https://doi.org/10.1007/s10055-024-01042-8","url":null,"abstract":"<p>Advances in digital factory technologies are offering great potential to innovate higher education, by enabling innovative learning approaches based on virtual laboratories that increase the involvement of students while delivering realistic experiences. This article introduces a framework for the development of virtual learning applications by addressing multidisciplinary requirements. The implementation of the framework can be eased by the use of the proposed virtual learning factory application (VLFA), an open-source solution that takes advantage of virtual reality to support innovative higher-education learning activities in industrial engineering. A complete design and development workflow is described, starting from the identification of the requirements, to the design of software modules and underlying technologies, up to the final implementation. The framework and the VLFA have been tested to implement a serious game related to the design and analysis of manufacturing systems, also collecting the feedback of students and teachers.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"13 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-08-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141949325","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality. Pub Date: 2024-08-05. DOI: 10.1007/s10055-024-01029-5
Patrice Piette, Emilie Leblong, Romain Cavagna, Albert Murienne, Bastien Fraudet, Philippe Gallien
{"title":"A comparison of balance between real and virtual environments: differences, role of visual cues and full-body avatars, a quasi-experimental clinical study","authors":"Patrice Piette, Emilie Leblong, Romain Cavagna, Albert Murienne, Bastien Fraudet, Philippe Gallien","doi":"10.1007/s10055-024-01029-5","DOIUrl":"https://doi.org/10.1007/s10055-024-01029-5","url":null,"abstract":"<p>Virtual rehabilitation using Virtual Reality (VR) technology is a promising novel approach to rehabilitation. However, postural responses in VR differ significantly from real life. The introduction of an avatar or visual cues in VR could help rectify this difference. An initial session was used to assess static and dynamic balance performances between VR and real life to set the reference values. A second session involved three VR conditions applied in a randomised order: i.e. full-body avatar, enhanced visual cues, or a combination of both conditions. Performances of the centre of pressure (COP) were recorded on a force plate. Seventy (70) people took part in the first session and 74 in the second. During the first session, a significant difference was observed in left static, right static and right dynamic COP distance (respectively SMD = − 0.40 [− 0.73, − 0.06], <i>p</i> = 0.02, − 0.33 [− 0.67, 0.00], <i>p</i> = 0.05, SMD = − 0.61 [− 0.95, − 0.27], <i>p</i> < 0.001) and a non-significant difference in the left dynamic, SMD = − 0.22 [− 0.56, 0.11], <i>p</i> = 0.19). During the second session it was observed that this difference was corrected mainly by reinforced visual information and to a lesser extent by the presence of a full-body avatar. Balance disruption triggered by the use of virtual reality can be offset by vertical visual information and/or by the presence of a full-body avatar. Further research is required on the effects of a full-body avatar.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"2 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141949326","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality. Pub Date: 2024-08-02. DOI: 10.1007/s10055-024-01040-w
Inoussa Ouedraogo, Huyen Nguyen, Patrick Bourdot
{"title":"Immersive analytics with augmented reality in meteorology: an exploratory study on ontology and linked data","authors":"Inoussa Ouedraogo, Huyen Nguyen, Patrick Bourdot","doi":"10.1007/s10055-024-01040-w","DOIUrl":"https://doi.org/10.1007/s10055-024-01040-w","url":null,"abstract":"<p>Although Augmented Reality (AR) has been extensively studied in supporting Immersive Analytics (IA), there are still many challenges in visualising and interacting with big and complex datasets. To deal with these datasets, most AR applications utilise NoSQL databases for storing and querying data, especially for managing large volumes of unstructured or semi-structured data. However, NoSQL databases have limitations in their reasoning and inference capabilities, which can result in insufficient support for certain types of queries. To fill this gap, we aim to explore and evaluate whether an intelligent approach based on ontology and linked data can facilitate visual analytics tasks with big datasets on AR interface. We designed and implemented a prototype of this method for meteorological data analytics. An experiment was conducted to evaluate the use of a semantic database with linked data compared to a conventional approach in an AR-based immersive analytics system. The results significantly highlight the performance of semantic approach in helping the users analysing meteorological datasets and their subjective appreciation in working with the AR interface, which is enhanced with ontology and linked data.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"14 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141881479","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality. Pub Date: 2024-08-01. DOI: 10.1007/s10055-024-01039-3
Woojin Cho, Taewook Ha, Ikbeom Jeon, Jinwoo Jeon, Tae-Kyun Kim, Woontack Woo
{"title":"Temporally enhanced graph convolutional network for hand tracking from an egocentric camera","authors":"Woojin Cho, Taewook Ha, Ikbeom Jeon, Jinwoo Jeon, Tae-Kyun Kim, Woontack Woo","doi":"10.1007/s10055-024-01039-3","DOIUrl":"https://doi.org/10.1007/s10055-024-01039-3","url":null,"abstract":"<p>We propose a robust 3D hand tracking system in various hand action environments, including hand-object interaction, which utilizes a single color image and a previous pose prediction as input. We observe that existing methods deterministically exploit temporal information in motion space, failing to address realistic diverse hand motions. Also, prior methods paid less attention to efficiency as well as robust performance, i.e., the balance issues between time and accuracy. The Temporally Enhanced Graph Convolutional Network (TE-GCN) utilizes a 2-stage framework to encode temporal information adaptively. The system establishes balance by adopting an adaptive GCN, which effectively learns the spatial dependency between hand mesh vertices. Furthermore, the system leverages the previous prediction by estimating the relevance across image features through the attention mechanism. The proposed method achieves state-of-the-art balanced performance on challenging benchmarks and demonstrates robust results on various hand motions in real scenes. Moreover, the hand tracking system is integrated into a recent HMD with an off-loading framework, achieving a real-time framerate while maintaining high performance. Our study improves the usability of a high-performance hand-tracking method, which can be generalized to other algorithms and contributes to the usage of HMD in everyday life. Our code with the HMD project will be available at https://github.com/UVR-WJCHO/TEGCN_on_Hololens2.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"2019 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141866649","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality. Pub Date: 2024-07-27. DOI: 10.1007/s10055-024-01041-9
Ruowei Xiao, Rongzheng Zhang, Oğuz Buruk, Juho Hamari, Johanna Virkki
{"title":"Toward next generation mixed reality games: a research through design approach","authors":"Ruowei Xiao, Rongzheng Zhang, Oğuz Buruk, Juho Hamari, Johanna Virkki","doi":"10.1007/s10055-024-01041-9","DOIUrl":"https://doi.org/10.1007/s10055-024-01041-9","url":null,"abstract":"<p>Mixed reality (MR) games refer to games that integrate physical entities with digitally mediated contents. Currently, it entails game creators to integrate heterogeneous virtual and physical components, which is often time-consuming and labor-intensive, without the support of a coherent technology stack. The underlying technodiversity manifested by the research corpus suggests a complicated, multi-dimensional design space that goes beyond merely technical concerns. In this research, we adopted a research-through-design approach and proposed an MR game technology stack that facilitates flexible, low-code game development. As design grounding, we first surveyed 34 state-of-the-art studies, and results were synergized into three different spectra of technological affordances, respectively activity range, user interface and feedback control, to inform our next design process. We then went through an iterative prototyping phase and implemented an MR game development toolset. A co-design workshop was conducted, where we invited 15 participants to try the prototype tools and co-ideate the potential use scenarios for the proposed technology stack. First-hand user feedback was collected via questionnaires and semi-structured interviews. As a result, four conceptual game designs with three major design implications were generated, which conjointly reflect a broader understanding on MR gameful experience and contribute fresh insights to this emerging research domain.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"63 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141781336","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality. Pub Date: 2024-07-24. DOI: 10.1007/s10055-024-01033-9
Ali Buwaider, Victor Gabriel El-Hajj, Alessandro Iop, Mario Romero, Walter C Jean, Erik Edström, Adrian Elmi-Terander
{"title":"Augmented reality navigation in external ventricular drain insertion—a systematic review and meta-analysis","authors":"Ali Buwaider, Victor Gabriel El-Hajj, Alessandro Iop, Mario Romero, Walter C Jean, Erik Edström, Adrian Elmi-Terander","doi":"10.1007/s10055-024-01033-9","DOIUrl":"https://doi.org/10.1007/s10055-024-01033-9","url":null,"abstract":"<p>External ventricular drain (EVD) insertion using the freehand technique is often associated with misplacements resulting in unfavorable outcomes. Augmented Reality (AR) has been increasingly used to complement conventional neuronavigation. The accuracy of AR guided EVD insertion has been investigated in several studies, on anthropomorphic phantoms, cadavers, and patients. This review aimed to assess the current knowledge and discuss potential benefits and challenges associated with AR guidance in EVD insertion. MEDLINE, EMBASE, and Web of Science were searched from inception to August 2023 for studies evaluating the accuracy of AR guidance for EVD insertion. Studies were screened for eligibility and accuracy data was extracted. The risk of bias was assessed using the Cochrane Risk of Bias Tool and the quality of evidence was assessed using the Newcastle-Ottawa-Scale. Accuracy was reported either as the average deviation from target or according to the Kakarla grading system. Of the 497 studies retrieved, 14 were included for analysis. All included studies were prospectively designed. Insertions were performed on anthropomorphic phantoms, cadavers, or patients, using several different AR devices and interfaces. Deviation from target ranged between 0.7 and 11.9 mm. Accuracy according to the Kakarla grading scale ranged between 82 and 96%. Accuracy was higher for AR compared to the freehand technique in all studies that had control groups. Current evidence demonstrates that AR is more accurate than free-hand technique for EVD insertion. However, studies are few, the technology developing, and there is a need for further studies on patients in relevant clinical settings.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"47 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141781331","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Virtual Reality. Pub Date: 2024-07-23. DOI: 10.1007/s10055-024-01035-7
H. A. T. van Limpt-Broers, M. Postma, E. van Weelden, S. Pratesi, M. M. Louwerse
{"title":"Neurophysiological evidence for the overview effect: a virtual reality journey into space","authors":"H. A. T. van Limpt-Broers, M. Postma, E. van Weelden, S. Pratesi, M. M. Louwerse","doi":"10.1007/s10055-024-01035-7","DOIUrl":"https://doi.org/10.1007/s10055-024-01035-7","url":null,"abstract":"<p>The Overview Effect is a complex experience reported by astronauts after viewing Earth from space. Numerous accounts suggest that it leads to increased interconnectedness to other human beings and environmental awareness, comparable to self-transcendence. It can cause fundamental changes in mental models of the world, improved well-being, and stronger appreciation of, and responsibility for Earth. From a cognitive perspective, it is closely linked to the emotion of awe, possibly triggered by the overwhelming perceived vastness of the universe. Given that most research in the domain focuses on self-reports, little is known about potential neurophysiological markers of the Overview Effect. In the experiment reported here, participants viewed an immersive Virtual Reality simulation of a space journey while their brain activity was recorded using electroencephalography (EEG). Post-experimental self-reports confirmed they were able to experience the Overview Effect in the simulated environment. EEG recordings revealed lower spectral power in beta and gamma frequency bands during the defining moments of the Overview Effect. The decrease in spectral power can be associated with reduced mental processing, and a disruption of known mental structures in this context, thereby providing more evidence for the cognitive effects of the experience.</p>","PeriodicalId":23727,"journal":{"name":"Virtual Reality","volume":"245 1","pages":""},"PeriodicalIF":4.2,"publicationDate":"2024-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141781332","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}