{"title":"Supporting learning in synchronous collaborative game design in virtual worlds: A synergy between technological and pedagogical considerations","authors":"Yu Xia , Shulong Yan , Mengying Jiang , Zipporah Brown","doi":"10.1016/j.cexr.2025.100110","DOIUrl":"10.1016/j.cexr.2025.100110","url":null,"abstract":"<div><div>Online collaboration has become ever present in our lives, as has collaborative learning in virtual worlds. However, little research has examined this type of collaborative learning context; we contribute to the understanding of the technological infrastructure and pedagogical strategies that support collaborative learning in such contexts. Taking a socio-material lens, we discuss four essential considerations in supporting collaborative design in virtual learning environments: social artifacts, togetherness, synchronicity, and multilevel participation. Cases were selected from a virtual makerspace offered in the summer of 2023 to illustrate the entanglement of technology and pedagogy. We then discuss in detail the technological and pedagogical considerations associated with these dimensions. Our framework provides concrete guidance for educators and researchers who are interested in offering or researching collaborative learning in virtual worlds.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100110"},"PeriodicalIF":0.0,"publicationDate":"2025-07-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144564071","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"AromaCanvas: A wearable olfactory display for Chinese painting appreciation and learning in virtual reality","authors":"Tao Lin , Quanhao Gan , Fuxi Ouyang , Yiming Luo , Yushan Pan , Yushi Li , Shaoyu Cai","doi":"10.1016/j.cexr.2025.100109","DOIUrl":"10.1016/j.cexr.2025.100109","url":null,"abstract":"<div><div>In this paper, we present AromaCanvas, a wearable olfactory display designed to enhance immersive appreciation and exploration of Chinese paintings in virtual reality (VR). AromaCanvas integrates two piezoelectric-based transducers into a vest, enabling scent delivery around the user's shoulders with controllable intensities activated through finger gesture interactions. Users can engage with Chinese paintings by pointing at different elements, such as woods or flowers, to trigger corresponding scents at varying intensities, creating a highly immersive and engaging VR art experience. We conducted two user-perception experiments to investigate how users perceive scents in virtual environments using our olfactory system. The first experiment explored human perception under different actuation factors, including the actuator distances, actuated intensities, and scent types, using piezoelectric-based transducers. Results revealed that perceived scent intensity varied across these factors, allowing us to optimize AromaCanvas for the most energy-efficient design. The second experiment evaluated the VR experience and demonstrated that AromaCanvas significantly enhanced users' sense of presence, usability, and overall experience of appreciating and learning about Chinese paintings in VR, outperforming the conventional VR system.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100109"},"PeriodicalIF":0.0,"publicationDate":"2025-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144536178","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Self-review and feedback in virtual reality dialogues increase language markers of personal and emotional expression in an empathetic communication training experience","authors":"Anna C.M. Queiroz , Jeremy N. Bailenson , Kristen Pilner Blair , Daniel L. Schwartz , Candace Thille , Anthony D. Wagner","doi":"10.1016/j.cexr.2025.100108","DOIUrl":"10.1016/j.cexr.2025.100108","url":null,"abstract":"<div><div>Technological advancements have transformed how people communicate, work, and develop critical skills, especially in leadership. These changes will require nuanced skills, particularly empathetic communication, which is pivotal in managing teams and maintaining high performance in distributed work environments. Virtual reality has shown encouraging results in developing empathy and communication skills. Moreover, natural language processing techniques can provide a deeper understanding of communication patterns and nuances. However, there is still much to learn about how virtual reality can support active, empathetic communication training in the workplace. Hence, we first developed a virtual reality experience where participants could embody the manager and the employee in a performance review meeting. Then, we investigated the effects of reviewing one's performance and receiving feedback in a virtual reality perspective-taking task, compared to not reviewing or receiving feedback. The study was pre-registered and followed a pre- and post-test study design. One hundred nine participants were randomly assigned to one of three conditions: perspective-taking, perspective-taking with self-review, or perspective-taking with self-review and feedback. Empathetic communication skills were measured through self-report measures, human-coded scoring of written and spoken behavior, and natural language processing. Results showed that receiving feedback while reviewing one's performance in a perspective-taking task increased emotional expressions in oral communication. Repeating the interaction a second time increased the use of the “I” pronoun and decreased the use of “you.” Improvement in empathetic communication was not linked to feeling concern for others. We discuss implications for theories of learning via media and implications for practitioners.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100108"},"PeriodicalIF":0.0,"publicationDate":"2025-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144523879","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Immersive virtual reality learning environments for higher education: A student acceptance study","authors":"Stefan Aufenanger , Jasmin Bastian , Glória Bastos , Maria Castelhano , Célia Dias-Ferreira , Emmanuel Fokides , Damianos Gavalas , Vlasios Kasapakis , Androniki Agelada , Apostolos Kostas , George Koutromanos , Gregory Makrides , Leonel Morgado , Daniela Pedrosa , Tomasz Szemberg , Alivizos Sofos , Justyna Szpond","doi":"10.1016/j.cexr.2025.100105","DOIUrl":"10.1016/j.cexr.2025.100105","url":null,"abstract":"<div><div>The study investigates the integration of Virtual Reality Learning Environments (VRLEs) in academic teaching through the EU-funded \"REVEALING\" project. Researchers from Cyprus, Germany, Greece, Poland, and Portugal developed and evaluated five different immersive VRLEs, each focusing on diverse educational topics, including ancient Greek technology, sea urchin measurements, linear algebra, and historical expeditions. The study aims to determine effective instructional design principles for VRLEs and assess students' acceptance and learning outcomes.</div><div>The VRLEs were designed based on literature-derived principles that emphasise ease of tool usage, authentic experiences, and continuous feedback. Students from the participating universities explored these VR environments, providing feedback through a standardized questionnaire on aspects like immersion, ease of use, motivation, and emotions.</div><div>Results show that most participants positively engaged with the VRLEs, reporting high motivation and positive emotional responses, particularly for experiences involving interactivity. However, challenges like motion sickness and technical issues were noted, especially at one institution. The findings suggest that immersive VR experiences can significantly enhance motivation and engagement, but their effectiveness depends on careful alignment with pedagogical goals, design quality, and user experience considerations.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100105"},"PeriodicalIF":0.0,"publicationDate":"2025-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144501021","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"heARtbeat: Augmented reality for teaching electrocardiogram electrode placement","authors":"Cathal Breen , William Hurst","doi":"10.1016/j.cexr.2025.100107","DOIUrl":"10.1016/j.cexr.2025.100107","url":null,"abstract":"<div><div>The 12-lead Electrocardiogram (ECG) is currently one of the most widely used, operator-dependent clinical skills in healthcare. To enable correct interpretation of ECG recordings, electrodes must be placed in specific anatomic locations on the chest, legs and arms. Traditional learning practices of ECG electrode placement are not standardized and are burdensome for faculty staff and resources. Misplacement of ECG electrodes therefore affects the accuracy of interpretation and referral for medical treatment. Augmented Reality (AR) offers a safe and scalable method for students to learn and practice various procedures and skills by creating hyper-realistic simulations. Yet, within the healthcare domain, the main applications of AR are in the field of surgery, rehabilitation, and anatomy teaching. Therefore, this article documents the design of an AR application for teaching ECG electrode placement and the tool's Alpha evaluation by means of an expert validation and case study trial. Users of the application found it to be Inspiring (5.0) and Educational (4.9) but less effective in terms of User Friendliness (2.8) on a 7-point Likert scale. Regarding the educational potential, 57 % of the end users found the tool to increase their knowledge of ECG placement.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100107"},"PeriodicalIF":0.0,"publicationDate":"2025-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144490612","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring virtual encounters in early childhood education: Results of a pilot study","authors":"Heide K. Lukosch , Cara Swit , Rene Novak , E. Jayne White","doi":"10.1016/j.cexr.2025.100104","DOIUrl":"10.1016/j.cexr.2025.100104","url":null,"abstract":"<div><div>Interpersonal skills such as empathy, intuition and sensing, emotional intelligence, and effective communication are crucial for teachers working with infants (aged birth to 2 years) in Early Childhood Education and Care (ECEC). However, due to the intimate and vulnerable nature of this relationship for infants, opportunities for students to rehearse these skills in real-life ECE contexts are limited. We co-designed an immersive virtual reality (VR) environment to simulate an ECEC context, with a virtual baby prototype, furniture such as a changing table and a cot, and toys a user could interact with. A pilot user study tested its efficacy with 17 participants made up of 12 students of a tertiary ECE program and 5 qualified ECE teachers. A questionnaire was used to collect data on usability, experience, and overall feedback on the VR baby experience. Results show that, while the majority of participants praised the audio-visual component of the VR environment, the limited haptic feedback and interaction options were a source of fear and discomfort. Participants reported being immersed in the learning environment but would appreciate more realistic feedback mechanisms such as touch and breath. We suggest that further research looks into the effect of advanced haptic feedback in VR when used for learning in ECE.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100104"},"PeriodicalIF":0.0,"publicationDate":"2025-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144481464","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Implementation fidelity of an evidence-centered maker education intervention in a virtual world for neurodiverse adolescents","authors":"Alex Barrett , Fengfeng Ke , Nuodi Zhang , Zlatko Sokolikj","doi":"10.1016/j.cexr.2025.100106","DOIUrl":"10.1016/j.cexr.2025.100106","url":null,"abstract":"<div><div>Intervention research is frequently hindered by a lack of attention to implementation fidelity. The success or failure of treatments relies heavily on whether they were implemented as intended. This is particularly important when studying vulnerable populations. This paper reports on the implementation fidelity of a virtual world (VW) intervention designed for neurodiverse individuals to exercise computational thinking skills through making. Twelve neurodiverse participants partook in the VW-based program, totaling 108 contact hours. Fidelity of implementation was operationalized along the dimensions of adherence to design, exposure, quality, and participant responsiveness. Results suggest that the program was implemented with high fidelity, with specific results indicating elements of program implementation that are particularly important when considering the human-computer interaction between neurodiverse populations and VWs in educational contexts. This paper provides valuable insight into the design and implementation of VW technology in maker education interventions involving neurodiverse populations.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100106"},"PeriodicalIF":0.0,"publicationDate":"2025-06-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144338765","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Pause for success: Harnessing interaction delay and target selection difficulty in VR hands-on learning environments","authors":"Sara Khorasani , Stephain Hsu , Rui Guan , Jorge Goncalves , Andrew Irlitti , Jarrod Knibbe , Eduardo Velloso","doi":"10.1016/j.cexr.2025.100103","DOIUrl":"10.1016/j.cexr.2025.100103","url":null,"abstract":"<div><div>Human-computer interaction (HCI) theory suggests that we should minimize interaction delays and reduce target selection difficulty to optimise performance. However, in learning scenarios, delays have been shown to cause ‘forced learning’ and difficulty can be an intrinsic motivator. Any interplay between delays, forced learning, difficulty, and the embodied, immersive exploration style of virtual reality (VR) remains poorly understood. We study the impact of delay and target selection difficulty on learning outcomes in VR. Using a VR makerspace training module with a 2x2 factorial, mixed-methods approach, we analyze the learning data from 124 participants who interacted with either a 5-s or zero delay post target selection, and <em>Easy</em> versus <em>Hard</em> target selection difficulties. The findings reveal that incorporating a 5-s delay post-interaction led to superior learning outcomes, providing users with more time to process and rehearse information. In contrast, altering the target selection difficulty showed negligible effects on learning outcomes, with participants reporting a simultaneous increase in engagement and distraction from the learning content. This research challenges conventional HCI theories within a VR context, suggesting potential educational benefits from strategically incorporated interaction delays.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100103"},"PeriodicalIF":0.0,"publicationDate":"2025-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144335770","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Immersive learning for structural analysis through mobile augmented reality and the MOLA structural kit","authors":"Rushil Mojidra , Jian Li , Alexander Crane , Sdiq Anwar Taher","doi":"10.1016/j.cexr.2025.100100","DOIUrl":"10.1016/j.cexr.2025.100100","url":null,"abstract":"<div><div>Structural analysis is a foundational course in civil and mechanical engineering programs, essential for understanding how structures respond to various loads. However, traditional teaching methods, constrained by two-dimensional representations, often fall short in conveying three-dimensional structural behavior. In response, this research introduces an innovative augmented reality (AR) application for mobile and tablet devices, designed to provide real-time visual feedback on the structural behavior of physical models. The application enhances students' understanding of structural concepts by offering detailed and tangible insights into deflections, reactions, and the development of shear forces and moments within structural elements under applied loads. Users can apply loads in various directions and immediately visualize the corresponding structural responses, fostering a deeper comprehension of complex structural systems. By projecting real-time feedback directly onto physical models, the AR application creates an interactive and immersive learning experience. To evaluate the effectiveness of the application, a case study was conducted with two groups of students: a control group and an experimental group. Pretest and posttest assessments were used to measure learning outcomes, while a comprehensive survey captured students' attitudes and feedback. The results suggest that the AR application consistently enhances learning outcomes across all students, thanks to its interactive environment, real-time visual feedback, and clear presentation of complex concepts. Additionally, the survey revealed strong student acceptance of the AR technology, high levels of engagement, and a positive outlook on its future use in teaching structural analysis.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100100"},"PeriodicalIF":0.0,"publicationDate":"2025-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144272384","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Group dynamics in collaborative learning: Impact of emergent and scripted roles in tangible mobile augmented reality games","authors":"Gardeli Anna, Vosinakis Spyros","doi":"10.1016/j.cexr.2025.100102","DOIUrl":"10.1016/j.cexr.2025.100102","url":null,"abstract":"<div><div>This paper presents a study investigating the use of Tangible Mobile Augmented Reality (TMAR) for synchronous collaborative learning, with a focus on the influence of roles on group dynamics across various group compositions. This research addresses a gap in current understanding of how role structures affect problem-solving and collaborative behaviors in TMAR-based learning environments. A quasi-experimental, within-subjects research design was used. The study involved 23 elementary school students who participated in an educational game designed to develop computational thinking skills. These students worked in small groups, using mobile devices and physical artifacts as markers to solve problems. Two modes of collaboration were examined: (1) emergent-role collaboration, where participants self-organized, and (2) scripted-role collaboration, where scripted roles were assigned. Qualitative content analysis was conducted to interpret qualitative data from structured observations and student feedback. Findings suggest that the effectiveness of TMAR-based collaboration depends on the group's underlying goal orientation. In goal-aligned groups, roles emerged naturally and supported productive interaction, while in less cohesive groups, scripted roles provided the necessary structure. Furthermore, the tangible features of TMAR appear to further support role distribution and collaborative problem-solving when used appropriately. These insights contribute to the broader field of collaborative learning, computational thinking, and the application of TMAR in formal education settings.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100102"},"PeriodicalIF":0.0,"publicationDate":"2025-06-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144263625","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}