{"title":"REACH: Extending reality for distributed collaborative making","authors":"Casey Smith, Mike Tissenbaum","doi":"10.1016/j.cexr.2025.100111","DOIUrl":"10.1016/j.cexr.2025.100111","url":null,"abstract":"<div><div>Makerspaces allow students to develop 21st-century skills such as critical thinking, collaboration, and problem-solving through the construction and sharing of projects that are personally meaningful. Grounded in constructivist and socio-cultural learning theories, these student-centered spaces support learners in a community of practice as they construct knowledge through shared work with peers. However, uneven access to local expertise and peer support can make equitable participation in maker activities challenging. In an effort to expand the benefits of co-located making to interactions at a distance, this study investigates the collaborative affordances of a camera-projector device, REACH (Remote Embodiment for Augmented Collaborative Help), that augments users' workspaces through projection of artifacts for shared viewing and gesturing. This technology enhances the physicality of learning across distances, allowing students to discuss, adjust, and explore artifacts together without a common physical space. REACH's innovative approach to gesture-driven collaboration supports cognitive and communication processes, allowing students to deepen their understanding through remote yet tangible interaction.
This aligns with ongoing educational reform efforts to adapt teaching practices and tools to address the complexities of remote and digital learning, ultimately enhancing access to the benefits of making.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100111"},"PeriodicalIF":0.0,"publicationDate":"2025-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144739255","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Past lives, present learners: Future directions for history education in virtual reality","authors":"Miriam Mulders, Kristian H. Träg, Lilly Kaninski, Lara Rahner","doi":"10.1016/j.cexr.2025.100114","DOIUrl":"10.1016/j.cexr.2025.100114","url":null,"abstract":"<div><div>This study investigates the relationship between presence and learning outcomes in Virtual Reality (VR) environments, with a focus on both cognitive and affective learning. Using the <em>Anne Frank VR House</em>, a virtual replica of a hiding place for a group of Jewish people during World War II, we examined how the feeling of presence affects knowledge acquisition and perspective-taking among 74 university students. The results showed a significant positive correlation between presence and perspective-taking, but no effect on knowledge acquisition: a higher sense of presence predicted higher perspective-taking but not higher knowledge scores. These findings highlight VR's potential to create a sense of presence and thus foster emotional engagement in history education, suggesting that empathy-driven learning may be an effective way to engage students with complex socio-political issues beyond factual knowledge.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100114"},"PeriodicalIF":0.0,"publicationDate":"2025-07-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144665610","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Teachers’ experience and situation awareness of airborne disease transmission through immersive augmented reality","authors":"Ioannis Vrellis , Tassos Anastasios Mikropoulos , George Koutromanos","doi":"10.1016/j.cexr.2025.100113","DOIUrl":"10.1016/j.cexr.2025.100113","url":null,"abstract":"<div><div>The COVID-19 pandemic created the need to raise awareness about airborne disease transmission via respiratory particles. Immersive Augmented Reality (AR) could increase Situation Awareness (SA) about this invisible phenomenon. Teachers play an important role in handling health emergencies by providing health literacy and promoting protective behaviors, and thus could benefit from this technology. The aim of this study was threefold: (a) to develop an immersive educational AR application that raises awareness of airborne disease transmission, (b) to empirically evaluate its effectiveness in terms of SA and user experience among teachers, and (c) to investigate design issues, specifically the role of the color of the visualized respiratory particles. Two versions of the application were created for Magic Leap 1 AR glasses, representing respiratory particles as red or blue spherical shapes. An empirical study with forty-eight educators was carried out to measure SA and user experience in terms of presence, simulator sickness, workload, and satisfaction. The results showed that the application created high levels of overall SA for both colors. Presence and satisfaction were very high regardless of color and were positively correlated. Simulator sickness and workload were low regardless of color and were not correlated with SA or presence. Participants’ comments confirmed their high levels of presence, SA, and satisfaction. In terms of gender differences, women scored slightly higher in SA but were more vulnerable to simulator sickness.
Overall, results imply that immersive AR can create high SA about airborne disease transmission while providing a positive experience.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100113"},"PeriodicalIF":0.0,"publicationDate":"2025-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144661985","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enhancing textile heritage engagement through generative AI-based virtual assistants in virtual reality museums","authors":"Pakinee Ariya , Songpon Khanchai , Kannikar Intawong , Kitti Puritat","doi":"10.1016/j.cexr.2025.100112","DOIUrl":"10.1016/j.cexr.2025.100112","url":null,"abstract":"<div><div>This study investigates how generative AI-based virtual assistants embedded within immersive virtual reality (VR) environments can enhance user engagement and cultural learning in virtual museums. Situated at the Wieng Yong House Museum in Thailand, the research addresses the challenge of preserving and promoting textile heritage in the digital age. The study aims to design, implement, and evaluate an AI-driven virtual docent capable of delivering personalized, multilingual, and real-time cultural information through interactive voice-based engagement. Using a purposive sampling technique, 25 university students participated in a convergent parallel mixed-methods study combining structured questionnaires and open-ended feedback. Quantitative findings from structured questionnaires revealed high user satisfaction, with mean scores of 4.40 for visual quality and 4.20 for ease of interaction, while response latency and voice clarity received lower ratings of 3.36 and 3.62 respectively, indicating areas for improvement. Qualitative analysis revealed four key themes: user experience with the system, communication quality, response effectiveness, and suggestions for improvement. The results demonstrate both the transformative potential and current limitations of generative AI in digital heritage settings. 
This study contributes to the development of more inclusive and engaging virtual museum experiences for the teaching and learning of cultural heritage, offering practical design insights for educators, curators, and developers.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100112"},"PeriodicalIF":0.0,"publicationDate":"2025-07-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144614217","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Supporting learning in synchronous collaborative game design in virtual worlds: A synergy between technological and pedagogical considerations","authors":"Yu Xia , Shulong Yan , Mengying Jiang , Zipporah Brown","doi":"10.1016/j.cexr.2025.100110","DOIUrl":"10.1016/j.cexr.2025.100110","url":null,"abstract":"<div><div>Online collaboration has become ever-present in everyday life, as has collaborative learning in virtual worlds. However, little research has focused on this type of collaborative learning context; we contribute to the understanding of the technological infrastructure and pedagogical strategies that support collaborative learning in such contexts. Taking a socio-material lens, we discuss four essential considerations in supporting collaborative design in virtual learning environments: social artifacts, togetherness, synchronicity, and multilevel participation. Cases were selected from a virtual makerspace offered in the summer of 2023 to illustrate the entanglement of technology and pedagogy. We then discuss in detail the technological and pedagogical considerations associated with these dimensions. Our framework provides concrete guidance for educators and researchers who are interested in offering or researching collaborative learning in virtual worlds.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100110"},"PeriodicalIF":0.0,"publicationDate":"2025-07-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144564071","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"AromaCanvas: A wearable olfactory display for Chinese painting appreciation and learning in virtual reality","authors":"Tao Lin , Quanhao Gan , Fuxi Ouyang , Yiming Luo , Yushan Pan , Yushi Li , Shaoyu Cai","doi":"10.1016/j.cexr.2025.100109","DOIUrl":"10.1016/j.cexr.2025.100109","url":null,"abstract":"<div><div>In this paper, we present AromaCanvas, a wearable olfactory display designed to enhance immersive appreciation and exploration of Chinese paintings in virtual reality (VR). AromaCanvas integrates two piezoelectric-based transducers into a vest, enabling scent delivery around the user's shoulders with controllable intensities activated through finger gesture interactions. Users can engage with Chinese paintings by pointing at different elements, such as woods or flowers, to trigger corresponding scents at varying intensities, creating a highly immersive and engaging VR art experience. We conducted two user-perception experiments to investigate how users perceive scents in virtual environments using our olfactory system. The first experiment explored human perception under different actuation factors, including the actuator distances, actuated intensities, and scent types, using piezoelectric-based transducers. Results revealed that perceived scent intensity varied across these factors, allowing us to optimize AromaCanvas for the most energy-efficient design. 
The second experiment evaluated the VR experience and demonstrated that AromaCanvas significantly enhanced users' sense of presence, usability, and overall experience of appreciating and learning about Chinese paintings in VR, outperforming the conventional VR system.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100109"},"PeriodicalIF":0.0,"publicationDate":"2025-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144536178","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Self-review and feedback in virtual reality dialogues increase language markers of personal and emotional expression in an empathetic communication training experience","authors":"Anna C.M. Queiroz , Jeremy N. Bailenson , Kristen Pilner Blair , Daniel L. Schwartz , Candace Thille , Anthony D. Wagner","doi":"10.1016/j.cexr.2025.100108","DOIUrl":"10.1016/j.cexr.2025.100108","url":null,"abstract":"<div><div>Technological advancements have transformed how people communicate, work, and develop critical skills, especially in leadership. These changes will require nuanced skills, particularly empathetic communication, which is pivotal in managing teams and maintaining high performance in distributed work environments. Virtual reality has shown encouraging results in developing empathy and communication skills. Moreover, natural language processing techniques can provide a deeper understanding of communication patterns and nuances. However, there is still much to learn about how virtual reality can support active, empathetic communication training in the workplace. Hence, we first developed a virtual reality experience where participants could embody the manager and the employee in a performance review meeting. Then, we investigated the effects of reviewing one's performance and receiving feedback in a virtual reality perspective-taking task, compared to not reviewing or receiving feedback. The study was pre-registered and followed a pre- and post-test study design. One hundred nine participants were randomly assigned to one of the three conditions: perspective-taking, perspective-taking with self-review, or perspective-taking with self-review and feedback. Empathetic communication skills were measured through self-report measures, human-coded scoring of written and spoken behavior, and natural language processing.
Results showed that receiving feedback while reviewing one's performance in a perspective-taking task increased emotional expressions in oral communication. Repeating the interaction a second time increased the use of the “I” pronoun and decreased the use of “you.” Improvement in empathetic communication was not linked to feeling concern for others. We discuss implications for theories of learning via media and implications for practitioners.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100108"},"PeriodicalIF":0.0,"publicationDate":"2025-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144523879","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Immersive virtual reality learning environments for higher education: A student acceptance study","authors":"Stefan Aufenanger , Jasmin Bastian , Glória Bastos , Maria Castelhano , Célia Dias-Ferreira , Emmanuel Fokides , Damianos Gavalas , Vlasios Kasapakis , Androniki Agelada , Apostolos Kostas , George Koutromanos , Gregory Makrides , Leonel Morgado , Daniela Pedrosa , Tomasz Szemberg , Alivizos Sofos , Justyna Szpond","doi":"10.1016/j.cexr.2025.100105","DOIUrl":"10.1016/j.cexr.2025.100105","url":null,"abstract":"<div><div>The study investigates the integration of Virtual Reality Learning Environments (VRLEs) in academic teaching through the EU-funded \"REVEALING\" project. Researchers from Cyprus, Germany, Greece, Poland, and Portugal developed and evaluated five different immersive VRLEs, each focusing on diverse educational topics, including ancient Greek technology, sea urchin measurements, linear algebra, and historical expeditions. The study aims to determine effective instructional design principles for VRLEs and assess students' acceptance and learning outcomes.</div><div>The VRLEs were designed based on literature-derived principles that emphasise ease of tool usage, authentic experiences, and continuous feedback. Students from the participating universities explored these VR environments, providing feedback through a standardized questionnaire on aspects like immersion, ease of use, motivation, and emotions.</div><div>Results show that most participants positively engaged with the VRLEs, reporting high motivation and positive emotional responses, particularly for experiences involving interactivity. However, challenges like motion sickness and technical issues were noted, especially at one institution. 
The findings suggest that immersive VR experiences can significantly enhance motivation and engagement, but their effectiveness depends on careful alignment with pedagogical goals, design quality, and user experience considerations.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100105"},"PeriodicalIF":0.0,"publicationDate":"2025-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144501021","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"heARtbeat: Augmented reality for teaching electrocardiogram electrode placement","authors":"Cathal Breen , William Hurst","doi":"10.1016/j.cexr.2025.100107","DOIUrl":"10.1016/j.cexr.2025.100107","url":null,"abstract":"<div><div>The 12-lead Electrocardiogram (ECG) is currently one of the most widely used, operator-dependent clinical skills in healthcare. To enable correct interpretation of ECG recordings, electrodes must be placed in specific anatomic locations on the chest, legs and arms. Traditional learning practices for ECG electrode placement are not standardized and are burdensome for faculty staff and resources. Misplacement of ECG electrodes consequently affects the accuracy of interpretation and referral for medical treatment. Augmented Reality (AR) offers a safe and scalable means for students to learn and practice various procedures and skills by creating hyper-realistic simulations. Yet, within the healthcare domain, the main applications of AR are in the field of surgery, rehabilitation, and anatomy teaching. Therefore, this article documents the design of an AR application for teaching ECG electrode placement and the tool's Alpha evaluation by means of an expert validation and case study trial. Users of the application found it to be Inspiring (5.0) and Educational (4.9) but less effective in terms of User Friendliness (2.8) on a 7-point Likert scale.
Regarding the educational potential, 57 % of the end users found the tool to increase their knowledge of ECG placement.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100107"},"PeriodicalIF":0.0,"publicationDate":"2025-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144490612","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring virtual encounters in early childhood education: Results of a pilot study","authors":"Heide K. Lukosch , Cara Swit , Rene Novak , E. Jayne White","doi":"10.1016/j.cexr.2025.100104","DOIUrl":"10.1016/j.cexr.2025.100104","url":null,"abstract":"<div><div>Interpersonal skills such as empathy, intuition and sensing, emotional intelligence, and effective communication are crucial for teachers working with infants (aged birth to 2 years) in Early Childhood Education and Care (ECEC). However, due to the intimate and vulnerable nature of this relationship for infants, opportunities for students to rehearse these skills in real-life ECE contexts are limited. We co-designed an immersive virtual reality (VR) environment to simulate an ECEC context, with a virtual baby prototype, furniture such as a changing table and a cot, and toys a user could interact with. A pilot user study tested its efficacy with 17 participants: 12 students of a tertiary ECE program and 5 qualified ECE teachers. A questionnaire was used to collect data on usability, experience, and overall feedback on the VR baby experience. Results show that, while the majority of participants appreciated the audio-visual component of the VR environment, the limited haptic feedback and interaction options were a source of fear and discomfort. Participants reported being immersed in the learning environment but would have appreciated more realistic feedback mechanisms such as touch and breath.
We suggest that further research looks into the effect of advanced haptic feedback in VR when used for learning in ECE.</div></div>","PeriodicalId":100320,"journal":{"name":"Computers & Education: X Reality","volume":"7 ","pages":"Article 100104"},"PeriodicalIF":0.0,"publicationDate":"2025-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144481464","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}