{"title":"Between immersion and usability: A comparative study of 2D and mixed reality interfaces for remote music making","authors":"Alberto Boem, Matteo Tomasetti, Luca Turchet","doi":"10.1016/j.ijhcs.2025.103586","DOIUrl":null,"url":null,"abstract":"<div><div>Networked Music Performance (NMP) systems have traditionally focused on optimizing audio transmission for remote collaboration, often relying on basic 2D video feeds that lack the spatial awareness and visual cues essential for coordinated musical performance. While Mixed Reality (MR) technology offers promising enhancements for musical collaboration, its integration into NMP systems remains understudied—particularly regarding the balance between immersion and usability. This study addresses that gap by comparing traditional 2D video conferencing setups with MR environments featuring point cloud representations and spatial audio. Using a simulation-based approach with pre-recorded stimuli, we conducted an experimental study with professional electric guitar players (N = 16) to examine how both setups affect musical performance, movement, and sense of presence, combining quantitative motion analysis with qualitative feedback. Head movement analysis revealed significantly reduced motion in MR compared to 2D conditions (p <span><math><mo><</mo></math></span> 0.001), while detrended fluctuation analysis indicated more structured movement patterns in MR (p <span><math><mo><</mo></math></span> 0.01). Post-task evaluations showed that although MR enhanced immersion and presence (p <span><math><mo><</mo></math></span> 0.001), participants reported greater performance coherence and fewer technical interferences in the 2D condition (p <span><math><mo><</mo></math></span> 0.001). Spatial audio had minimal impact on user experience, with visual elements drawing most of the participants’ attention. These findings underscore a fundamental tension between immersion and physical expressiveness in MR-based music performance, suggesting that future systems should prioritize ergonomics alongside technological innovation. This research contributes to a deeper understanding of the potential and limitations of MR in musical collaboration within the emerging Musical Metaverse.</div></div>","PeriodicalId":54955,"journal":{"name":"International Journal of Human-Computer Studies","volume":"203 ","pages":"Article 103586"},"PeriodicalIF":5.1000,"publicationDate":"2025-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Human-Computer Studies","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1071581925001430","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Citations: 0
Abstract
Networked Music Performance (NMP) systems have traditionally focused on optimizing audio transmission for remote collaboration, often relying on basic 2D video feeds that lack the spatial awareness and visual cues essential for coordinated musical performance. While Mixed Reality (MR) technology offers promising enhancements for musical collaboration, its integration into NMP systems remains understudied—particularly regarding the balance between immersion and usability. This study addresses that gap by comparing traditional 2D video conferencing setups with MR environments featuring point cloud representations and spatial audio. Using a simulation-based approach with pre-recorded stimuli, we conducted an experimental study with professional electric guitar players (N = 16) to examine how both setups affect musical performance, movement, and sense of presence, combining quantitative motion analysis with qualitative feedback. Head movement analysis revealed significantly reduced motion in MR compared to 2D conditions (p < 0.001), while detrended fluctuation analysis indicated more structured movement patterns in MR (p < 0.01). Post-task evaluations showed that although MR enhanced immersion and presence (p < 0.001), participants reported greater performance coherence and fewer technical interferences in the 2D condition (p < 0.001). Spatial audio had minimal impact on user experience, with visual elements drawing most of the participants’ attention. These findings underscore a fundamental tension between immersion and physical expressiveness in MR-based music performance, suggesting that future systems should prioritize ergonomics alongside technological innovation. This research contributes to a deeper understanding of the potential and limitations of MR in musical collaboration within the emerging Musical Metaverse.
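For readers unfamiliar with detrended fluctuation analysis (DFA), the method referenced in the abstract, the following is a minimal sketch of how DFA can be applied to a one-dimensional head-movement time series. This is not the authors' analysis pipeline; the function name, the synthetic signal, and the 60 Hz sampling rate are assumptions made purely for illustration.

```python
# Minimal DFA sketch for a 1D movement signal (illustrative only, not the
# authors' pipeline). A scaling exponent alpha near 0.5 indicates
# uncorrelated, noise-like movement; values approaching 1 indicate more
# structured, long-range correlated movement.
import numpy as np

def dfa(signal, window_sizes):
    """Return the DFA scaling exponent alpha for a 1D signal."""
    x = np.asarray(signal, dtype=float)
    # Step 1: integrate the mean-subtracted signal (the "profile").
    profile = np.cumsum(x - x.mean())

    fluctuations = []
    for n in window_sizes:
        # Step 2: split the profile into non-overlapping windows of length n.
        n_windows = len(profile) // n
        segments = profile[: n_windows * n].reshape(n_windows, n)

        # Step 3: remove a linear trend from each window, take the RMS residual.
        t = np.arange(n)
        rms = []
        for seg in segments:
            coeffs = np.polyfit(t, seg, deg=1)
            trend = np.polyval(coeffs, t)
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        # Step 4: average fluctuation for this window size.
        fluctuations.append(np.mean(rms))

    # Step 5: alpha is the slope of log F(n) versus log n.
    alpha, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), deg=1)
    return alpha

if __name__ == "__main__":
    # Hypothetical usage: synthetic "head displacement" sampled at 60 Hz.
    rng = np.random.default_rng(0)
    head_displacement = rng.standard_normal(6000)  # white noise, alpha ~ 0.5
    sizes = np.unique(np.logspace(np.log10(16), np.log10(600), 12).astype(int))
    print(f"DFA alpha = {dfa(head_displacement, sizes):.2f}")
```

Comparing the exponent obtained for each condition (2D vs. MR) is one way such an analysis can characterize whether movement is more or less structured, which is the kind of contrast the abstract reports.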
About the journal:
The International Journal of Human-Computer Studies publishes original research over the whole spectrum of work relevant to the theory and practice of innovative interactive systems. The journal is inherently interdisciplinary, covering research in computing, artificial intelligence, psychology, linguistics, communication, design, engineering, and social organization, which is relevant to the design, analysis, evaluation and application of innovative interactive systems. Papers at the boundaries of these disciplines are especially welcome, as it is our view that interdisciplinary approaches are needed for producing theoretical insights in this complex area and for effective deployment of innovative technologies in concrete user communities.
Research areas relevant to the journal include, but are not limited to:
• Innovative interaction techniques
• Multimodal interaction
• Speech interaction
• Graphic interaction
• Natural language interaction
• Interaction in mobile and embedded systems
• Interface design and evaluation methodologies
• Design and evaluation of innovative interactive systems
• User interface prototyping and management systems
• Ubiquitous computing
• Wearable computers
• Pervasive computing
• Affective computing
• Empirical studies of user behaviour
• Empirical studies of programming and software engineering
• Computer supported cooperative work
• Computer mediated communication
• Virtual reality
• Mixed and augmented reality
• Intelligent user interfaces
• Presence
...