VResin: Externalizing spatial memory into 3D sketch maps
Tianyi Xiao, Kevin Gonyop Kim, Jakub Krukar, Rajasirpi Subramaniyan, Peter Kiefer, Angela Schwering, Martin Raubal
International Journal of Human-Computer Studies, published 2024-06-19. DOI: 10.1016/j.ijhcs.2024.103322
Full text: https://www.sciencedirect.com/science/article/pii/S107158192400106X
Citations: 0
Abstract
An intuitive way to externalize spatial memory is to sketch it. Compared to traditional paper-based sketches, virtual reality (VR) creates new opportunities to investigate the 3D aspect of spatial memory, as it empowers users to express 3D information directly on a 3D interface. The goal of this study is to design a 3D sketch mapping tool for researchers and for non-expert users without sketching expertise, enabling them to externalize memories of spatial information after 3D-critical tasks. 3D sketching tools for VR exist, but the current mid-air sketching approach has two issues: (1) distortion of sketches due to depth perception errors, and (2) increased cognitive and sensorimotor demands due to increased degrees of freedom and the absence of physical support. To address these problems, we implemented VResin, a novel sketching interface that synergizes 3D mid-air sketching with 2D surface sketching to scaffold 3D sketching into a layer-by-layer process. An experimental study with 48 participants on multi-layer building scenarios showed that VResin supports users in creating less distorted sketches while maintaining the level of completeness and generalization compared to mid-air sketching in VR. We also demonstrate potential applications that can benefit from 3D sketch maps, and the suitability of VResin for a variety of building shapes.
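The abstract does not describe VResin's implementation, but the core idea of scaffolding mid-air sketching into a layer-by-layer process can be illustrated with a minimal sketch: each raw 3D controller sample is constrained to the active layer's horizontal drawing plane, so strokes within a layer stay planar and depth errors cannot distort them. All names here (`snap_to_layer`, `record_stroke`, `layer_height`) are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Stroke:
    """A planar stroke belonging to one building layer."""
    layer: int
    points: list  # (x, y, z) tuples, all on the layer's plane

def snap_to_layer(point, layer_index, layer_height):
    """Constrain a raw 3D controller sample to the active layer's
    horizontal plane by replacing its height with the layer's height."""
    x, y, _ = point
    return (x, y, layer_index * layer_height)

def record_stroke(raw_points, layer_index, layer_height=3.0):
    """Turn free mid-air samples into a single planar stroke on one layer."""
    snapped = [snap_to_layer(p, layer_index, layer_height) for p in raw_points]
    return Stroke(layer=layer_index, points=snapped)
```

Under this toy model, a full 3D sketch map is simply the stack of per-layer strokes; the 3D structure emerges from the layer indices rather than from unconstrained mid-air drawing.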
About the journal
The International Journal of Human-Computer Studies publishes original research over the whole spectrum of work relevant to the theory and practice of innovative interactive systems. The journal is inherently interdisciplinary, covering research in computing, artificial intelligence, psychology, linguistics, communication, design, engineering, and social organization, which is relevant to the design, analysis, evaluation and application of innovative interactive systems. Papers at the boundaries of these disciplines are especially welcome, as it is our view that interdisciplinary approaches are needed for producing theoretical insights in this complex area and for effective deployment of innovative technologies in concrete user communities.
Research areas relevant to the journal include, but are not limited to:
• Innovative interaction techniques
• Multimodal interaction
• Speech interaction
• Graphic interaction
• Natural language interaction
• Interaction in mobile and embedded systems
• Interface design and evaluation methodologies
• Design and evaluation of innovative interactive systems
• User interface prototyping and management systems
• Ubiquitous computing
• Wearable computers
• Pervasive computing
• Affective computing
• Empirical studies of user behaviour
• Empirical studies of programming and software engineering
• Computer supported cooperative work
• Computer mediated communication
• Virtual reality
• Mixed and augmented reality
• Intelligent user interfaces
• Presence
...