{"title":"Multi-Device Collaboration in Virtual Environments","authors":"S. Marks, David White","doi":"10.1145/3385378.3385381","DOIUrl":null,"url":null,"abstract":"We present a multi-device collaboration principle for virtual environments, using a combination of virtual and augmented reality (VR/AR) technology, used in the context of two educational applications, a virtual nasal cavity, and a visualisation of earthquake data. A head-mounted display (HMD) and a 3D-tracked tablet create two views of a shared virtual space. This allows two users to collaborate, utilising the strengths of each of the two technologies, e.g., intuitive spatial navigation and interaction in VR, and touch control of the visualisation parameters via the AR tablet. Touch gestures on the tablet are translated into a pointer ray in VR, so the users can easily indicate spatial features. The underlying networking infrastructure allows for an extension of this application to more than two users and across different rendering platforms.","PeriodicalId":169609,"journal":{"name":"Proceedings of the 2020 4th International Conference on Virtual and Augmented Reality Simulations","volume":"251 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2020 4th International Conference on Virtual and Augmented Reality Simulations","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3385378.3385381","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6
Abstract
We present a multi-device collaboration principle for virtual environments that combines virtual and augmented reality (VR/AR) technology, demonstrated in the context of two educational applications: a virtual nasal cavity and a visualisation of earthquake data. A head-mounted display (HMD) and a 3D-tracked tablet provide two views of a shared virtual space. This allows two users to collaborate while utilising the strengths of each technology, e.g., intuitive spatial navigation and interaction in VR, and touch control of the visualisation parameters on the AR tablet. Touch gestures on the tablet are translated into a pointer ray in VR, so that users can easily indicate spatial features. The underlying networking infrastructure allows the application to be extended to more than two users and across different rendering platforms.
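As a rough illustration of the touch-to-ray idea described in the abstract (this is not the authors' implementation; the function, parameter names, and the simple pinhole-camera assumptions below are ours), a touch point on the tracked tablet can be unprojected through the tablet's virtual camera into a world-space ray that the VR user then sees as a pointer:

```python
# Minimal sketch, assuming a pinhole camera model for the tablet's virtual view.
# All names, signatures, and defaults are illustrative, not taken from the paper.
import numpy as np

def touch_to_pointer_ray(touch_uv, tablet_position, tablet_rotation,
                         fov_y_deg=60.0, aspect=16 / 9):
    """Return (origin, direction) of a world-space pointer ray for a touch point.

    touch_uv        : (u, v) in [0, 1] x [0, 1], origin at the lower-left of the screen.
    tablet_position : (3,) world-space position of the tablet's tracked virtual camera.
    tablet_rotation : (3, 3) rotation matrix whose columns are the camera's
                      right, up, and forward axes in world space.
    """
    u, v = touch_uv
    # Half-extents of the image plane at unit distance from the camera.
    half_h = np.tan(np.radians(fov_y_deg) / 2.0)
    half_w = half_h * aspect
    # Direction to the touch point on the image plane, in camera coordinates (+Z forward).
    dir_cam = np.array([(2.0 * u - 1.0) * half_w,
                        (2.0 * v - 1.0) * half_h,
                        1.0])
    # Rotate into world space and normalise; the ray starts at the tablet camera.
    dir_world = tablet_rotation @ dir_cam
    dir_world /= np.linalg.norm(dir_world)
    return np.asarray(tablet_position, dtype=float), dir_world

# Example: a touch at the screen centre yields a ray along the tablet's forward axis.
origin, direction = touch_to_pointer_ray((0.5, 0.5),
                                         tablet_position=[0.0, 1.5, 0.0],
                                         tablet_rotation=np.eye(3))
```

In a shared-scene setup like the one described, the resulting origin and direction would be sent over the networking layer so every connected client can render the same pointer ray.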