User Behaviour Analysis of Mixed Reality Remote Collaboration with a Hybrid View Interface
Lei Gao, Huidong Bai, M. Billinghurst, R. Lindeman
Proceedings of the 32nd Australian Conference on Human-Computer Interaction, 2 December 2020. DOI: 10.1145/3441000.3441038
In this paper, we present a Mixed Reality (MR) remote collaboration system that enables hybrid view sharing from the local worker to the remote expert for real-time remote guidance. The local worker can share either a 2D first-person view, a 360° live view with a fixed viewpoint, or a 3D free view within a point-cloud reconstruction of the local environment. The remote expert can access these views and freely switch between them in Virtual Reality (VR) for better awareness of the local worker's surroundings. We investigate the remote experts' behaviour while using the hybrid view interface during typical pick-and-place remote guiding tasks. We found that the remote experts preferred to learn the local physical layout and search for targets from a global perspective in the 3D free view. The results also showed that the experts chose the 360° live view with independent view control over the high-resolution 2D first-person view for controlling the task procedure and checking the local worker's actions. Our study informs the viewpoint interface design for an MR remote collaboration system in various guiding scenarios.
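To make the hybrid view interface described in the abstract more concrete, the sketch below models the three shareable views and the remote expert's free switching between them, together with a simple per-mode dwell-time log of the kind a behaviour analysis might use. This is a minimal illustration only: the paper does not publish code, and all names here (ViewMode, HybridViewController, switch_to) are hypothetical.

```python
# Illustrative sketch only; names and structure are assumptions, not the authors' implementation.
import time
from enum import Enum, auto


class ViewMode(Enum):
    FIRST_PERSON_2D = auto()   # high-resolution 2D view tied to the local worker's viewpoint
    LIVE_360 = auto()          # 360° live view with a fixed viewpoint but independent view control
    FREE_VIEW_3D = auto()      # free viewpoint inside the point-cloud reconstruction


class HybridViewController:
    """Tracks the remote expert's current view and the time spent in each mode."""

    def __init__(self, initial: ViewMode = ViewMode.FIRST_PERSON_2D) -> None:
        self.current = initial
        self._entered_at = time.monotonic()
        self.dwell_seconds = {mode: 0.0 for mode in ViewMode}

    def switch_to(self, mode: ViewMode) -> None:
        """Switch the expert's VR display to another shared view and log dwell time."""
        now = time.monotonic()
        self.dwell_seconds[self.current] += now - self._entered_at
        self.current = mode
        self._entered_at = now


if __name__ == "__main__":
    controller = HybridViewController()
    controller.switch_to(ViewMode.FREE_VIEW_3D)   # e.g. survey the room layout globally
    controller.switch_to(ViewMode.LIVE_360)       # e.g. monitor the local worker's actions
    print(controller.current, controller.dwell_seconds)
```

Logging dwell time per mode in this way would let an analysis compare, for instance, how long experts spend in the 3D free view during target search versus the 360° live view during task monitoring.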