{"title":"自然7DoF导航和交互在三维地理可视化","authors":"S. Stannus, A. Lucieer, W. Fu","doi":"10.1145/2671015.2671131","DOIUrl":null,"url":null,"abstract":"Geospatial data is becoming increasingly important for many uses, from expert scientific domains to social media interaction. A lot of this data is three-dimensional in nature and in recent years virtual globe software like Google Earth has made visualising such data much easier. However there has not been an equivalent improvement in the way in which users interact with the visualisations. Recognising that navigating a virtual world encompassing data at scales across many orders of magnitude requires a 7-dimensional representation of the view state, we have implemented our AeroSpace 7DoF interaction method [Stannus et al. 2014] in a Geovisualisation testbed built around NASA's World Wind virtual globe software. Our approach expands the two-point touch method [Hancock et al.] to 3D with pinches replacing touches. It surpasses previous two-point work [Wang et al. 2011] [Song et al.] [Schultheis et al. 2012] by allowing direct, bimanual, integral, simultaneous navigation with index-finger-moderated roll around the ambiguous axis, though the virtual hands were offset indirectly to mitigate occlusion. Our system was implemented in an immersive environment with 3 large stereo screens for display and an ARTrack3 system for tracking the users head (via retroreflective markers on stereoscopic glasses) and hands. Custom gloves were used to report pinch and pointing gesture contact states via Bluetooth and provide 5DoF tracking via markers on the index fingers.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"1 1","pages":"229-230"},"PeriodicalIF":0.0000,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Natural 7DoF navigation & interaction in 3D geovisualisations\",\"authors\":\"S. Stannus, A. Lucieer, W. Fu\",\"doi\":\"10.1145/2671015.2671131\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Geospatial data is becoming increasingly important for many uses, from expert scientific domains to social media interaction. A lot of this data is three-dimensional in nature and in recent years virtual globe software like Google Earth has made visualising such data much easier. However there has not been an equivalent improvement in the way in which users interact with the visualisations. Recognising that navigating a virtual world encompassing data at scales across many orders of magnitude requires a 7-dimensional representation of the view state, we have implemented our AeroSpace 7DoF interaction method [Stannus et al. 2014] in a Geovisualisation testbed built around NASA's World Wind virtual globe software. Our approach expands the two-point touch method [Hancock et al.] to 3D with pinches replacing touches. It surpasses previous two-point work [Wang et al. 2011] [Song et al.] [Schultheis et al. 2012] by allowing direct, bimanual, integral, simultaneous navigation with index-finger-moderated roll around the ambiguous axis, though the virtual hands were offset indirectly to mitigate occlusion. Our system was implemented in an immersive environment with 3 large stereo screens for display and an ARTrack3 system for tracking the users head (via retroreflective markers on stereoscopic glasses) and hands. 
Custom gloves were used to report pinch and pointing gesture contact states via Bluetooth and provide 5DoF tracking via markers on the index fingers.\",\"PeriodicalId\":93673,\"journal\":{\"name\":\"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology\",\"volume\":\"1 1\",\"pages\":\"229-230\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-11-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2671015.2671131\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2671015.2671131","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Geospatial data is becoming increasingly important for many uses, from expert scientific domains to social media interaction. Much of this data is three-dimensional in nature, and in recent years virtual globe software such as Google Earth has made visualising it much easier. However, there has not been an equivalent improvement in the way users interact with these visualisations. Recognising that navigating a virtual world encompassing data at scales spanning many orders of magnitude requires a 7-dimensional representation of the view state, we have implemented our AeroSpace 7DoF interaction method [Stannus et al. 2014] in a geovisualisation testbed built around NASA's World Wind virtual globe software. Our approach extends the two-point touch method [Hancock et al.] to 3D, with pinches replacing touches. It surpasses previous two-point work [Wang et al. 2011] [Song et al.] [Schultheis et al. 2012] by allowing direct, bimanual, integral, simultaneous navigation with index-finger-moderated roll around the ambiguous axis, though the virtual hands were offset indirectly to mitigate occlusion. Our system was implemented in an immersive environment with three large stereo screens for display and an ARTrack3 system for tracking the user's head (via retroreflective markers on stereoscopic glasses) and hands. Custom gloves were used to report pinch and pointing gesture contact states via Bluetooth and to provide 5DoF tracking via markers on the index fingers.
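To make the 7DoF framing concrete, the Python sketch below shows one way a bimanual two-pinch gesture can drive all seven view-state components at once, under the assumption that the view state decomposes into 3DoF translation, 3DoF rotation and 1DoF uniform scale. It illustrates the general two-point technique only, not the authors' AeroSpace implementation; all function and variable names are hypothetical. The roll about the hand-to-hand axis is the ambiguous component the abstract describes as moderated by the index finger, and it is deliberately left unresolved here.

# Minimal, hypothetical sketch of two-pinch 7DoF navigation:
# 3DoF translation + 3DoF rotation + 1DoF uniform scale derived from
# the motion of one pinch point per hand. Not the authors' code.
import numpy as np

def seven_dof_from_pinches(l_prev, r_prev, l_curr, r_curr):
    """Return (translation, scale, (axis, angle)) for a view update that
    keeps the two pinched world points attached to the two hands."""
    mid_prev = (l_prev + r_prev) / 2.0        # midpoint of the two pinches
    mid_curr = (l_curr + r_curr) / 2.0
    span_prev = r_prev - l_prev               # hand-to-hand vector
    span_curr = r_curr - l_curr

    translation = mid_curr - mid_prev                               # 3DoF pan
    scale = np.linalg.norm(span_curr) / np.linalg.norm(span_prev)   # 1DoF zoom

    # 3DoF rotation carrying the old hand-to-hand vector onto the new one.
    # Roll *about* that vector is geometrically ambiguous; in the paper it
    # is moderated by the index-finger orientation, which is omitted here.
    a = span_prev / np.linalg.norm(span_prev)
    b = span_curr / np.linalg.norm(span_curr)
    cross = np.cross(a, b)
    sin_angle = np.linalg.norm(cross)
    angle = np.arctan2(sin_angle, np.dot(a, b))
    axis = cross / sin_angle if sin_angle > 1e-9 else np.array([0.0, 0.0, 1.0])
    return translation, scale, (axis, angle)

# Example: the hands move symmetrically apart about a fixed midpoint,
# which reads as a pure 2x zoom with no pan and no rotation.
l0, r0 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
l1, r1 = np.array([-0.5, 0.0, 0.0]), np.array([1.5, 0.0, 0.0])
t, s, (axis, ang) = seven_dof_from_pinches(l0, r0, l1, r1)
print(t, s, ang)   # -> [0. 0. 0.] 2.0 0.0

In such a scheme the translation, rotation and scale components are all integral outputs of one continuous gesture, which is the property the abstract credits with surpassing earlier two-point work.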