Natural 7DoF navigation & interaction in 3D geovisualisations

S. Stannus, A. Lucieer, W. Fu
{"title":"Natural 7DoF navigation & interaction in 3D geovisualisations","authors":"S. Stannus, A. Lucieer, W. Fu","doi":"10.1145/2671015.2671131","DOIUrl":null,"url":null,"abstract":"Geospatial data is becoming increasingly important for many uses, from expert scientific domains to social media interaction. A lot of this data is three-dimensional in nature and in recent years virtual globe software like Google Earth has made visualising such data much easier. However there has not been an equivalent improvement in the way in which users interact with the visualisations. Recognising that navigating a virtual world encompassing data at scales across many orders of magnitude requires a 7-dimensional representation of the view state, we have implemented our AeroSpace 7DoF interaction method [Stannus et al. 2014] in a Geovisualisation testbed built around NASA's World Wind virtual globe software. Our approach expands the two-point touch method [Hancock et al.] to 3D with pinches replacing touches. It surpasses previous two-point work [Wang et al. 2011] [Song et al.] [Schultheis et al. 2012] by allowing direct, bimanual, integral, simultaneous navigation with index-finger-moderated roll around the ambiguous axis, though the virtual hands were offset indirectly to mitigate occlusion. Our system was implemented in an immersive environment with 3 large stereo screens for display and an ARTrack3 system for tracking the users head (via retroreflective markers on stereoscopic glasses) and hands. Custom gloves were used to report pinch and pointing gesture contact states via Bluetooth and provide 5DoF tracking via markers on the index fingers.","PeriodicalId":93673,"journal":{"name":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","volume":"1 1","pages":"229-230"},"PeriodicalIF":0.0000,"publicationDate":"2014-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ACM Symposium on Virtual Reality Software and Technology. ACM Symposium on Virtual Reality Software and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2671015.2671131","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

Geospatial data is becoming increasingly important for many uses, from expert scientific domains to social media interaction. Much of this data is three-dimensional in nature, and in recent years virtual globe software like Google Earth has made visualising such data much easier. However, there has not been an equivalent improvement in the way in which users interact with the visualisations. Recognising that navigating a virtual world encompassing data at scales across many orders of magnitude requires a 7-dimensional representation of the view state, we have implemented our AeroSpace 7DoF interaction method [Stannus et al. 2014] in a geovisualisation testbed built around NASA's World Wind virtual globe software. Our approach expands the two-point touch method [Hancock et al.] to 3D, with pinches replacing touches. It surpasses previous two-point work [Wang et al. 2011] [Song et al.] [Schultheis et al. 2012] by allowing direct, bimanual, integral, simultaneous navigation with index-finger-moderated roll around the ambiguous axis, though the virtual hands were offset indirectly to mitigate occlusion. Our system was implemented in an immersive environment with three large stereo screens for display and an ARTrack3 system for tracking the user's head (via retroreflective markers on stereoscopic glasses) and hands. Custom gloves were used to report pinch and pointing gesture contact states via Bluetooth and to provide 5DoF tracking via markers on the index fingers.
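To make the idea concrete, the sketch below illustrates a 7DoF view state (3 DoF position, 3 DoF orientation as a unit quaternion, 1 DoF uniform scale) and a two-point ("two-pinch") update that derives translation, scale, and rotation simultaneously from two moving pinch points. This is a minimal illustration under assumed conventions, not the authors' AeroSpace implementation; all names are hypothetical, and the ambiguous roll about the inter-pinch axis, which the paper resolves via index-finger orientation, is deliberately left unmodelled.

```python
"""Illustrative sketch of a 7DoF view state and a two-pinch navigation update.
Assumes pinch points are reported in world coordinates; not the paper's code."""

import numpy as np


def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    axis = axis / np.linalg.norm(axis)
    half = 0.5 * angle
    return np.concatenate(([np.cos(half)], np.sin(half) * axis))


def quat_multiply(q, r):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])


def two_pinch_update(position, rotation, scale,
                     left_old, right_old, left_new, right_new):
    """Derive simultaneous translation, scale, and rotation changes from the
    motion of two pinch points, in the spirit of the two-point method."""
    # Translation: follow the midpoint of the two pinches.
    position = position + 0.5 * ((left_new + right_new) - (left_old + right_old))

    # Scale: ratio of pinch separations (spreading the hands zooms in).
    sep_old = right_old - left_old
    sep_new = right_new - left_new
    d_old, d_new = np.linalg.norm(sep_old), np.linalg.norm(sep_new)
    if d_old > 1e-6 and d_new > 1e-6:
        scale = scale * (d_new / d_old)

        # Rotation: align the old inter-pinch axis with the new one.
        # Roll about that axis is the ambiguous seventh DoF; the paper moderates
        # it with index-finger orientation, which is omitted here.
        axis = np.cross(sep_old, sep_new)
        if np.linalg.norm(axis) > 1e-9:
            cos_angle = np.clip(np.dot(sep_old, sep_new) / (d_old * d_new), -1.0, 1.0)
            dq = quat_from_axis_angle(axis, np.arccos(cos_angle))
            rotation = quat_multiply(dq, rotation)

    return position, rotation, scale


if __name__ == "__main__":
    # Spread two pinches apart symmetrically: pure zoom, no net translation or rotation.
    pos, rot, sc = two_pinch_update(
        np.zeros(3), np.array([1.0, 0.0, 0.0, 0.0]), 1.0,
        np.array([-0.1, 0.0, 0.0]), np.array([0.1, 0.0, 0.0]),
        np.array([-0.2, 0.0, 0.0]), np.array([0.2, 0.0, 0.0]))
    print(sc)  # 2.0
```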