SHIFT-Sliding and DEPTH-POP for 3D Positioning

Junwei Sun, W. Stuerzlinger, Dmitri Shuralyov
{"title":"SHIFT-Sliding and DEPTH-POP for 3D Positioning","authors":"Junwei Sun, W. Stuerzlinger, Dmitri Shuralyov","doi":"10.1145/2983310.2985748","DOIUrl":null,"url":null,"abstract":"Moving objects is an important task in 3D user interfaces. We describe two new techniques for 3D positioning, designed for a mouse, but usable with other input devices. The techniques enable rapid, yet easy-to-use positioning of objects in 3D scenes. With sliding, the object follows the cursor and moves on the surfaces of the scene. Our techniques enable precise positioning of constrained objects. Sliding assumes that by default objects stay in contact with the scene's front surfaces, are always at least partially visible, and do not interpenetrate other objects. With our new Shift-Sliding method the user can override these default assumptions and lift objects into the air or make them collide with other objects. Shift-Sliding uses the local coordinate system of the surface that the object was last in contact with, which is a new form of context-dependent manipulation. We also present Depth-Pop, which maps mouse wheel actions to all object positions along the mouse ray, where the object meets the default assumptions for sliding. For efficiency, both methods use frame buffer techniques. Two user studies show that the new techniques significantly speed up common 3D positioning tasks.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"53 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2016 Symposium on Spatial User Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2983310.2985748","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 9

Abstract

Moving objects is an important task in 3D user interfaces. We describe two new techniques for 3D positioning, designed for a mouse, but usable with other input devices. The techniques enable rapid, yet easy-to-use positioning of objects in 3D scenes. With sliding, the object follows the cursor and moves on the surfaces of the scene. Our techniques enable precise positioning of constrained objects. Sliding assumes that by default objects stay in contact with the scene's front surfaces, are always at least partially visible, and do not interpenetrate other objects. With our new Shift-Sliding method the user can override these default assumptions and lift objects into the air or make them collide with other objects. Shift-Sliding uses the local coordinate system of the surface that the object was last in contact with, which is a new form of context-dependent manipulation. We also present Depth-Pop, which maps mouse wheel actions to all object positions along the mouse ray, where the object meets the default assumptions for sliding. For efficiency, both methods use frame buffer techniques. Two user studies show that the new techniques significantly speed up common 3D positioning tasks.
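The abstract describes Depth-Pop only at a high level: the mouse wheel steps an object between positions along the ray from the camera through the cursor where the default sliding assumptions hold. The minimal Python sketch below illustrates that idea under stated assumptions; it is not the authors' frame-buffer implementation. The `DepthPop` class, the `point_on_ray` helper, and the `candidate_depths` list are hypothetical names, and the candidate depths are assumed to be pre-computed (the paper derives them from the scene via frame-buffer techniques).

```python
# Illustrative sketch only, not the authors' frame-buffer implementation.
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def point_on_ray(origin: Vec3, direction: Vec3, t: float) -> Vec3:
    """Return origin + t * direction."""
    return tuple(o + t * d for o, d in zip(origin, direction))

@dataclass
class DepthPop:
    ray_origin: Vec3               # camera position
    ray_dir: Vec3                  # normalized direction through the cursor
    candidate_depths: List[float]  # ray parameters of valid resting positions,
                                   # ordered front to back (assumed pre-computed
                                   # by testing the object against the scene)
    index: int = 0                 # currently selected candidate

    def on_mouse_wheel(self, steps: int) -> Vec3:
        """Step to the next/previous valid position along the cursor ray."""
        self.index = max(0, min(len(self.candidate_depths) - 1, self.index + steps))
        return point_on_ray(self.ray_origin, self.ray_dir,
                            self.candidate_depths[self.index])

# Example: three valid resting depths along the cursor ray.
dp = DepthPop(ray_origin=(0.0, 1.6, 0.0), ray_dir=(0.0, 0.0, -1.0),
              candidate_depths=[2.0, 5.5, 9.0])
print(dp.on_mouse_wheel(+1))  # pop the object back to the second candidate
print(dp.on_mouse_wheel(-1))  # pop it forward to the first again
```

In the paper, the candidate positions are exactly those where the object satisfies the default sliding assumptions (contact with a front surface, partial visibility, no interpenetration); in this sketch they are simply given as depths along the ray.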