2010 IEEE Symposium on 3D User Interfaces (3DUI): Latest Publications

Mobile devices as multi-DOF controllers
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5444700
Nicholas Katzakis, M. Hori
Abstract: Conventional input devices such as the mouse and keyboard lack intuitiveness when it comes to 3D manipulation tasks. In this paper, we explore the use of accelerometer- and magnetometer-equipped mobile phones as 3-DOF controllers in a 3D rotation task. We put the mobile phone up against the established standards, a mouse and a touch pen, and compare their performance. Our preliminary evaluation indicates that for this type of task, with only 5 minutes of practice, the mobile device is significantly faster than both the mouse and the touch pen.
(An illustrative sketch of the underlying orientation estimation follows below.)
Citations: 17
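The controller in this paper derives 3-DOF orientation from the phone's accelerometer and magnetometer. The paper itself does not publish code, so the sketch below is only a minimal illustration of one common tilt-compensated formulation (roll and pitch from the gravity vector, yaw from the tilt-compensated magnetic field). The axis convention, the assumption of a roughly stationary device, and the function name orientation_from_sensors are assumptions for illustration, not the authors' implementation.

```python
import math

def orientation_from_sensors(ax, ay, az, mx, my, mz):
    """Estimate roll, pitch, yaw (radians) from one accelerometer sample
    (ax, ay, az) and one magnetometer sample (mx, my, mz).

    Assumes the device is held roughly still, so the accelerometer measures
    gravity only, and assumes an aerospace-style axis convention. Real
    controllers would also filter or fuse successive samples to reduce noise.
    """
    # Roll and pitch follow directly from the gravity vector.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))

    # Tilt-compensate the magnetometer, then take the heading (yaw).
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch)
            + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-my_h, mx_h)
    return roll, pitch, yaw

if __name__ == "__main__":
    # Device lying flat and pointing along magnetic north: all angles near 0.
    print(orientation_from_sensors(0.0, 0.0, 9.81, 30.0, 0.0, -40.0))
```

In a real controller the resulting angles would typically be smoothed, for example with a complementary filter, before driving the 3D rotation task.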
Improving co-located collaboration with show-through techniques
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5444719
F. Argelaguet, André Kunert, Alexander Kulik, B. Fröhlich
Abstract: Multi-user virtual reality systems enable natural interaction with shared virtual worlds. Users can talk to each other, gesture and point into the virtual scenery as if it were real. As in reality, referring to objects by pointing often results in situations where the objects are occluded from the other users' viewpoints. While in reality this problem can only be solved by adapting the viewing position, specialized individual views of the shared virtual scene enable various other solutions. As one such solution, we propose show-through techniques to make sure that the objects one is pointing to can be seen by others. We analyzed the influence of such augmented viewing techniques on the spatial understanding of the scene, the rapidity of mutual information exchange, and the social behavior of users. The results of our user study revealed that show-through techniques support spatial understanding on a similar level as walking around to achieve a non-occluded view of specified objects. However, advantages in terms of comfort, user acceptance, and compliance with social protocols could be shown, which suggests that virtual reality techniques can in fact be better than 3D reality.
(An illustrative sketch of a per-viewer show-through occluder test follows below.)
Citations: 20
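The abstract describes show-through rendering, i.e. making occluders transparent in the views of users who cannot see the referenced object, but gives no implementation details. The following sketch shows one simple way to realize that per viewer, using bounding spheres and a line-of-sight test; the data structures and the name apply_show_through are illustrative assumptions, not the authors' system.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SceneObject:
    name: str
    center: np.ndarray          # bounding-sphere center (world space)
    radius: float
    opacity: float = 1.0

def segment_hits_sphere(p0, p1, center, radius):
    """True if the straight segment p0 -> p1 passes through the sphere."""
    d = p1 - p0
    t = np.clip(np.dot(center - p0, d) / np.dot(d, d), 0.0, 1.0)
    closest = p0 + t * d
    return np.linalg.norm(center - closest) <= radius

def apply_show_through(objects, eye, target_obj, ghost_opacity=0.25):
    """For one viewer: fade every object that blocks the line of sight from
    this viewer's eye to the referenced target object, restore the rest."""
    for obj in objects:
        if obj is target_obj:
            obj.opacity = 1.0
            continue
        blocking = segment_hits_sphere(eye, target_obj.center,
                                       obj.center, obj.radius)
        obj.opacity = ghost_opacity if blocking else 1.0
```

In a multi-user setup this test would run once per viewer with that viewer's eye position, so occluders are faded only in the views where they actually hide the referenced object.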
jReality — interactive audiovisual applications across virtual environments
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5444708
Peter Brinkmann, Charles G. Gunn, Steffen Weißmann
Abstract: jReality is a Java scene graph library for creating real-time interactive applications with 3D computer graphics and spatialized audio. Applications written for jReality will run unchanged on software and hardware platforms ranging from desktop machines with a single screen and stereo speakers to immersive virtual environments with motion tracking, multiple screens with 3D stereo projection, and multi-channel audio setups. In addition to Euclidean geometry, jReality supports hyperbolic and elliptic geometry. jReality comes with a number of graphics rendering backends, ranging from pure software to hardware-accelerated to photorealistic. A distributed backend is available for cluster-based virtual environments. Audio backends range from a basic stereo renderer to a high-performance Ambisonics renderer for arbitrary 3D speaker configurations. jReality achieves device-independent user interaction through a layer of abstract input devices that are matched at runtime with available physical devices, so that a jReality application will work with keyboard and mouse in a desktop environment as well as with motion tracking in a virtual environment.
(An illustrative sketch of the abstract-device matching pattern follows below.)
Citations: 1
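jReality's actual Java API is not reproduced here. The sketch below only illustrates, in Python and with hypothetical class names (AbstractSlot, DeviceBackend, bind_slots), the general pattern the abstract describes: the application reads named abstract input slots, and whatever physical devices are present at runtime are matched to those slots.

```python
class AbstractSlot:
    """A named logical input the application reads, e.g. 'PointerTransform'."""
    def __init__(self, name):
        self.name = name
        self.value = None

class DeviceBackend:
    """A physical input source that can drive certain abstract slots."""
    provides = ()                        # slot names this backend can feed
    def poll(self):                      # -> {slot_name: value}
        raise NotImplementedError

class MouseBackend(DeviceBackend):
    provides = ("PointerTransform",)
    def poll(self):
        # A desktop backend would build a picking ray from the mouse position.
        return {"PointerTransform": ("mouse-ray", 0.0, 0.0)}

class TrackerBackend(DeviceBackend):
    provides = ("PointerTransform", "HeadTransform")
    def poll(self):
        # A tracked-wand backend would return full 6-DOF poses instead.
        return {"PointerTransform": "wand-pose", "HeadTransform": "head-pose"}

def bind_slots(slots, available_backends):
    """Match each abstract slot to the first available backend that provides it."""
    bindings = {}
    for slot in slots:
        for backend in available_backends:
            if slot.name in backend.provides:
                bindings[slot.name] = backend
                break
    return bindings
```

The same application code then reads 'PointerTransform' regardless of whether a mouse or a tracked wand happens to be connected.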
Towards a handheld stereo projector system for viewing and interacting in virtual worlds
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5444701
Andrew Miller, J. Laviola
Abstract: We present a proof-of-concept implementation of a handheld stereo projection display system for virtual worlds. We utilize a single pico projector coupled with a six-DOF tracker to generate real-time stereo imagery that can be projected on walls or a projection screen. We discuss the iterative design of our display system, including three attempts at modifying our portable projector to produce stereo imagery, and the hardware and software tradeoff decisions made in our prototype.
(An illustrative sketch of deriving stereo camera positions from the tracked pose follows below.)
Citations: 2
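Rendering stereo imagery from a tracked handheld unit requires placing two virtual cameras relative to the 6-DOF pose. The sketch below shows only that first step under simple assumptions (the virtual viewpoint is attached to the tracked device, the pose is given as a 3x3 rotation matrix, and a fixed interpupillary distance is used); it is not the authors' pipeline, and the off-axis frustum setup a full renderer would need is omitted.

```python
import numpy as np

def stereo_eye_poses(position, rotation, ipd=0.065):
    """Given a tracked 6-DOF pose (position: 3-vector, rotation: 3x3 matrix
    mapping device axes to world axes), return world-space positions for the
    left and right virtual cameras, separated by an assumed interpupillary
    distance of 6.5 cm along the device's x-axis."""
    right = rotation[:, 0]                   # device x-axis in world space
    half = 0.5 * ipd * right
    return position - half, position + half

if __name__ == "__main__":
    flat = np.eye(3)                         # device aligned with world axes
    left_eye, right_eye = stereo_eye_poses(np.array([0.0, 1.5, 0.0]), flat)
    print(left_eye, right_eye)
```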
A multi-touch enabled human-transporter metaphor for virtual 3D traveling
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5444715
Dimitar Valkov, Frank Steinicke, G. Bruder, K. Hinrichs
Abstract: In this tech-note we demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive 3D environments. Geographic Information Systems (GIS) are well suited as a complex testbed for the evaluation of user interfaces based on multi-modal input. Recent developments in the area of interactive surfaces enable the construction of low-cost multi-touch displays and relatively inexpensive sensor technology to detect foot gestures, which allows us to explore these input modalities for virtual reality environments. In this tech-note, we describe an intuitive 3D user interface metaphor and corresponding hardware, which combine multi-touch hand and foot gestures for interaction with spatial data.
(An illustrative sketch of one possible hand/foot travel mapping follows below.)
Citations: 14
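The tech-note does not specify the exact mapping from hand and foot gestures to motion, so the sketch below is a deliberately simple, Segway-style placeholder: foot lean controls forward/backward speed and a two-finger twist controls the turning rate. All names, gains, and the gesture semantics are assumptions for illustration only.

```python
import math

def transporter_update(pose, lean, twist, dt, max_speed=2.0, max_turn=1.0):
    """One plausible 'human transporter' style mapping (illustrative only).

    lean  in [-1, 1]: front/back weight shift from the foot sensor,
                      drives forward/backward speed.
    twist in [-1, 1]: rotation of a two-finger touch gesture,
                      drives the turning rate.
    pose is (x, y, heading) on the ground plane, heading in radians.
    """
    x, y, heading = pose
    heading += twist * max_turn * dt
    speed = lean * max_speed
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return (x, y, heading)
```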
A tangible user interface using spatial Augmented Reality
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5444699
L. Chan, H. Lau
Abstract: In this paper, we describe the novel implementation of a tangible user interface framework, namely the MagicPad, inspired by the concept of spatial Augmented Reality. By using an infrared pen with any flat surface, such as a paper pad that receives projected images from a projector, a user is able to perform a variety of interactive visualization and manipulation tasks in 3D space. Two implementations using the MagicPad framework are presented: a magic-lens-like interface inside a CAVE-like system and a virtual book in an art installation.
(An illustrative sketch of mapping a detected pen position onto the pad follows below.)
Citations: 13
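The abstract does not detail how a detected infrared pen position is related to the projected pad. A common way to do this for a flat surface is a planar homography between the camera image and the pad's own 2D coordinates; the sketch below shows that step with a standard four-point DLT solve. The function names and the use of NumPy are illustrative assumptions, not details from the paper.

```python
import numpy as np

def homography_from_corners(src, dst):
    """Solve for the 3x3 homography mapping four src points (e.g. the pad's
    corners as seen in the camera image) onto four dst points (the pad's own
    2D coordinates). Standard DLT linear system with h33 fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def pen_to_pad(H, pen_xy):
    """Map a detected infrared-pen blob (camera pixels) into pad coordinates."""
    p = H @ np.array([pen_xy[0], pen_xy[1], 1.0])
    return p[:2] / p[2]

if __name__ == "__main__":
    # Pad corners detected at these pixels correspond to a unit square on the pad.
    H = homography_from_corners([(100, 80), (520, 95), (510, 400), (90, 385)],
                                [(0, 0), (1, 0), (1, 1), (0, 1)])
    print(pen_to_pad(H, (300, 240)))
```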
Walking up and down in immersive virtual worlds: Novel interactive techniques based on visual feedback
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5446238
M. Marchal, A. Lécuyer, G. Cirio, L. Bonnet, Mathieu Emily
Abstract: We introduce novel interactive techniques to simulate the sensation of walking up and down in immersive virtual worlds based on visual feedback. Our method consists of modifying the motion of the virtual subjective camera while the user is really walking in an immersive virtual environment. The modification of the virtual viewpoint is a function of the variations in the height of the virtual ground. Three effects are proposed: (1) a straightforward modification of the camera's height, (2) a modification of the camera's navigation velocity, and (3) a modification of the camera's orientation. They were tested in an immersive virtual reality setup in which the user is really walking. A desktop configuration where the user is seated and controls input devices was also tested and compared to the real walking configuration. Experimental results show that our visual techniques are very efficient for the simulation of two canonical shapes: bumps and holes located on the ground. Interestingly, a strong "orientation-height illusion" is found, as changes in pitch viewing orientation produce a perception of height changes (although the camera's height remains strictly the same in this case). Our visual effects could be applied in various virtual reality applications such as urban or architectural project reviews or training, as well as in video games, in order to provide the sensation of walking on uneven ground.
(An illustrative sketch of the three camera effects follows below.)
Citations: 40
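The three visual effects are described qualitatively in the abstract; the gain values and exact functional forms are not given there. The sketch below therefore uses simple placeholder gains purely to make the idea concrete: camera height follows the virtual ground height, navigation speed is scaled by the slope, and pitch is tilted with the slope.

```python
import math

def apply_ground_effects(cam, ground_height, ground_slope,
                         k_height=1.0, k_speed=0.5, k_pitch=0.3):
    """Apply the three visual effects to a walking camera.

    cam holds 'base_speed' plus the derived 'height', 'speed' and 'pitch';
    ground_height is the virtual ground height under the user and
    ground_slope its derivative along the walking direction. The gains k_*
    and the 1.7 m eye level are placeholders, not the paper's values.
    """
    eye_level = 1.7
    cam["height"] = eye_level + k_height * ground_height                       # effect 1: camera height
    cam["speed"] = cam["base_speed"] * max(0.1, 1.0 - k_speed * ground_slope)  # effect 2: slower uphill
    cam["pitch"] = k_pitch * math.atan(ground_slope)                           # effect 3: pitch follows slope
    return cam

if __name__ == "__main__":
    cam = {"base_speed": 1.4, "height": 1.7, "speed": 1.4, "pitch": 0.0}
    print(apply_ground_effects(cam, ground_height=0.2, ground_slope=0.1))
```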
A framework for volume segmentation and visualization using Augmented Reality
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5444707
Takehiro Tawara, K. Ono
Abstract: We propose a two-handed direct manipulation system to achieve complex volume segmentation of CT/MRI data in Augmented Reality with a remote controller attached to a motion-tracking cube. At the same time, the segmented data is displayed by direct volume rendering using a programmable GPU. Our system achieves real-time visualization of volume-data modifications with complex shading, including transparency control by changing transfer functions, display of arbitrary cross sections, and rendering of multiple materials using a local illumination model. Our goal is to build a system that facilitates direct manipulation of volumetric CT/MRI data for segmentation in Augmented Reality. Volume segmentation is a challenging problem, and segmented data plays an important role in visualization and analysis.
(An illustrative sketch of transfer-function-based direct volume rendering follows below.)
Citations: 18
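Direct volume rendering with transfer-function-based transparency control, as mentioned in the abstract, boils down to mapping sampled intensities to color and opacity and compositing them along each viewing ray. The sketch below is a CPU toy version of that loop (the paper runs the equivalent in a GPU shader); the specific ramp used as a transfer function is an arbitrary placeholder.

```python
import numpy as np

def transfer_function(intensity):
    """Map a normalized CT/MRI intensity in [0, 1] to (rgb, alpha).
    A toy ramp: low intensities stay transparent, high intensities become
    opaque bone-white. Real systems expose this mapping as an editable curve."""
    alpha = float(np.clip((intensity - 0.3) / 0.7, 0.0, 1.0)) ** 2
    color = np.array([1.0, 0.9, 0.8]) * intensity
    return color, alpha

def composite_ray(samples, step=1.0):
    """Front-to-back alpha compositing of intensity samples along one viewing
    ray: the core loop of a direct volume renderer (here on the CPU)."""
    out_color = np.zeros(3)
    out_alpha = 0.0
    for s in samples:
        color, alpha = transfer_function(s)
        alpha = 1.0 - (1.0 - alpha) ** step        # opacity correction for step size
        out_color += (1.0 - out_alpha) * alpha * color
        out_alpha += (1.0 - out_alpha) * alpha
        if out_alpha > 0.99:                       # early ray termination
            break
    return out_color, out_alpha

if __name__ == "__main__":
    print(composite_ray(np.linspace(0.0, 1.0, 64)))
```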
Extending the virtual trackball metaphor to rear touch input
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5444712
Sven G. Kratz, M. Rohs
Abstract: Interaction with 3D objects and scenes is becoming increasingly important on mobile devices. We explore 3D object rotation as a fundamental interaction task. We propose an extension of the virtual trackball metaphor, which is typically restricted to a half sphere and single-sided interaction, to actually use a full sphere. The extension is enabled by a hardware setup called the "iPhone Sandwich," which allows for simultaneous front-and-back touch input. This setup makes the rear part of the virtual trackball accessible for direct interaction and thus realizes the virtual trackball metaphor to its full extent. We conducted a user study that shows that a back-of-device virtual trackball is as effective as a front-of-device virtual trackball and that both outperform an implementation of tilt-based input.
(An illustrative sketch of the full-sphere trackball mapping follows below.)
Citations: 15
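A full-sphere virtual trackball can be sketched by mapping front touches to the front hemisphere and rear touches to the back hemisphere, then turning two successive sphere points into an axis-angle rotation. The code below is such a sketch under simple assumptions (touch coordinates normalized to [-1, 1], classic arcball-style projection); it is not the authors' implementation.

```python
import numpy as np

def touch_to_sphere(x, y, rear=False):
    """Map a touch point (x, y normalized to [-1, 1], screen-centered) onto
    the unit sphere. Front touches land on the front hemisphere (z >= 0),
    rear touches on the back hemisphere (z <= 0), which is what turns the
    usual half-sphere trackball into a full sphere."""
    d2 = x * x + y * y
    if d2 > 1.0:                                # outside the ball: clamp to the equator
        s = 1.0 / np.sqrt(d2)
        x, y, z = x * s, y * s, 0.0
    else:
        z = np.sqrt(1.0 - d2)
    return np.array([x, y, -z if rear else z])

def trackball_rotation(p0, p1):
    """Axis-angle rotation that carries sphere point p0 onto p1."""
    axis = np.cross(p0, p1)
    n = np.linalg.norm(axis)
    if n < 1e-9:
        return np.array([0.0, 0.0, 1.0]), 0.0   # no motion (or antipodal points)
    angle = np.arccos(np.clip(np.dot(p0, p1), -1.0, 1.0))
    return axis / n, angle

if __name__ == "__main__":
    p_front = touch_to_sphere(0.3, 0.1)             # finger on the screen
    p_rear = touch_to_sphere(0.3, 0.1, rear=True)   # finger on the device's back
    print(trackball_rotation(p_front, p_rear))
```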
An evaluation of menu properties and pointing techniques in a projection-based VR environment
2010 IEEE Symposium on 3D User Interfaces (3DUI) Pub Date: 2010-03-20 DOI: 10.1109/3DUI.2010.5444721
Kaushik Das, C. Borst
Abstract: We studied menu performance for a rear-projected VR system. Our experiment considered layout (pie vs. linear list), placement (fixed vs. contextual), and pointing method (ray vs. alternatives that we call PAM: pointer-attached-to-menu). We also discuss results with respect to breadth (number of menu items) and depth (top-level and child menus). Standard ray pointing was usually faster than PAM, especially in second-level (child) menus, but error rates were lower for PAM in some cases. Subjective ratings were higher for ray-casting. Pie menus performed better than list layouts. Contextual pop-up menus were faster than fixed-location menus.
(An illustrative sketch of pie-menu item selection follows below.)
Citations: 31
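Pie-menu selection, one of the layouts compared in this study, typically reduces to mapping the pointer's hit position on the menu plane to an angular slice. The sketch below shows that mapping; the dead-zone radius, slice ordering, and function name are assumptions for illustration, not details from the paper.

```python
import math

def pie_slice(hit_x, hit_y, n_items, dead_zone=0.1):
    """Map a pointer hit on the menu plane (menu-centered coordinates, x to
    the right, y up) to a pie-menu item index, or None inside the central
    dead zone. Item 0 starts at 12 o'clock and items proceed clockwise."""
    if math.hypot(hit_x, hit_y) < dead_zone:
        return None
    angle = math.atan2(hit_x, hit_y) % (2.0 * math.pi)   # 0 at 12 o'clock, clockwise
    return int(angle / (2.0 * math.pi / n_items)) % n_items

if __name__ == "__main__":
    print(pie_slice(0.7, 0.2, n_items=8))   # ~74 degrees clockwise from the top: item 1 of 8
```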