{"title":"Mobile devices as multi-DOF controllers","authors":"Nicholas Katzakis, M. Hori","doi":"10.1109/3DUI.2010.5444700","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444700","url":null,"abstract":"Conventional input devices such as the mouse and keyboard lack in intuitiveness when it comes to 3D manipulation tasks. In this paper, we explore the use of accelerometer and magnetometer equipped mobile phones as 3-DOF controllers in a 3D rotation task. We put the mobile phone up against the established standards, a mouse and a touch pen and compare their performance. Our preliminary evaluation indicates that for this type of task, with only 5 minutes of practice the mobile device is significantly faster than both the mouse and the touch pen.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122385772","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improving co-located collaboration with show-through techniques","authors":"F. Argelaguet, André Kunert, Alexander Kulik, B. Fröhlich","doi":"10.1109/3DUI.2010.5444719","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444719","url":null,"abstract":"Multi-user virtual reality systems enable natural interaction with shared virtual worlds. Users can talk to each other, gesture and point into the virtual scenery as if it were real. As in reality, referring to objects by pointing, results often in a situation whereon objects are occluded from the other users' viewpoints. While in reality this problem can only be solved by adapting the viewing position, specialized individual views of the shared virtual scene enable various other solutions. As one such solution we propose show-through techniques to make sure that the objects one is pointing to can be seen by others. We analyzed the influence of such augmented viewing techniques on the spatial understanding of the scene, the rapidity of mutual information exchange as well as the social behavior of users. The results of our user study revealed that show-through techniques support spatial understanding on a similar level as walking around to achieve a non-occluded view of specified objects. However, advantages in terms of comfort, user acceptance and compliance to social protocols could be shown, which suggest that virtual reality techniques can in fact be better than 3D reality.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128159092","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"jReality — interactive audiovisual applications across virtual environments","authors":"Peter Brinkmann, Charles G. Gunn, Steffen Weißmann","doi":"10.1109/3DUI.2010.5444708","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444708","url":null,"abstract":"jReality is a Java scene graph library for creating real-time interactive applications with 3D computer graphics and spatialized audio. Applications written for jReality will run unchanged on software and hardware platforms ranging from desktop machines with a single screen and stereo speakers to immersive virtual environments with motion tracking, multiple screens with 3D stereo projection, and multi-channel audio setups. In addition to euclidean geometry, jReality supports hyperbolic and elliptic geometry. jReality comes with a number of graphics rendering backends, ranging from pure software to hardware-accelerated to photorealistic. A distributed backend is available for cluster-based virtual environments. Audio backends range from a basic stereo renderer to a high-performance Ambisonics renderer for arbitrary 3D speaker configurations. jReality achieves device-independent user interaction through a layer of abstract input devices that are matched at runtime with available physical devices, so that a jReality application will work with keyboard and mouse in a desktop environment as well as with motion tracking in a virtual environment.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"169 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132243406","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards a handheld stereo projector system for viewing and interacting in virtual worlds","authors":"Andrew Miller, J. Laviola","doi":"10.1109/3DUI.2010.5444701","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444701","url":null,"abstract":"We present a proof-of-concept implementation of a handheld stereo projection display system for virtual worlds. We utilize a single pico projector coupled with a six DOF tracker to generate real-time stereo imagery that can be projected on walls or a projection screen.We discuss the iterative design of our display system, including three attempts at modifying our portable projector to produce stereo imagery, and the hardware and software tradeoff decisions made in our prototype.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":" 26","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133121666","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A multi-touch enabled human-transporter metaphor for virtual 3D traveling","authors":"Dimitar Valkov, Frank Steinicke, G. Bruder, K. Hinrichs","doi":"10.1109/3DUI.2010.5444715","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444715","url":null,"abstract":"In this tech-note we demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive 3D environments. Geographic Information Systems (GIS) are well suited as a complex testbed for evaluation of user interfaces based on multi-modal input. Recent developments in the area of interactive surfaces enable the construction of low-cost multi-touch displays and relatively inexpensive sensor technology to detect foot gestures, which allows to explore these input modalities for virtual reality environments. In this tech-note, we describe an intuitive 3D user interface metaphor and corresponding hardware, which combine multi-touch hand and foot gestures for interaction with spatial data.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126050899","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A tangible user interface using spatial Augmented Reality","authors":"L. Chan, H. Lau","doi":"10.1109/3DUI.2010.5444699","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444699","url":null,"abstract":"In this paper, we describe the novel implementation of a tangible user interface framework, namely the MagicPad, inspired by the concept of Spatial Augmented Reality. By using an Infrared pen with any flat surface, such as a paper pad that receives projected images from a projector, a user is able to perform a variety of interactive visualization and manipulation in the 3D space. Two implementations using the MagicPad framework are presented, which include the magic lenses like interface inside a CAVE-like system and a virtual book in an art installation.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"124 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121822250","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Walking up and down in immersive virtual worlds: Novel interactive techniques based on visual feedback","authors":"M. Marchal, A. Lécuyer, G. Cirio, L. Bonnet, Mathieu Emily","doi":"10.1109/3DUI.2010.5446238","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5446238","url":null,"abstract":"We introduce novel interactive techniques to simulate the sensation of walking up and down in immersive virtual worlds based on visual feedback. Our method consists in modifying the motion of the virtual subjective camera while the user is really walking in an immersive virtual environment. The modification of the virtual viewpoint is a function of the variations in the height of the virtual ground. Three effects are proposed: (1) a straightforward modification of the camera's height, (2) a modification of the camera's navigation velocity, (3) a modification of the camera's orientation. They were tested in an immersive virtual reality setup in which the user is really walking. A Desktop configuration where the user is seated and controls input devices was also tested and compared to the real walking configuration. Experimental results show that our visual techniques are very efficient for the simulation of two canonical shapes: bumps and holes located on the ground. Interestingly, a strong ¿orientation-height illusion¿ is found, as changes in pitch viewing orientation produce perception of height changes (although camera's height remains strictly the same in this case). Our visual effects could be applied in various virtual reality applications such as urban or architectural project reviews or training, as well as in videogames, in order to provide the sensation of walking on uneven grounds.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"292 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116528622","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A framework for volume segmentation and visualization using Augmented Reality","authors":"Takehiro Tawara, K. Ono","doi":"10.1109/3DUI.2010.5444707","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444707","url":null,"abstract":"We propose a two-handed direct manipulation system to achieve complex volume segmentation of CT/MRI data in Augmented Reality with a remote controller attached to a motion tracking cube. At the same time segmented data is displayed by direct volume rendering using a programmable GPU. Our system achieves visualization of real time modification of volume data with complex shading including transparency control by changing transfer functions, displaying any cross section, and rendering multi materials using a local illumination model. Our goal is to build a system that facilitates direct manipulation of volumetric CT/MRI data for segmentation in Augmented Reality. Volume segmentation is a challenging problem and segmented data has an important role for visualization and analysis.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128027916","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Extending the virtual trackball metaphor to rear touch input","authors":"Sven G. Kratz, M. Rohs","doi":"10.1109/3DUI.2010.5444712","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444712","url":null,"abstract":"Interaction with 3D objects and scenes is becoming increasingly important on mobile devices. We explore 3D object rotation as a fundamental interaction task. We propose an extension of the virtual trackball metaphor, which is typically restricted to a half sphere and single-sided interaction, to actually use a full sphere. The extension is enabled by a hardware setup called the “iPhone Sandwich,” which allows for simultaneous front-and-back touch input. This setup makes the rear part of the virtual trackball accessible for direct interaction and thus achieves the realization of the virtual trackball metaphor to its full extent. We conducted a user study that shows that a back-of-device virtual trackball is as effective as a front-of-device virtual trackball and that both outperform an implementation of tilt-based input.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122373674","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An evaluation of menu properties and pointing techniques in a projection-based VR environment","authors":"Kaushik Das, C. Borst","doi":"10.1109/3DUI.2010.5444721","DOIUrl":"https://doi.org/10.1109/3DUI.2010.5444721","url":null,"abstract":"We studied menu performance for a rear-projected VR system. Our experiment considered layout (pie vs. linear list), placement (fixed vs. contextual), and pointing method (ray vs. alternatives that we call PAM: pointer-attached-to-menu). We also discuss results with respect to breadth (number of menu items) and depth (top-level and child menus). Standard ray pointing was usually faster than PAM, especially in second-level (child) menus, but error rates were lower for PAM in some cases. Subjective ratings were higher for ray-casting. Pie menus performed better than list layouts. Contextual pop-up menus were faster than fixed location.","PeriodicalId":144717,"journal":{"name":"2010 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"108 12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126078408","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}