2008 IEEE Symposium on 3D User Interfaces: Latest Publications

Poster: Image-Based 3D Display with Motion Parallax using Face Tracking
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476617
T. Suenaga, Y. Tamai, Y. Kurita, Y. Matsumoto, T. Ogasawara
{"title":"Poster: Image-Based 3D Display with Motion Parallax using Face Tracking","authors":"T. Suenaga, Y. Tamai, Y. Kurita, Y. Matsumoto, T. Ogasawara","doi":"10.1109/3DUI.2008.4476617","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476617","url":null,"abstract":"We propose an image-based 3D display with motion parallax using face tracking. Multi-view images of target objects are recorded in advance by utilizing a camera mounted on a 6 DOF manipulator. The 3D viewpoint of the user is measured by using a real-time non-contact face measurement system. One of the multi-view images is projected according as the camera position corresponding to the viewpoint of the user. The user can obtain 3D information of the object through a standard monitor by moving his/her head. A prototype system is developed and the consistency of the proposed method is confirmed by comparing generated images with captured images at the same viewpoints.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116239759","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 7
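For illustration, the image-selection step described in the abstract above (choosing the pre-recorded view whose camera position best matches the tracked head position) could be sketched as a nearest-neighbour lookup. The function name and data layout below are hypothetical, not taken from the paper.

```python
import numpy as np

def nearest_view(eye_pos, camera_positions, images):
    """Pick the pre-recorded image whose recording camera position is closest
    to the user's tracked eye position (nearest-neighbour selection).

    eye_pos          -- (3,) array from the face tracker
    camera_positions -- (N, 3) array of recording positions on the manipulator path
    images           -- list of N pre-captured images, same order as camera_positions
    """
    dists = np.linalg.norm(camera_positions - np.asarray(eye_pos), axis=1)
    return images[int(np.argmin(dists))]

# Called once per frame with the latest face-tracking result, e.g.:
# frame = nearest_view(tracked_eye_position, cam_positions, captured_images)
```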
Elastic Control for Navigation Tasks on Pen-based Handheld Computers
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476597
M. Hachet, Alexander Kulik
{"title":"Elastic Control for Navigation Tasks on Pen-based Handheld Computers","authors":"M. Hachet, Alexander Kulik","doi":"10.1109/3DUI.2008.4476597","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476597","url":null,"abstract":"Isotonic pen and finger interfaces for handheld devices are very suitable for many interaction tasks (eg. pointing, drawing). However, they are not appropriate for rate controlled techniques, as required for other tasks such as navigation in 3D environments. In this paper, we investigate the influence of elastic feedback to enhance user performance in rate controlled interaction tasks. We conducted an experiment, which proves evidence that elastic feedback, given to input movements with the pen, provides better control for 3D travel tasks. Based on these findings, we designed several prototypes that illustrate the ease of applying various elastic control conditions to contemporary handheld computers with finger- or pen-based input capabilities.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"415 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124160242","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 13
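As a rough illustration of the rate control evaluated here, an elastic mapping turns pen displacement from the rest position into travel velocity rather than position. The transfer-function shape and all constants below are assumptions for the sketch, not values from the paper.

```python
def travel_velocity(displacement_mm, dead_zone=2.0, gain=0.8, exponent=1.5, v_max=50.0):
    """Rate control along one axis: map pen displacement (mm) from the elastic
    rest position to a signed travel velocity (scene units per second).
    All constants are illustrative assumptions."""
    magnitude = abs(displacement_mm) - dead_zone
    if magnitude <= 0.0:
        return 0.0                                        # inside the dead zone: no motion
    speed = min(gain * magnitude ** exponent, v_max)      # non-linear transfer, clamped
    return speed if displacement_mm > 0 else -speed
```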
Tech-note: Dynamic Dragging for Input of 3D Trajectories
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476591
Daniel F. Keefe, R. Zeleznik, D. Laidlaw
{"title":"Tech-note: Dynamic Dragging for Input of 3D Trajectories","authors":"Daniel F. Keefe, R. Zeleznik, D. Laidlaw","doi":"10.1109/3DUI.2008.4476591","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476591","url":null,"abstract":"We present Dynamic Dragging, a virtual reality (VR) technique for input of smooth 3D trajectories with varying curvature. Users \"drag\" a virtual pen behind a hand-held tracked stylus to sweep out curving 3D paths in the air. Previous explorations of dragging-style input have established its utility for producing controlled, smooth inputs relative to freehand alternatives. However, a limitation of previous techniques is the reliance on a fixed-length drag line, biasing input toward trajectories of a particular curvature range. Dynamic Dragging explores the design space of techniques utilizing an adaptive drag line that adjusts length dynamically based on the local properties of the input, such as curvature and drawing speed. Finding the right mapping from these local properties to drag line length proves to be critical and challenging. Three potential mappings have been explored, and results of informal evaluations are reported. Initial findings indicate that Dynamic Dragging makes input of many styles of 3D curves easier than traditional drag-style input, allowing drag techniques to approach the flexibility for varied input of more sophisticated and much harder to learn techniques, such as two-handed tape drawing.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127694757","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 29
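The adaptive drag line is the heart of the technique: its length should shrink where the path curves tightly and grow for fast, straight strokes. The paper compares three such mappings; the blend below is purely a hypothetical example of what one mapping could look like, not any of the mappings the authors studied.

```python
def drag_line_length(curvature, speed, l_min=2.0, l_max=30.0, k_curv=5.0, k_speed=0.1):
    """Hypothetical adaptive drag line length (cm).

    curvature -- local curvature of the drawn path (1/cm)
    speed     -- current drawing speed (cm/s)

    Shorter lines permit tight turns; longer lines smooth fast, straight strokes.
    Constants are illustrative, not taken from the paper.
    """
    target = l_max / (1.0 + k_curv * curvature) + k_speed * speed
    return max(l_min, min(l_max, target))
```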
Tech-note: 4Record - recording and controlling spatiotemporal paths in virtual environments
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476599
Mario Doulis, Marco Rietmann, J. Pflüger
{"title":"Tech-note: 4Record - recording and controlling spatiotemporal paths in virtual environments","authors":"Mario Doulis, Marco Rietmann, J. Pflüger","doi":"10.1109/3DUI.2008.4476599","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476599","url":null,"abstract":"4Record is a tool for creating and controlling spatiotemporal (4D) paths in a virtual environment (VE). It has been originally developed as a component of an application that uses 4D CAD models of buildings (with time as 4th dimension showing the building process progression) for the interactive visualization of construction processes. However its plug-in like design allows the integration in different applications. 4Record allows recording and replaying user navigation through space and trough building process time, removing and re-recording these paths at any desired point (in space and time), and to save or load them. We separated main 3D interaction in the VE from path recording and traveling through the building process time of the 4D model and developed one input device for each task. Using a dedicated input device similar to mobile devices like mp3 players or voice recorders as physical control for time travel and path recording, we want novice users to become quickly familiar with 4Record and to achieve a high ease of use of the whole interface.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"11 11","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134035227","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
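A minimal data structure for the record / replay / re-record behaviour described above might look like the sketch below; the sample fields and method names are assumptions rather than the tool's actual interface.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PathSample:
    t: float                                         # playback time along the path (s)
    position: Tuple[float, float, float]             # viewpoint position in the VE
    orientation: Tuple[float, float, float, float]   # viewpoint orientation (quaternion)
    model_time: float                                # 4th dimension: building-process time

@dataclass
class SpatioTemporalPath:
    samples: List[PathSample] = field(default_factory=list)

    def record(self, sample: PathSample) -> None:
        self.samples.append(sample)

    def truncate_at(self, t: float) -> None:
        """Drop everything after playback time t so the path can be re-recorded from there."""
        self.samples = [s for s in self.samples if s.t <= t]

    def sample_at(self, t: float) -> PathSample:
        """Replay: return the recorded sample closest to playback time t."""
        return min(self.samples, key=lambda s: abs(s.t - t))
```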
Tech Note: Digital Foam
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476588
Ross T. Smith, B. Thomas, W. Piekarski
{"title":"Tech Note: Digital Foam","authors":"Ross T. Smith, B. Thomas, W. Piekarski","doi":"10.1109/3DUI.2008.4476588","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476588","url":null,"abstract":"This paper presents a new input device called digital foam designed to support natural sculpting operations similar to those used when sculpting clay. We have constructed two prototypes to test the concept of using a conductive foam input device to create 3D geometries and perform sculpting operations. The novel contributions of this paper include the realization that conductive foam sensors are accurate enough to allow fine grained control of position sensing and can be used to build foam based input devices. We have designed a novel foam sensor array by combining both conductive and non-conductive foam to allow interference free sensor readings to be recorded. We also constructed two novel input devices, one flat input device with one hundred sensors, and a second spherical design with twenty one sensors, both allowing user interactions by touching or squeezing the foam surface. We present the design idea, foam sensor theory, two prototype designs, and the initial application ideas used to explore the possible uses of digital foam.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"205 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131478546","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 22
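Reading such a conductive-foam array essentially means sampling each sensor's resistance and converting it to a compression depth. The sketch below assumes a simple voltage-divider readout and a linear calibration; neither the circuit nor the constants come from the paper.

```python
def compression_depths(adc_values, adc_max=1023, r_fixed=10_000.0,
                       r_rest=50_000.0, r_full=5_000.0, depth_max=20.0):
    """Convert raw ADC readings (foam sensor on the low side of a voltage divider
    with a fixed resistor) into approximate compression depths in mm.
    All calibration constants are illustrative assumptions."""
    depths = []
    for v in adc_values:
        v = max(1, min(v, adc_max - 1))                  # avoid division by zero at the rails
        r_foam = r_fixed * v / (adc_max - v)             # divider ratio -> foam resistance
        ratio = (r_rest - r_foam) / (r_rest - r_full)    # 0 = uncompressed, 1 = fully squeezed
        depths.append(max(0.0, min(1.0, ratio)) * depth_max)
    return depths
```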
Withindows: A Framework for Transitional Desktop and Immersive User Interfaces
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476584
A. Hill, Andrew E. Johnson
{"title":"Withindows: A Framework for Transitional Desktop and Immersive User Interfaces","authors":"A. Hill, Andrew E. Johnson","doi":"10.1109/3DUI.2008.4476584","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476584","url":null,"abstract":"The uniqueness of 3D interaction is often used to justify levels of user fatigue that are significantly higher than those of desktop systems. Object manipulation and symbolic manipulation techniques based strictly on first person perspective are also generally less efficient than their desktop counterparts. Instead of considering the two environments as distinct, we have focused on the idea that desktop applications will likely need to transition smoothly into full immersion through intermediate states. The Withindows framework uses image-plane selection and through- the-lens techniques in an attempt to smooth the movement of both traditional and immersive applications across transitional states such as desktop stereo and multi-display setups. We propose using a virtual cursor in the dominant eye and a reinforcing cursor in the non-dominant eye to avoid ambiguity problems that have discouraged the use of image-plane selection in stereo. We show how image-plane selection resolves non-linear control-display relationships inherent in some approaches to desktop stereo. When combined with through-the-lens techniques, image-plane selection allows immersive viewpoint management and 2 1/2D object manipulation techniques analogous to those on the desktop. This approach resolves global search and scaling problems inherent in prior through-the-lens implementations. We describe extensions for 6 DOF input devices that do not supersede the default interaction method. We developed a single-authored virtual world builder as a proof of concept application of our framework. Our evaluations found alternate perspectives useful but our implementation of viewing windows proved fatiguing to some users.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122264703","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 11
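Image-plane selection, on which the framework builds, amounts to casting a ray from the dominant eye through the on-screen cursor (or tracked fingertip) and taking the first object it hits. A rough sketch under that reading follows; the intersect_ray helper on the scene objects is a hypothetical interface, not the paper's implementation.

```python
import numpy as np

def image_plane_pick(eye_pos, cursor_pos, scene_objects):
    """Cast a ray from the dominant eye through the cursor/fingertip position and
    return the nearest intersected object, or None if nothing is hit.
    Each object is assumed to expose intersect_ray(origin, direction) -> distance or None."""
    origin = np.asarray(eye_pos, dtype=float)
    direction = np.asarray(cursor_pos, dtype=float) - origin
    direction /= np.linalg.norm(direction)
    best_obj, best_t = None, np.inf
    for obj in scene_objects:
        t = obj.intersect_ray(origin, direction)   # hypothetical helper
        if t is not None and 0.0 < t < best_t:
            best_obj, best_t = obj, t
    return best_obj
```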
Poster: Toward an Interactive Box-shaped 3D Display: Study of the Requirements for Wide Field of View
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476615
R. Lopez-Gulliver, S. Yoshida, S. Yano, N. Inoue
{"title":"Poster: Toward an Interactive Box-shaped 3D Display: Study of the Requirements for Wide Field of View","authors":"R. Lopez-Gulliver, S. Yoshida, S. Yano, N. Inoue","doi":"10.1109/3DUI.2008.4476615","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476615","url":null,"abstract":"We propose a graspable box-shaped 3D display as a communication tool that allows multiple users to share and naturally interact with 3D images in face-to-face collaborative tasks. We envision an auto-stereoscopic 3D display featuring glasses- free and multi-viewpoint operation. Users should be able to view multiple faces in the box-shaped display simultaneously and from any direction. We employ the integral photography (IP) method for this purpose. In this paper, we first analyze the requirements for an IP lens allowing simultaneous multi-face viewing of 3D images in such a display. Consequently, we find that a mininum 120-degree field of view is necessary. Then, we design and prototype an IP lens that provides such a wide field of view. Visual inspection of the generated 3D images confirm the possibility of simultaneous multiple- face viewing with the proposed display.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128378393","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 5
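The 120-degree requirement can be read as a constraint on the viewing cone of each face's IP lens: a face is usable only while the viewing direction stays inside that cone. A hedged geometric check along those lines is sketched below; the half-angle convention and the function interface are assumptions, not the paper's analysis.

```python
import math
import numpy as np

def face_visible(view_dir_from_face, face_normal, lens_fov_deg=120.0):
    """True if a viewer along view_dir_from_face can see the image on this face.

    view_dir_from_face -- unit vector from the face centre towards the viewer
    face_normal        -- outward unit normal of the display face
    lens_fov_deg       -- full viewing angle of the IP lens (the paper finds 120 degrees necessary)
    """
    half_angle = math.radians(lens_fov_deg / 2.0)
    return float(np.dot(view_dir_from_face, face_normal)) >= math.cos(half_angle)
```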
Poster: ARLIST - an Augmented Reality Environment for Life Support Training
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476606
Fabrício Pretto, I. Manssour, Emerson Rodrigues da Silva, M. H. Lopes, M. Pinho
{"title":"Poster: ARLIST - an Augmented Reality Environment for Life Support Training","authors":"Fabrício Pretto, I. Manssour, Emerson Rodrigues da Silva, M. H. Lopes, M. Pinho","doi":"10.1109/3DUI.2008.4476606","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476606","url":null,"abstract":"The main goal of the ARLIST project is to qualify the traditional training environment currently used for Life Support training, introducing image and sound resources into the training manikins. Using these tools we can simulate some aspects such as facial expressions, skin color changes and scratches and skin injuries through image projection over the manikin body, and also play sounds like cries of pain or groans of an injured man.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123301975","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
Poster: 3-D Display Using Motion Parallax for Outdoor User Interface
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476619
K. Uehira, Masahiro Suzuki, Yoshiaki Kobayashi
{"title":"Poster: 3-D Display Using Motion Parallax for Outdoor User Interface","authors":"K. Uehira, Masahiro Suzuki, Yoshiaki Kobayashi","doi":"10.1109/3DUI.2008.4476619","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476619","url":null,"abstract":"This paper describes a 3-D display that can express differences between depths at extended distances of over a hundred meters for moving observers using motion parallax. The results of a subjective test demonstrated that the perceived depth of the image at long distances could be controlled using artificial motion parallax generated by changing the rate of expansion of the image size, irrespective of its real depth. As a result, we demonstrated its possibility for applications to car-navigation systems combined with augmented-reality technology.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116936956","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 2
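The cue described, steering perceived depth through the image's expansion rate, follows from simple perspective: a real object at distance d from an observer approaching at speed v grows on the retina by a factor d / (d - v*t). The sketch below reproduces that scale schedule; the function interface is an assumption, not the authors' implementation.

```python
def apparent_scale(virtual_depth_m, observer_speed_mps, elapsed_s, min_depth_m=1.0):
    """Scale factor for the displayed image at time elapsed_s so that its growth
    matches that of a real object at virtual_depth_m viewed by an observer
    approaching at observer_speed_mps. Pure perspective geometry; interface is illustrative."""
    remaining = max(virtual_depth_m - observer_speed_mps * elapsed_s, min_depth_m)
    return virtual_depth_m / remaining

# Example: an image meant to appear 150 m away, observer moving at 15 m/s.
# After 2 s it should be drawn 150 / (150 - 30) = 1.25 times its initial size.
```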
TubeMouse - A Two-Handed Input Device for Flexible Objects
2008 IEEE Symposium on 3D User Interfaces | Pub Date: 2008-03-08 | DOI: 10.1109/3DUI.2008.4476587
C. Geiger, O. Rattay
{"title":"TubeMouse - A Two-Handed Input Device for Flexible Objects","authors":"C. Geiger, O. Rattay","doi":"10.1109/3DUI.2008.4476587","DOIUrl":"https://doi.org/10.1109/3DUI.2008.4476587","url":null,"abstract":"We present the design and realization of a two-handed input device that is dedicated to the interactive simulation of flexible objects. A set of symmetrical and asymmetrical bimanual interactions techniques has been designed and implemented in a VR simulation system that is used for routing design of cables, hoses and wiring harnesses. A preliminary expert review study showed the benefits of our approach.","PeriodicalId":131574,"journal":{"name":"2008 IEEE Symposium on 3D User Interfaces","volume":"137 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121961528","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 4