Latest publications: 2008 IEEE Symposium on 3D User Interfaces

Navidget for Easy 3D Camera Positioning from 2D Inputs
2008 IEEE Symposium on 3D User Interfaces — Pub Date: 2008-03-08 — DOI: 10.1109/3DUI.2008.4476596
M. Hachet, Fabrice Decle, Sebastian Knödel, P. Guitton
Abstract: Navidget is a new interaction technique for camera positioning in 3D environments. It derives from point-of-interest (POI) approaches, in which the endpoint of a trajectory is selected for smooth camera motion. Unlike existing POI techniques, Navidget does not attempt to automatically estimate where and how the user wants to move; instead, it provides good feedback and control for fast, easy interactive camera positioning. Navidget can also be useful for distant inspection when used with a preview window. This new 3D user interface is based entirely on 2D inputs, making it appropriate for a wide variety of visualization systems, from small handheld devices to large interactive displays. A user study on a TabletPC shows that Navidget's usability is very good for both expert and novice users, and that the technique is more appropriate than conventional 3D viewer interfaces for some camera positioning tasks in 3D environments.
Citations: 90
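The abstract above describes selecting a trajectory endpoint and moving the camera there smoothly. A minimal sketch of such an eased camera path is shown below; the helper name and the smoothstep easing are illustrative assumptions, not Navidget's actual trajectory model.

```python
def smooth_camera_path(start, end, steps):
    """Ease-in/ease-out interpolation of a camera position toward a
    selected point of interest. Hypothetical sketch: the real system's
    trajectory generation is not specified in the abstract."""
    path = []
    for i in range(steps + 1):
        t = i / steps
        s = 3 * t * t - 2 * t * t * t  # smoothstep easing, 0 -> 1
        path.append(tuple(a + (b - a) * s for a, b in zip(start, end)))
    return path
```

Easing the parameter rather than interpolating linearly gives the camera a gentle start and stop, which is the usual reason POI techniques feel smooth.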
A Haptic Virtual Borescope for Visual Engine Inspection Training
2008 IEEE Symposium on 3D User Interfaces — Pub Date: 2008-03-08 — DOI: 10.1109/3DUI.2008.4476586
Deepak S. Vembar, A. Duchowski, Sajay Sadasivan, A. Gramopadhye
Abstract: A haptic virtual borescope is developed for aircraft engine inspection training, similar in spirit to the borescope trainers used in gas turbine maintenance training schools. Such devices consist of engine section mockups for use with a real borescope. Our approach instead simulates engine sections in virtual reality, replacing the need for physical mockups. We model the engine casing as a "black box" into which a simulated borescope tip is inserted (in practice, a real borescope provides tactile veridicality of the probe's braided sheath, but the camera at its tip is not used). The probe's translational movement is mapped to the virtual camera's. The graphical engine representation can conceivably generalize to any engine section that can be modeled graphically. Since the interior chamber of the "black box" casing is empty, the critical component of our simulator is correct borescope tip navigation, along with force feedback based on a mathematical model of collision detection of the tip in the computer-generated environment. Haptic response is thought to be a key component of the simulator, as it provides non-visual tactile awareness of the borescope tip within the engine under inspection and, more importantly, of its contact with engine surfaces. Our contribution is two-fold. First, we design a novel motor-powered clamp that provides a response when a collision of the camera is detected in virtual space. Second, we attempt to isolate the effect of the system's tactile response and provide an empirical evaluation of its utility. In line with previous results, our empirical analysis reveals a trend toward a performance (speed) benefit, but suggests that haptic feedback, while preferred over a solely visual interface, may be perceived as extraneous in a visually dominated discrimination task.
Citations: 8
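The abstract above couples collision detection of the probe tip with force feedback. A common way to render such contact is a penalty (spring) model; the sketch below assumes that approach, since the paper's actual collision model and motor-clamp drive law are not detailed in the abstract.

```python
def contact_force(penetration_depth, stiffness=200.0):
    """Penalty-based haptic response: a spring force proportional to how
    far the probe tip has penetrated a surface. Illustrative assumption;
    not the paper's stated model. Units: metres in, newtons out."""
    if penetration_depth <= 0.0:
        return 0.0  # tip is outside the surface: no contact, no force
    return stiffness * penetration_depth
```

A stiffer spring feels harder but can destabilize the haptic loop, which is why stiffness is usually tuned to the device's update rate.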
Poster: The NetEyes Collaborative, Augmented Reality, Digital Paper System
2008 IEEE Symposium on 3D User Interfaces — Pub Date: 2008-03-08 — DOI: 10.1109/3DUI.2008.4476609
David R. McGee, Xiao Huang, Paulo Barthelmess, Philip R. Cohen
Abstract: NetEyes is a system that allows remote and co-located partners to collaborate by annotating maps printed on digital paper. It combines natural language capabilities, in particular the interpretation of sketched symbols, with the display of three-dimensional representations of recognized objects, allowing users to jointly visualize a planned or evolving situation in detail. Visualization can take place either on a conventional monitor or through optical see-through, head-mounted displays. Seamless collaboration is promoted via the use of tangible paper maps, which make it possible for multiple parties sitting around a table to place annotations as they would using regular paper and pen. To account for the movements and rotations of the maps that are common during collaborative annotation sessions, a vision-based tracking component is used. This component recovers the location of the paper map in the real world, then scales and rotates the digitally displayed objects so that they stay aligned with the map.
Citations: 1
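The abstract above says the tracker recovers the map's pose and then scales and rotates the displayed objects to stay aligned with it. That re-registration step can be sketched as a 2D similarity transform; the function below is a hypothetical illustration, not the NetEyes implementation.

```python
import math

def align_overlay(points, map_angle, map_scale, map_origin):
    """Apply a 2D similarity transform (rotate, scale, translate) to
    overlay points so they stay registered with a tracked paper map.
    The pose parameters are assumed to come from the vision tracker."""
    c, s = math.cos(map_angle), math.sin(map_angle)
    out = []
    for x, y in points:
        xr = map_scale * (c * x - s * y) + map_origin[0]
        yr = map_scale * (s * x + c * y) + map_origin[1]
        out.append((xr, yr))
    return out
```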
Poster: Authoring Tool for Intuitive Editing of Avatar Pose Using a Virtual Puppet
2008 IEEE Symposium on 3D User Interfaces — Pub Date: 2008-03-08 — DOI: 10.1109/3DUI.2008.4476613
T. Serizawa, Y. Yanagida
Abstract: An intuitive method for creating an avatar's posture using a virtual puppet is proposed. With this method, the intuitive operation of a physical puppet is preserved while undesired interference, such as the mechanical limitations of the joints, is eliminated. For intuitive manipulation, we developed a method to define the movable range of a joint with multiple degrees of freedom, and implemented it in motion editor software. In addition, we incorporated a haptic device that works with the motion editor software so that the user can feel as if he or she were operating an actual puppet.
Citations: 3
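The abstract above defines a movable range for multi-DOF joints. The simplest version of that idea is independent per-axis clamping, sketched below; the paper's actual range definition may well be a more elaborate region, which the abstract leaves open.

```python
def clamp_joint(angles, limits):
    """Constrain a multi-DOF joint to its movable range by clamping each
    axis independently. Illustrative assumption only: a per-axis box is
    the crudest possible movable-range model."""
    return tuple(max(lo, min(hi, a)) for a, (lo, hi) in zip(angles, limits))
```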
Poster: Effects of Head Tracking and Stereo on Non-Isomorphic 3D Rotation
2008 IEEE Symposium on 3D User Interfaces — Pub Date: 2008-03-08 — DOI: 10.1109/3DUI.2008.4476614
J. Laviola, Andrew S. Forsberg, J. Huffman, Andrew Bragdon
Abstract: We present an experimental study that explores how head tracking and stereo affect user performance when rotating 3D virtual objects using isomorphic and non-isomorphic rotation techniques. Our experiment compares isomorphic with non-isomorphic rotation across four display modes (no head tracking/no stereo, head tracking/no stereo, no head tracking/stereo, and head tracking/stereo) and two angular error thresholds for task completion. Our results indicate that rotation error is significantly reduced when subjects perform the task using non-isomorphic 3D rotation with head tracking/stereo rather than with no head tracking/no stereo. In addition, subjects performed the rotation task with significantly less error with head tracking/stereo and with no head tracking/stereo than with no head tracking/no stereo, regardless of rotation technique. Subjects also rated stereo and non-isomorphic amplification as highly important in the 3D rotation task.
Citations: 7
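Non-isomorphic rotation, mentioned in the abstract above, amplifies the user's physical rotation so a small hand movement produces a larger virtual rotation. A minimal sketch of angle amplification in axis-angle form follows; the gain function and quaternion layout are assumptions, since the paper's exact amplification scheme is not given in the abstract.

```python
import math

def amplify_rotation(axis, angle, gain):
    """Non-isomorphic rotation sketch: scale the rotation angle by a
    constant gain and return the result as a unit quaternion (w, x, y, z).
    A gain of 1.0 reduces to ordinary (isomorphic) rotation."""
    theta = angle * gain
    n = math.sqrt(sum(a * a for a in axis))
    ux, uy, uz = (a / n for a in axis)  # normalize the rotation axis
    s = math.sin(theta / 2.0)
    return (math.cos(theta / 2.0), ux * s, uy * s, uz * s)
```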
3D Virtual Haptic Cone for Intuitive Vehicle Motion Control
2008 IEEE Symposium on 3D User Interfaces — DOI: 10.1109/vr.2008.4480792
B. Horan, Z. Najdovski, S. Nahavandi, E. Tunstel
Abstract: Haptic human-machine interfaces and interaction techniques have been shown to offer advantages over conventional approaches. This work introduces the 3D virtual haptic cone with the aim of improving human remote control of a vehicle's motion. The cone adds a third dimension to the haptic control surface relative to existing approaches, giving the human operator an intuitive method for issuing vehicle motion commands while simultaneously receiving real-time haptic information from the remote system. The approach has potential across many applications; as a case study, this work considers it in the context of mobile robot motion control. We evaluate how well the approach improves the operator's motion controllability and quantify the performance improvement.
Citations: 9
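The abstract above describes a cone-shaped haptic control surface for issuing motion commands. One plausible mapping, sketched below purely as an assumption (the paper describes its control surface only qualitatively), sends stylus displacement along the cone axis to forward speed and lateral deflection to turn rate.

```python
def cone_to_motion(x, z, cone_height=0.1, max_speed=1.0, max_turn=1.0):
    """Hypothetical stylus-to-vehicle mapping for a conical control
    surface: axial displacement z sets speed, lateral displacement x
    sets turn rate, both clamped to the device's normalized range."""
    def clamp(v):
        return max(-1.0, min(1.0, v))
    speed = max_speed * clamp(z / cone_height)  # push along axis -> speed
    turn = max_turn * clamp(x / cone_height)    # deflect sideways -> turn
    return speed, turn
```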