ThirdLight: Low-cost and High-speed 3D Interaction Using Photosensor Markers

Jaewon Kim, Gyuchull Han, Hwasup Lim, S. Izadi, A. Ghosh
DOI: 10.1145/3150165.3150169
Published in: Proceedings of the 14th European Conference on Visual Media Production (CVMP 2017), 2017-12-11
Citations: 2

Abstract

We present a low-cost 3D tracking system for virtual reality, gesture modeling, and robot manipulation applications that require fast and precise localization of headsets, data gloves, props, or controllers. Our system removes the need for cameras or projectors for sensing, instead using cheap LEDs and printed masks for illumination together with low-cost photosensitive markers. The illumination device transmits a spatiotemporal pattern as a series of binary Gray-code patterns. Multiple illumination devices can be combined to localize each marker in 3D at high speed (333 Hz). Compared with conventional techniques, our method offers advantages in accuracy, speed, cost, performance under ambient light, working range (1 m to 5 m), and robustness to noise. We compare against a state-of-the-art instrumented glove and vision-based systems to demonstrate the accuracy, scalability, and robustness of our approach. We propose a fast and accurate method for hand gesture modeling using an inverse kinematics approach with six photosensitive markers. We additionally propose a passive-marker system and demonstrate various interaction scenarios as practical applications.
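The core sensing idea, in which each illumination device sweeps a series of binary Gray-code mask patterns so that a photosensor can recover its own coordinate in the pattern simply by decoding the bit sequence it observes over successive frames, can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation; the function names and the 10-bit pattern width are assumptions.

```python
def gray_encode(n: int) -> int:
    """Binary-reflected Gray code: adjacent codes differ in exactly one bit,
    so a one-frame sensing error shifts the decoded cell by at most one."""
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    """Invert the Gray code by XOR-folding all right shifts of g."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Illustrative simulation (assumed parameters): the device displays NUM_BITS
# mask frames, most significant bit first. A photosensor sitting in pattern
# cell `cell` reads one binary value per frame; decoding the collected bits
# recovers its coordinate along one axis of the pattern.
NUM_BITS = 10  # assumed pattern width: 2**10 = 1024 cells

def sensor_reading(cell: int, frame: int) -> int:
    """Bit the photosensor observes during the given mask frame."""
    return (gray_encode(cell) >> (NUM_BITS - 1 - frame)) & 1

def localize(bits: list[int]) -> int:
    """Reassemble the observed bit sequence and decode the cell index."""
    g = 0
    for b in bits:
        g = (g << 1) | b
    return gray_decode(g)

cell = 613
bits = [sensor_reading(cell, f) for f in range(NUM_BITS)]
assert localize(bits) == cell
```

Combining the coordinates decoded from two or more illumination devices at known poses then yields a 3D position per marker by triangulation, which is what lets the system localize without any camera in the loop.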