Augmented reality instruction for object assembly based on markerless tracking

Li-Chen Wu, I-Chen Lin, Ming-Han Tsai
{"title":"Augmented reality instruction for object assembly based on markerless tracking","authors":"Li-Chen Wu, I-Chen Lin, Ming-Han Tsai","doi":"10.1145/2856400.2856416","DOIUrl":null,"url":null,"abstract":"Conventional object assembly instructions are usually written or illustrated in a paper manual. Users have to associate these static instructions with real objects in 3D space. In this paper, a novel augmented reality system is presented for a user to interact with objects and instructions. While most related methods pasted obvious markers onto objects for tracking and constrained their orientations or shapes, we adopt a markerless strategy for more intuitive interaction. Based on live information from an off-the-shelf RGB-D camera, the proposed tracking procedure identifies components in a scene, tracks their 3D positions and orientations, and evaluates whether there are combinations of components. According to the detected events and poses, our indication procedure then dynamically displays indication lines, circular arrows and other hints to guide a user to manipulate the components into correct poses. The experiment shows that the proposed system can robustly track the components and respond intuitive instructions at an interactive rate. Most of users in evaluation are interested and willing to use this novel technique for object assembly.","PeriodicalId":207863,"journal":{"name":"Proceedings of the 20th ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games","volume":"2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"35","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 20th ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2856400.2856416","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 35

Abstract

Conventional object assembly instructions are usually written or illustrated in a paper manual, so users have to associate these static instructions with real objects in 3D space. In this paper, a novel augmented reality system is presented that lets a user interact with objects and instructions. While most related methods paste obvious markers onto objects for tracking and constrain their orientations or shapes, we adopt a markerless strategy for more intuitive interaction. Based on live information from an off-the-shelf RGB-D camera, the proposed tracking procedure identifies components in a scene, tracks their 3D positions and orientations, and evaluates whether components have been combined. According to the detected events and poses, our indication procedure then dynamically displays indication lines, circular arrows, and other hints that guide a user to manipulate the components into the correct poses. Experiments show that the proposed system can robustly track the components and respond with intuitive instructions at an interactive rate. Most users in the evaluation were interested in and willing to use this novel technique for object assembly.
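
The abstract outlines a per-frame pipeline: segment components in the RGB-D input, estimate each component's 3D position and orientation, detect when components have been combined, and render indication lines or arrows toward the target pose. The paper does not publish source code, so the sketch below only illustrates the pose-checking and guidance logic of such a loop under stated assumptions: the function names (rigid_align, assembled, guidance), the Kabsch-style alignment over already-matched point correspondences, and the tolerance values are illustrative assumptions, not the authors' actual tracking method.

```python
"""Hypothetical sketch of a markerless assembly-guidance loop.

Not the paper's implementation: pose estimation here is a standard
Kabsch/Procrustes fit over correspondences assumed to come from an
upstream matching step between component models and the segmented
RGB-D point cloud.
"""
import numpy as np


def rigid_align(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t with R @ src + t ~= dst.

    src, dst: (N, 3) arrays of corresponding 3D points.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def assembled(pose_a, pose_b, target_offset, pos_tol=0.01, ang_tol_deg=10.0):
    """Check whether component B sits at the target offset relative to A."""
    R_a, t_a = pose_a
    R_b, t_b = pose_b
    R_rel = R_a.T @ R_b                       # B's rotation in A's frame
    t_rel = R_a.T @ (t_b - t_a)               # B's position in A's frame
    R_tgt, t_tgt = target_offset
    cos_ang = np.clip((np.trace(R_tgt.T @ R_rel) - 1.0) / 2.0, -1.0, 1.0)
    ang = np.degrees(np.arccos(cos_ang))
    return np.linalg.norm(t_rel - t_tgt) < pos_tol and ang < ang_tol_deg


def guidance(pose_a, pose_b, target_offset):
    """Return either a 'combined' event or a translation arrow for B."""
    if assembled(pose_a, pose_b, target_offset):
        return {"event": "combined"}
    R_a, t_a = pose_a
    _, t_b = pose_b
    _, t_tgt = target_offset
    goal_b = R_a @ t_tgt + t_a    # where B's origin should be, in camera frame
    return {"event": "move", "arrow_from": t_b, "arrow_to": goal_b}
```

In a complete system, the correspondences fed to rigid_align would be produced by matching each component's model against the live RGB-D point cloud, and the arrow endpoints returned by guidance would be projected into the camera image and drawn as the AR overlay.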