A real-time tracker for markerless augmented reality

Andrew I. Comport, É. Marchand, F. Chaumette
{"title":"用于无标记增强现实的实时跟踪器","authors":"Andrew I. Comport, É. Marchand, F. Chaumette","doi":"10.1109/ISMAR.2003.1240686","DOIUrl":null,"url":null,"abstract":"Augmented reality has now progressed to the point where real-time applications are required and being considered. At the same time it is important that synthetic elements are rendered and aligned in the scene in an accurate and visually acceptable way. In order to address these issues a real-time, robust and efficient 3D model-based tracking algorithm is proposed for a 'video see through' monocular vision system. The tracking of objects in the scene amounts to calculating the pose between the camera and the objects. Virtual objects can then be projected into the scene using the pose. Here, non-linear pose computation is formulated by means of a virtual visual servoing approach. In this context, the derivation of point-to-curve interaction matrices is given for different features including lines, circles, cylinders and spheres. A local moving edge tracker is used in order to provide real-time tracking of points normal to the object contours. A method is proposed for combining local position uncertainty and global pose uncertainty in an efficient and accurate way by propagating uncertainty. Robustness is obtained by integrating an M-estimator into the visual control law via an iteratively re-weighted least squares implementation. The method presented in this paper has been validated on several complex image sequences including outdoor environments. Results show the method to be robust to occlusion, changes in illumination and mistracking.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"65 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"237","resultStr":"{\"title\":\"A real-time tracker for markerless augmented reality\",\"authors\":\"Andrew I. Comport, É. Marchand, F. Chaumette\",\"doi\":\"10.1109/ISMAR.2003.1240686\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Augmented reality has now progressed to the point where real-time applications are required and being considered. At the same time it is important that synthetic elements are rendered and aligned in the scene in an accurate and visually acceptable way. In order to address these issues a real-time, robust and efficient 3D model-based tracking algorithm is proposed for a 'video see through' monocular vision system. The tracking of objects in the scene amounts to calculating the pose between the camera and the objects. Virtual objects can then be projected into the scene using the pose. Here, non-linear pose computation is formulated by means of a virtual visual servoing approach. In this context, the derivation of point-to-curve interaction matrices is given for different features including lines, circles, cylinders and spheres. A local moving edge tracker is used in order to provide real-time tracking of points normal to the object contours. A method is proposed for combining local position uncertainty and global pose uncertainty in an efficient and accurate way by propagating uncertainty. Robustness is obtained by integrating an M-estimator into the visual control law via an iteratively re-weighted least squares implementation. The method presented in this paper has been validated on several complex image sequences including outdoor environments. 
Results show the method to be robust to occlusion, changes in illumination and mistracking.\",\"PeriodicalId\":296266,\"journal\":{\"name\":\"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.\",\"volume\":\"65 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2003-10-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"237\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISMAR.2003.1240686\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMAR.2003.1240686","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 237

Abstract

Augmented reality has now progressed to the point where real-time applications are required and being considered. At the same time it is important that synthetic elements are rendered and aligned in the scene in an accurate and visually acceptable way. In order to address these issues a real-time, robust and efficient 3D model-based tracking algorithm is proposed for a 'video see through' monocular vision system. The tracking of objects in the scene amounts to calculating the pose between the camera and the objects. Virtual objects can then be projected into the scene using the pose. Here, non-linear pose computation is formulated by means of a virtual visual servoing approach. In this context, the derivation of point-to-curve interaction matrices is given for different features including lines, circles, cylinders and spheres. A local moving edge tracker is used in order to provide real-time tracking of points normal to the object contours. A method is proposed for combining local position uncertainty and global pose uncertainty in an efficient and accurate way by propagating uncertainty. Robustness is obtained by integrating an M-estimator into the visual control law via an iteratively re-weighted least squares implementation. The method presented in this paper has been validated on several complex image sequences including outdoor environments. Results show the method to be robust to occlusion, changes in illumination and mistracking.
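
To make the robust estimation step concrete, below is a minimal Python/NumPy sketch of one iteratively re-weighted least squares (IRLS) iteration of a virtual visual servoing update, of the general form v = -λ (D L)^+ D (s - s*), where L stacks the interaction matrices of the tracked features, s - s* collects the point-to-curve distances of the edge points, and D is a diagonal matrix of M-estimator weights. This is not the authors' implementation: the Tukey biweight function, the MAD scale estimate, the gain value and all function names are illustrative assumptions.

```python
import numpy as np

def tukey_weights(residuals, c=4.6851):
    # Median absolute deviation as a robust scale estimate (assumption:
    # the paper's exact scale estimator is not reproduced here).
    scale = 1.4826 * np.median(np.abs(residuals - np.median(residuals))) + 1e-12
    u = residuals / (c * scale)
    w = (1.0 - u ** 2) ** 2          # Tukey biweight
    w[np.abs(u) >= 1.0] = 0.0        # reject clear outliers entirely
    return w

def robust_vvs_step(s, s_star, L, gain=0.5):
    """One IRLS-weighted virtual visual servoing iteration:
    v = -gain * (D L)^+ D (s - s*)."""
    e = s - s_star                    # feature error, e.g. point-to-curve distances
    D = np.diag(tukey_weights(e))     # confidence weights from the M-estimator
    return -gain * np.linalg.pinv(D @ L) @ (D @ e)

# Toy usage: 20 edge-distance features, 6 pose parameters.
rng = np.random.default_rng(0)
L = rng.normal(size=(20, 6))
e_meas = rng.normal(size=20)
e_meas[:3] += 10.0                    # simulate a few gross outliers (e.g. occlusion)
v = robust_vvs_step(e_meas, np.zeros(20), L)
print(v)                              # 6-vector: translational and angular velocity
```

Down-weighting residuals through D is what allows occluded or mistracked edge points to be discarded before they corrupt the pose update, which is the source of the robustness reported in the abstract.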