Velocity-Based Correspondence in Stereokinetic Images

Cornilleau-Pérès V., Droulez J.
{"title":"Velocity-Based Correspondence in Stereokinetic Images","authors":"Cornilleauperes V.,&nbsp;Droulez J.","doi":"10.1006/ciun.1993.1034","DOIUrl":null,"url":null,"abstract":"<div><p>This paper explores the possibility of using the binocular optic flow as an input for the correspondence process between stereoscopic images. The main advantage of the stereocorrespondence from optic flow (SCOF) is that it does not require the use of any a priori hypothesis concerning the 3D object under analysis. In order to determine its performance relative to noisy data, we applied an algorithm of SCOF on different rigid surfaces undertaking various 3D motions. We found that when SCOF is possible it is rather robust to noise. Moreover, the study of its domain of optimal efficiency shows that SCOF is likely to cooperate well with static stereopsis or structure from motion algorithms, thereby strengthening the processing of dynamic stereo images. As far as human vision is concerned, our psychophysical results indicate that a SCOF process does not seem to be used in the perception of 3D structure. This could be accounted for by the poor contribution of convergence signals to the perception of absolute depth in human vision, which seems incompatible with the precise knowledge of the geometry of the viewing system required by the SCOF.</p></div>","PeriodicalId":100350,"journal":{"name":"CVGIP: Image Understanding","volume":"58 2","pages":"Pages 137-146"},"PeriodicalIF":0.0000,"publicationDate":"1993-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1006/ciun.1993.1034","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"CVGIP: Image Understanding","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S104996608371034X","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cited by: 8

Abstract

This paper explores the possibility of using binocular optic flow as an input to the correspondence process between stereoscopic images. The main advantage of stereocorrespondence from optic flow (SCOF) is that it requires no a priori hypothesis about the 3D object under analysis. To assess its robustness to noisy data, we applied a SCOF algorithm to different rigid surfaces undergoing various 3D motions. We found that, when SCOF is feasible, it is rather robust to noise. Moreover, the study of its domain of optimal efficiency shows that SCOF is likely to cooperate well with static stereopsis or structure-from-motion algorithms, thereby strengthening the processing of dynamic stereo images. As far as human vision is concerned, our psychophysical results indicate that a SCOF process does not seem to be used in the perception of 3D structure. This could be accounted for by the poor contribution of convergence signals to the perception of absolute depth in human vision, which seems incompatible with the precise knowledge of the viewing geometry that SCOF requires.
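To make the abstract's central idea concrete, the following is a minimal sketch of velocity-based correspondence, not the authors' algorithm (whose details are in the paper): given feature points and precomputed optic-flow vectors in a rectified stereo pair, each left-image feature is matched to the right-image candidate on approximately the same epipolar row whose flow vector is most similar. The function name, tolerances, and the rectified-camera assumption are illustrative additions, not taken from the paper.

```python
import numpy as np

def scof_match(left_pts, left_flow, right_pts, right_flow,
               row_tol=1.0, flow_tol=0.5):
    """Toy velocity-based stereo correspondence (illustrative only).

    left_pts, right_pts: (N, 2) and (M, 2) arrays of (x, y) feature
    positions. left_flow, right_flow: arrays of the optic-flow
    vectors measured at those features. Assumes rectified images,
    so candidate matches lie on (nearly) the same row; among the
    candidates, the one whose flow vector is closest to the left
    feature's flow wins. Returns (left_index, right_index or None)
    pairs.
    """
    matches = []
    for i, (p, f) in enumerate(zip(left_pts, left_flow)):
        # Epipolar constraint: candidates share (about) the same row.
        rows_ok = np.abs(right_pts[:, 1] - p[1]) <= row_tol
        if not rows_ok.any():
            matches.append((i, None))
            continue
        cand = np.flatnonzero(rows_ok)
        # Velocity constraint: the smallest flow-vector difference
        # wins, provided it falls within the (assumed) noise tolerance.
        d = np.linalg.norm(right_flow[cand] - f, axis=1)
        j = int(cand[np.argmin(d)])
        matches.append((i, j) if d.min() <= flow_tol else (i, None))
    return matches
```

Because the pairing relies only on measured image velocities, no prior 3D model of the object is needed, which is the advantage the abstract highlights; conversely, the epipolar bookkeeping illustrates why SCOF depends on precise knowledge of the viewing geometry.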
