Multi-step stereo matching algorithm based on image fragment and confidence propagation

Yingjiang Li, Yaping Deng, Yuzhong Zhong
{"title":"Multi-step stereo matching algorithm based on image fragment and confidence propagation","authors":"Yingjiang Li, Yaping Deng, Yuzhong Zhong","doi":"10.1109/CCISP55629.2022.9974317","DOIUrl":null,"url":null,"abstract":"This paper presents a multi-step stereo matching algorithm that can be applied to multiple scenes. To adapt to different application scenarios, the algorithm divides the stereo matching process into three steps: point, fragment, and plane. First, the texture points of an image are extracted and the stereo matching (point disparity) of these points is performed using the improved self-aware matching measure (SAMM) algorithm. Then, according to the edge information of the image, a smooth region is divided into different fragments in the horizontal direction. The disparity estimation of the smooth region (segment disparity) is obtained through the confidence propagation of disparity values of texture points in the fragments. Finally, based on the similarity of plane disparity, a disparity map is generated using the disparity refining algorithm (plane disparity), and a final high-precision disparity value is obtained. The experimental results show that the proposed algorithm has high operational efficiency and accurate disparity estimation. 
Moreover, the algorithm may be adapted for more application scenarios.","PeriodicalId":431851,"journal":{"name":"2022 7th International Conference on Communication, Image and Signal Processing (CCISP)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 7th International Conference on Communication, Image and Signal Processing (CCISP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCISP55629.2022.9974317","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

This paper presents a multi-step stereo matching algorithm that can be applied to multiple scenes. To adapt to different application scenarios, the algorithm divides the stereo matching process into three steps: point, fragment, and plane. First, the texture points of an image are extracted, and the disparity of these points (point disparity) is computed with an improved self-aware matching measure (SAMM) algorithm. Then, guided by the edge information of the image, each smooth region is divided into fragments along the horizontal direction, and the disparity of the smooth region (fragment disparity) is estimated by propagating the confidence of the disparity values of the texture points within each fragment. Finally, exploiting the similarity of disparities within a plane, a disparity-refining step (plane disparity) generates the disparity map and yields the final high-precision disparity values. Experimental results show that the proposed algorithm is computationally efficient and produces accurate disparity estimates. Moreover, the algorithm may be adapted for more application scenarios.
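The three-step pipeline described in the abstract can be sketched in code. Everything below is an illustrative assumption, not the paper's implementation: the improved SAMM matching measure is replaced by a plain SAD window cost, the confidence propagation is reduced to averaging texture-point disparities within a fragment, and the plane refinement is reduced to extending a column's consistent disparity to unassigned pixels. The function names (`point_disparity`, `fragment_disparity`, `plane_refine`) are hypothetical.

```python
import numpy as np

def point_disparity(left, right, points, max_disp, win=2):
    """Step 1 (point): disparity at texture points via a windowed cost.
    The paper uses an improved self-aware matching measure (SAMM);
    a plain SAD window stands in for it here."""
    disp = {}
    for (y, x) in points:
        best_d, best_c = 0, np.inf
        lw = left[y - win:y + win + 1, x - win:x + win + 1].astype(float)
        for d in range(min(max_disp, x - win) + 1):
            rw = right[y - win:y + win + 1,
                       x - d - win:x - d + win + 1].astype(float)
            c = np.abs(lw - rw).sum()  # SAD matching cost
            if c < best_c:
                best_d, best_c = d, c
        disp[(y, x)] = best_d
    return disp

def fragment_disparity(width, edge_cols, texture_disp):
    """Step 2 (fragment): split a scanline at edge columns and fill each
    smooth fragment from the texture-point disparities inside it
    (a plain mean stands in for the paper's confidence propagation)."""
    bounds = [0] + sorted(edge_cols) + [width]
    out = np.full(width, np.nan)
    for a, b in zip(bounds[:-1], bounds[1:]):
        ds = [d for px, d in texture_disp.items() if a <= px < b]
        if ds:
            out[a:b] = float(np.mean(ds))
    return out

def plane_refine(disp_map, tol=0.5):
    """Step 3 (plane): where the fragments in a column agree to within
    `tol`, treat them as one plane and extend that disparity to the
    column's unassigned pixels (a coarse stand-in for the paper's
    disparity-refining step)."""
    out = disp_map.copy()
    for xcol in range(out.shape[1]):
        col = out[:, xcol]
        valid = ~np.isnan(col)
        if not valid.any():
            continue
        m = np.nanmean(col)
        close = valid & (np.abs(col - m) <= tol)
        if close.any():
            col[np.isnan(col)] = col[close].mean()
    return out
```

On a synthetic rectified pair in which the right view is the left view shifted by two pixels, `point_disparity` recovers d = 2 at the texture points, `fragment_disparity` spreads that value across the enclosing fragment, and `plane_refine` fills the remaining gaps; real edge detection and confidence weighting would replace the fixed fragment bounds and plain means used here.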