Improved SIFT descriptor applied to stereo image matching

Luan Zeng, You Zhai, W. Xiong
DOI: 10.1117/12.2180667 (https://doi.org/10.1117/12.2180667)
Journal: Precision Engineering Measurements and Instrumentation
Publication date: 2015-03-06
Citations: 1

Abstract

The Scale Invariant Feature Transform (SIFT) has been shown to outperform other features in distinctiveness and robustness. However, it performs poorly on low-contrast image matching, and its matching results are sensitive to 3D viewpoint changes of the camera. To improve the performance of SIFT on low-contrast images and on images with large 3D viewpoint change, a new matching method based on an improved SIFT is proposed. First, an adaptive contrast threshold is computed for each initial key point in low-contrast image regions from the pixels in its 9×9 local neighborhood, and this threshold is then used to eliminate initial key points in those regions. Second, a new 48-dimensional SIFT descriptor is computed for each key point. Third, a hierarchical matching method based on the epipolar line and the difference between the key points' dominant orientations is presented. Experimental results show that the method greatly enhances the performance of SIFT on low-contrast image matching. Moreover, when it is applied to stereo image matching with the hierarchical matching method, both the number of correct matches and the matching efficiency are greatly improved.