Coarse-to-Fine Accurate Registration for Airborne SAR Images Using SAR-Fast and DSP-LATCH

IF 6.7 · CAS Tier 1 (Computer Science) · JCR Q1 (Physics and Astronomy)
Huai Yu, Wen Yang, Yan Liu
DOI: 10.2528/PIER18070801
Journal: Progress In Electromagnetics Research (PIER), Vol. 198, pp. 89-106
Published: 2018-01-01 (Journal Article)
Citations: 3

Abstract

Synthetic Aperture Radar (SAR) image registration aims to establish reliable correspondences among images of the same scene. Registering airborne SAR images is challenging due to the instability of airborne SAR systems and the lack of appropriate geo-reference data. Moreover, techniques for registering satellite-based SAR images that rely on rigorous SAR geocoding cannot be directly applied to airborne SAR images. To address this problem, we present a coarse-to-fine registration method for airborne SAR images that combines the SAR-FAST (Features from Accelerated Segment Test) feature detector with the DSP-LATCH (Domain-Size Pooling of Learned Arrangements of Three patCH) feature descriptor, relying only on the gray-level intensity of the SAR data. More precisely, we first apply SAR-FAST, an adaptation of FAST for analyzing SAR images, to detect corners with high accuracy and low computational complexity. To reduce the disturbance of speckle noise and to achieve efficient, discriminative feature description, we further propose an improved descriptor named DSP-LATCH, which combines the domain-size pooling scheme of DSP-SIFT (Scale-Invariant Feature Transform) with LATCH's idea of comparing triplets of patches rather than individual pixel values. Finally, we adopt a coarse-to-fine strategy for SAR image registration that employs binary feature matching followed by the Powell algorithm. Compared with existing feature-based SAR image registration methods, e.g., SIFT and its variants, our method yields more reliably matched feature points and achieves higher registration accuracy. Experimental results on different scenes of airborne SAR images demonstrate the superiority of the proposed method in terms of robustness and accuracy.
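The detection step can be illustrated with a minimal, hypothetical sketch of a FAST-style segment test adapted to SAR intensity data. Classic FAST flags a pixel as a corner when enough contiguous pixels on a 16-pixel ring are all brighter or all darker than the center by an additive threshold; because SAR speckle is multiplicative, the sketch below uses an intensity-ratio criterion instead. The threshold value, contiguity count, and ring layout here are illustrative assumptions, and SAR-FAST's exact test may differ.

```python
import numpy as np

# Standard 16-pixel Bresenham circle of radius 3, as (dx, dy) offsets.
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_sar_fast_corner(img, r, c, ratio_thresh=1.5, n_contig=9):
    """Segment test with a multiplicative (ratio) contrast criterion,
    which is better matched to speckle than an additive difference."""
    center = max(img[r, c], 1e-6)  # guard against division by zero
    ring = np.array([img[r + dy, c + dx] for dx, dy in CIRCLE], dtype=float)
    brighter = ring / center >= ratio_thresh
    darker = center / np.maximum(ring, 1e-6) >= ratio_thresh
    # A corner needs n_contig contiguous hits on the circular ring.
    for flags in (brighter, darker):
        wrapped = np.concatenate([flags, flags])  # handle wrap-around
        run = 0
        for f in wrapped:
            run = run + 1 if f else 0
            if run >= n_contig:
                return True
    return False
```

On a flat patch the test rejects the center, while a pixel whose ring is uniformly much darker (or brighter) in ratio terms is accepted.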
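The descriptor step can likewise be sketched. The following hypothetical example builds a LATCH-style binary descriptor: each bit compares the anchor patch's squared distance to two companion patches, and a simple majority vote over several patch sizes stands in for domain-size pooling. The random triplet offsets, descriptor length, and pooling rule are all assumptions for illustration (LATCH's arrangements are learned, and the paper's pooling scheme may differ).

```python
import numpy as np

rng = np.random.default_rng(0)
N_BITS, HALF = 64, 12  # descriptor length and sampling half-window (assumed)
# Each bit is defined by three patch-center offsets around the keypoint:
# (anchor, companion1, companion2). Random offsets stand in for learned ones.
TRIPLETS = rng.integers(-HALF + 2, HALF - 1, size=(N_BITS, 3, 2))

def patch(img, r, c, k):
    """(2k+1)x(2k+1) patch centered at (r, c)."""
    return img[r - k:r + k + 1, c - k:c + k + 1].astype(float)

def latch_bits(img, r, c, k):
    """One bit per triplet: is the anchor patch farther (in SSD) from
    companion 1 than from companion 2?"""
    bits = np.empty(N_BITS, dtype=bool)
    for i, (a, p1, p2) in enumerate(TRIPLETS):
        pa = patch(img, r + a[0], c + a[1], k)
        d1 = np.sum((pa - patch(img, r + p1[0], c + p1[1], k)) ** 2)
        d2 = np.sum((pa - patch(img, r + p2[0], c + p2[1], k)) ** 2)
        bits[i] = d1 > d2
    return bits

def dsp_latch(img, r, c, sizes=(1, 2, 3)):
    """Domain-size pooling sketch: majority vote of each bit over several
    patch sizes, yielding one pooled binary descriptor."""
    votes = np.stack([latch_bits(img, r, c, k) for k in sizes])
    return votes.mean(axis=0) >= 0.5

def hamming(d1, d2):
    """Hamming distance used for binary feature matching."""
    return int(np.count_nonzero(d1 != d2))
```

Matching then reduces to nearest-neighbor search under `hamming`, which is what makes the coarse matching stage fast compared with floating-point SIFT descriptors.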
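Finally, the fine stage can be sketched with SciPy's Powell optimizer, a derivative-free method well suited to image-similarity costs. This hypothetical example refines a translation-only transform by minimizing mean squared error between the reference and the warped moving image; the paper may use a richer transform model and a different similarity measure.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.ndimage import shift as nd_shift

def register_fine(ref, mov, x0=(0.0, 0.0)):
    """Refine a (row, col) translation with Powell's method by minimizing
    the mean squared intensity error between ref and the shifted mov."""
    def cost(t):
        warped = nd_shift(mov, t, order=1, mode='nearest')
        return float(np.mean((ref - warped) ** 2))
    res = minimize(cost, x0, method='Powell')
    return res.x  # estimated (row, col) translation
```

Starting Powell from the coarse estimate produced by binary matching keeps the derivative-free search local and cheap.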
Source Journal
CiteScore: 7.20
Self-citation rate: 3.00%
Review time: 1.3 months
Journal Description: Progress In Electromagnetics Research (PIER) publishes peer-reviewed original and comprehensive articles on all aspects of electromagnetic theory and applications. This is an open-access, online journal (E-ISSN 1559-8985). It was first published as a monograph series on Electromagnetic Waves (ISSN 1070-4698) in 1989. It is freely available to all readers via the Internet.