HR-SAR-Net: A Deep Neural Network for Urban Scene Segmentation from High-Resolution SAR Data

Xiaying Wang, L. Cavigelli, M. Eggimann, M. Magno, L. Benini
{"title":"HR-SAR-Net: A Deep Neural Network for Urban Scene Segmentation from High-Resolution SAR Data","authors":"Xiaying Wang, L. Cavigelli, M. Eggimann, M. Magno, L. Benini","doi":"10.1109/SAS48726.2020.9220068","DOIUrl":null,"url":null,"abstract":"Synthetic aperture radar (SAR) data is becoming increasingly available to a wide range of users through commercial service providers with resolutions reaching 0.5 m/px. Segmenting SAR data still requires skilled personnel, limiting the potential for large-scale use. We show that it is possible to automatically and reliably perform urban scene segmentation from next-gen resolution SAR data (0.15 m/px) using deep neural networks (DNNs), achieving a pixel accuracy of 95.19% and a mean intersection-over-union (mIoU) of 74.67% with data collected over a region of merely 2.2km2. The presented DNN is not only effective, but is very small with only 63k parameters and computationally simple enough to achieve a throughput of around 500 Mpx/s using a single GPU. We further identify that additional SAR receive antennas and data from multiple flights massively improve the segmentation accuracy. We describe a procedure for generating a high-quality segmentation ground truth from multiple inaccurate building and road annotations, which has been crucial to achieving these segmentation results.","PeriodicalId":223737,"journal":{"name":"2020 IEEE Sensors Applications Symposium (SAS)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE Sensors Applications Symposium (SAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SAS48726.2020.9220068","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 16

Abstract

Synthetic aperture radar (SAR) data is becoming increasingly available to a wide range of users through commercial service providers with resolutions reaching 0.5 m/px. Segmenting SAR data still requires skilled personnel, limiting the potential for large-scale use. We show that it is possible to automatically and reliably perform urban scene segmentation from next-gen resolution SAR data (0.15 m/px) using deep neural networks (DNNs), achieving a pixel accuracy of 95.19% and a mean intersection-over-union (mIoU) of 74.67% with data collected over a region of merely 2.2 km². The presented DNN is not only effective, but is very small with only 63k parameters and computationally simple enough to achieve a throughput of around 500 Mpx/s using a single GPU. We further identify that additional SAR receive antennas and data from multiple flights massively improve the segmentation accuracy. We describe a procedure for generating a high-quality segmentation ground truth from multiple inaccurate building and road annotations, which has been crucial to achieving these segmentation results.
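
Pixel accuracy and mIoU are the two standard semantic-segmentation metrics quoted above. As a point of reference, here is a minimal NumPy sketch of how both are typically computed from a predicted and a ground-truth label map; the function name and the 4-class toy data are illustrative, not from the paper:

```python
import numpy as np

def segmentation_metrics(pred, target, num_classes):
    """Compute pixel accuracy and mean IoU from two integer label maps.

    pred, target: arrays of class indices with identical shape.
    """
    # Confusion matrix: rows = ground truth, columns = prediction.
    cm = np.bincount(
        target.reshape(-1) * num_classes + pred.reshape(-1),
        minlength=num_classes ** 2,
    ).reshape(num_classes, num_classes)

    pixel_acc = np.diag(cm).sum() / cm.sum()

    # Per-class IoU = TP / (TP + FP + FN); classes absent from both
    # maps are excluded from the mean to avoid division by zero.
    tp = np.diag(cm)
    union = cm.sum(axis=0) + cm.sum(axis=1) - tp
    iou = tp[union > 0] / union[union > 0]
    return pixel_acc, iou.mean()

# Example with random labels for a 4-class problem.
rng = np.random.default_rng(0)
pred = rng.integers(0, 4, size=(64, 64))
target = rng.integers(0, 4, size=(64, 64))
acc, miou = segmentation_metrics(pred, target, num_classes=4)
print(f"pixel accuracy: {acc:.4f}, mIoU: {miou:.4f}")
```

Both metrics derive from the same confusion matrix; mIoU is the stricter of the two because it penalizes false positives and false negatives per class rather than globally, which is why it sits well below the pixel accuracy in the results above.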
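
The abstract does not detail the ground-truth fusion procedure. For orientation, the simplest common baseline for merging several inaccurate annotation layers is a per-pixel majority vote over rasterized label maps; the sketch below (function name, ignore_index convention, and toy masks are all assumptions) illustrates that generic baseline, not the authors' actual method:

```python
import numpy as np

def majority_vote_labels(annotations, ignore_index=255):
    """Fuse several imperfect label maps by per-pixel majority vote.

    annotations: list of integer label maps with identical shape.
    Pixels without a strict majority get ignore_index so a training
    loss can mask them out.
    """
    stack = np.stack(annotations)                      # (n, H, W)
    n = stack.shape[0]
    num_classes = int(stack.max()) + 1
    # Vote histogram per pixel: shape (num_classes, H, W).
    votes = (stack[:, None] ==
             np.arange(num_classes)[None, :, None, None]).sum(axis=0)
    fused = votes.argmax(axis=0).astype(np.int64)
    # Require a strict majority; ties and ambiguous pixels are ignored.
    fused[votes.max(axis=0) <= n // 2] = ignore_index
    return fused

# Three noisy binary building masks disagreeing on some pixels.
a = np.array([[0, 1], [1, 1]])
b = np.array([[0, 1], [0, 1]])
c = np.array([[1, 1], [0, 0]])
print(majority_vote_labels([a, b, c]))
# [[0 1]
#  [0 1]]
```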