{"title":"A Novel Unsupervised Evaluation Metric for SAR Image Segmentation Results","authors":"Hang Yu, X. Yin, Zhiheng Liu, Zichuan Xie, Suiping Zhou, Yuru Guo","doi":"10.1109/ICGMRS55602.2022.9849399","DOIUrl":null,"url":null,"abstract":"The segmentation of Synthetic aperture radar (SAR) images is a critical step in remote sensing image analysis. Evaluating the performance of segmentation without ground truth data, i.e., unsupervised evaluation (UE) is essential for comparing segmentation algorithms and the automatic selection of optimal parameters. The ground truth used in the supervised evaluation (SE) metric is highly subjective, and the ground truth of SAR images is hard to obtain. The current UE metrics only depend on a single feature, and it fails for the segmentation results of SAR images containing multiple heterogeneous features. This study proposes a novel UE method to quantitatively measure the quality of SAR image segmentation results to overcome these problems. In this method, gray and texture features are captured firstly, and the two elements of each segment are fused to the covariance matrix of a segment. Secondly, using the covariance matrix calculates the intra-segment homogeneity and inter-segment heterogeneity of the segmentation results. Finally, a single metric combines these metrics, and a global criterion combines these single segment metrics to reveal the segmentation results quality. The method is tested on three segmentation algorithms and ten images. The proposed method is compared with existing UE methods and a SE method to confirm its capabilities. Through comparison, the results verified the effectiveness of the proposed metric and demonstrated the reliability and improvements of proposed method concerning other methods.","PeriodicalId":129909,"journal":{"name":"2022 3rd International Conference on Geology, Mapping and Remote Sensing (ICGMRS)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 3rd International Conference on Geology, Mapping and Remote Sensing (ICGMRS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICGMRS55602.2022.9849399","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
The segmentation of synthetic aperture radar (SAR) images is a critical step in remote sensing image analysis. Evaluating segmentation performance without ground truth data, i.e., unsupervised evaluation (UE), is essential for comparing segmentation algorithms and for the automatic selection of optimal parameters. The ground truth used in supervised evaluation (SE) metrics is highly subjective, and ground truth for SAR images is hard to obtain. Current UE metrics depend on only a single feature, so they fail on segmentation results of SAR images that contain multiple heterogeneous features. To overcome these problems, this study proposes a novel UE method that quantitatively measures the quality of SAR image segmentation results. In this method, gray and texture features are first extracted and fused into a covariance matrix for each segment. The covariance matrices are then used to calculate the intra-segment homogeneity and inter-segment heterogeneity of the segmentation result. Finally, these two measures are combined into a single per-segment metric, and a global criterion aggregates the per-segment metrics to reveal the overall quality of the segmentation result. The method is tested on three segmentation algorithms and ten images, and it is compared with existing UE methods and an SE method to confirm its capabilities. The comparison verifies the effectiveness of the proposed metric and demonstrates its reliability and improvement over other methods.
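The abstract does not give the authors' exact formulas, but a minimal sketch of such a covariance-based pipeline might look like the following. It assumes gradient magnitudes as a stand-in for the paper's unspecified texture features, a log-Euclidean distance between segment covariance matrices as the heterogeneity measure, and a simple ratio as the global criterion; the function names (`region_covariance`, `evaluate_segmentation`) and the combination rule are illustrative, not the authors' definitions.

```python
import numpy as np

def region_covariance(image, mask):
    """Covariance of per-pixel feature vectors [gray, |dI/dx|, |dI/dy|] in a segment.
    The paper fuses gray and texture features; gradients stand in for texture here."""
    gy, gx = np.gradient(image.astype(float))
    feats = np.stack([image[mask].astype(float),
                      np.abs(gx)[mask],
                      np.abs(gy)[mask]], axis=1)       # shape (n_pixels, 3)
    return np.cov(feats, rowvar=False)                 # shape (3, 3)

def log_euclidean_distance(c1, c2, eps=1e-6):
    """Log-Euclidean distance between two symmetric positive-definite matrices."""
    def logm_spd(c):
        w, v = np.linalg.eigh(c + eps * np.eye(c.shape[0]))
        return (v * np.log(w)) @ v.T
    return np.linalg.norm(logm_spd(c1) - logm_spd(c2), ord="fro")

def evaluate_segmentation(image, labels):
    """Global score: low intra-segment scatter (area-weighted covariance trace)
    and high inter-segment covariance distance. Higher is better under these
    assumptions; the paper's actual per-segment metric and aggregation differ."""
    segment_ids = np.unique(labels)
    covs = {s: region_covariance(image, labels == s) for s in segment_ids}
    n_pixels = labels.size
    intra = sum((labels == s).sum() / n_pixels * np.trace(covs[s])
                for s in segment_ids)
    pairs = [(a, b) for i, a in enumerate(segment_ids)
             for b in segment_ids[i + 1:]]
    if not pairs:                                      # degenerate: one segment
        return 0.0
    inter = np.mean([log_euclidean_distance(covs[a], covs[b]) for a, b in pairs])
    return inter / (1.0 + intra)

# Toy usage: a two-region synthetic "image" with different statistics.
rng = np.random.default_rng(0)
img = np.hstack([rng.normal(50, 5, (64, 32)), rng.normal(120, 20, (64, 32))])
lbl = np.hstack([np.zeros((64, 32), int), np.ones((64, 32), int)])
print(evaluate_segmentation(img, lbl))
```

Comparing all segment pairs keeps the sketch short; a faithful implementation would likely restrict the heterogeneity term to adjacent segments via a region adjacency graph, since distant segments say little about boundary quality.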