How can geostatistics help us understand deep learning? An exploratory study in SAR-based aircraft detection

IF 7.6 Q1 REMOTE SENSING
Lifu Chen, Zhenhuan Fang, Jin Xing, Xingmin Cai
DOI: 10.1016/j.jag.2024.104185
Journal: International journal of applied earth observation and geoinformation : ITC journal, Volume 134, Article 104185
Published: 2024-10-14 (Journal Article)
Citations: 0

Abstract

How can geostatistics help us understand deep learning? An exploratory study in SAR-based aircraft detection
Deep Neural Networks (DNNs) have garnered significant attention across various research domains due to their impressive performance; this is particularly true of Convolutional Neural Networks (CNNs), known for their exceptional accuracy in image processing tasks. However, the opaque nature of DNNs has raised concerns about their trustworthiness, as users often cannot understand how a model arrives at its predictions or decisions. This lack of transparency is particularly problematic in high-stakes fields such as healthcare, finance, and law. Consequently, there has been a surge in the development of explanation methods for DNNs. Typically, the effectiveness of these methods is assessed subjectively, through human inspection of the heatmaps or attribution maps generated by eXplainable AI (XAI) methods. In this paper, a novel GeoStatistics Explainable Artificial Intelligence (GSEAI) framework is proposed, which integrates spatial pattern analysis from geostatistics with XAI algorithms to assess and compare XAI understandability. Global and local Moran's I indices, commonly used to assess the spatial autocorrelation of geographic data, help characterize the spatial distribution patterns of the attribution maps produced by an XAI method by measuring their degree of aggregation or dispersion. Interpreting attribution maps through the Moran's I scatterplot and LISA clustering maps provides an accurate, objective, and quantitative global assessment of the spatial distribution of feature attribution, and yields a more understandable local interpretation. We conduct experiments on aircraft detection in SAR images based on the widely used YOLOv5 network and evaluate four mainstream XAI methods both quantitatively and qualitatively.
By using GSEAI to analyze explanations of a given DNN, we can gain more insight into the network's behavior and thereby enhance the trustworthiness of DNN applications. To the best of our knowledge, this is the first time XAI has been integrated with geostatistical algorithms and SAR domain knowledge, which expands the analytical approaches of XAI and also promotes the development of XAI within SAR image analytics.
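The global Moran's I statistic at the heart of the framework can be illustrated with a small self-contained sketch. This is a minimal NumPy implementation using rook (4-neighbour) contiguity weights over a 2D attribution grid; it is not the authors' actual pipeline, and the two toy 8×8 "attribution maps" are invented purely for illustration. Positive values indicate spatially aggregated attribution, values near zero a random pattern, and negative values dispersion.

```python
import numpy as np

def global_morans_i(grid: np.ndarray) -> float:
    """Global Moran's I on a 2D grid with rook (4-neighbour) contiguity.

    I = (N / W) * sum_ij w_ij * z_i * z_j / sum_i z_i^2,
    where z is the mean-deviated attribution value and w_ij = 1 for
    grid cells that share an edge (0 otherwise).
    """
    z = grid - grid.mean()   # deviations from the mean
    num = 0.0                # sum over ordered neighbour pairs of z_i * z_j
    w_sum = 0.0              # total weight W (number of ordered pairs)
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    num += z[r, c] * z[nr, nc]
                    w_sum += 1.0
    n = grid.size
    return (n / w_sum) * num / (z ** 2).sum()

# A clustered map (one bright block) vs. a checkerboard (maximally dispersed).
clustered = np.zeros((8, 8))
clustered[2:5, 2:5] = 1.0
checker = np.indices((8, 8)).sum(axis=0) % 2

print(global_morans_i(clustered))  # strongly positive: attribution is aggregated
print(global_morans_i(checker))    # strongly negative: attribution is dispersed
```

In GSEAI's setting, a trustworthy explanation of an aircraft detection would be expected to score high on such a statistic, with attribution clustered on the aircraft rather than scattered across SAR clutter.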
Journal
International journal of applied earth observation and geoinformation : ITC journal
Subject areas: Global and Planetary Change; Management, Monitoring, Policy and Law; Earth-Surface Processes; Computers in Earth Sciences
CiteScore: 12.00
Self-citation rate: 0.00%
Review time: 77 days
Aims and scope: The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes like capturing, databasing, visualization, interpretation, data quality, and spatial uncertainty.