Causal learning-driven semantic segmentation for robust coral health status identification

IF 12.2 · Tier 1, Earth Science · Q1 Geography, Physical
Jiangying Qin, Ming Li, Deren Li, Armin Gruen, Jianya Gong, Xuan Liao
{"title":"基于因果学习的语义分割稳健珊瑚健康状态识别","authors":"Jiangying Qin ,&nbsp;Ming Li ,&nbsp;Deren Li ,&nbsp;Armin Gruen ,&nbsp;Jianya Gong ,&nbsp;Xuan Liao","doi":"10.1016/j.isprsjprs.2025.08.009","DOIUrl":null,"url":null,"abstract":"<div><div>Global warming is accelerating the degradation of coral reef ecosystems, making accurate monitoring of coral reef health status crucial for their protection and restoration. Traditional coral reef remote sensing monitoring primarily relies on satellite or aerial observations, which provide broad spatial coverage but lack the fine-grained capability needed to capture the detailed structure and health status of individual coral colonies. In contrast, underwater photography utilizes close-range, high-resolution image-based observation, which can be considered a non-traditional form of remote sensing, to enable fine-grained assessment of corals with varying health status at pixel level. In this context, underwater image semantic segmentation plays a vital role by extracting discriminative visual features from complex underwater imaging scenes and enabling the automated classification and identification of different coral health status, based on expert-annotated labels. This semantic information can then be used to derive corresponding ecological indicators. While deep learning-based coral image segmentation methods have been proven effective for underwater coral remote sensing monitoring tasks, challenges remain regarding their generalization ability across diverse monitoring scenarios. These challenges stem from shifts in coral image data distributions and the inherent data-driven nature of deep learning models. In this study, we introduce causal learning into coral image segmentation for the first time and propose CDNet, a novel causal-driven semantic segmentation framework designed to robustly identify multiple coral health states — live, dead, and bleached — from imagery in complex and dynamic underwater environments. Specifically, we introduce a Causal Decorrelation Module to reduce spurious correlations within irrelevant features, ensuring that the network can focus on the intrinsic causal features of different coral health status. Additionally, an Enhanced Feature Aggregation Module is proposed to improve the model’s ability to capture multi-scale details and complex boundaries. Extensive experiments demonstrate that CDNet achieves consistently high segmentation performance, with an average mF1 score exceeding 60% across datasets from diverse temporal and spatial domains. Compare to state-of-the-art methods, its mIoU improves by 4.3% to 40%. Moreover, CDNet maintains accurate and consistent segmentation performance under simulated scenarios reflecting practical underwater coral remote sensing monitoring challenges (including internal geometric transformations, variations in external environments, and different contextual dependencies), as well as on diverse real-world underwater coral datasets. 
Our proposed method provides a reliable and scalable solution for accurate and rapid spatiotemporal monitoring of coral reefs, offering practical value for long-term conservation and climate resilience of coral reefs.</div></div>","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"229 ","pages":"Pages 78-91"},"PeriodicalIF":12.2000,"publicationDate":"2025-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Causal learning-driven semantic segmentation for robust coral health status identification\",\"authors\":\"Jiangying Qin ,&nbsp;Ming Li ,&nbsp;Deren Li ,&nbsp;Armin Gruen ,&nbsp;Jianya Gong ,&nbsp;Xuan Liao\",\"doi\":\"10.1016/j.isprsjprs.2025.08.009\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Global warming is accelerating the degradation of coral reef ecosystems, making accurate monitoring of coral reef health status crucial for their protection and restoration. Traditional coral reef remote sensing monitoring primarily relies on satellite or aerial observations, which provide broad spatial coverage but lack the fine-grained capability needed to capture the detailed structure and health status of individual coral colonies. In contrast, underwater photography utilizes close-range, high-resolution image-based observation, which can be considered a non-traditional form of remote sensing, to enable fine-grained assessment of corals with varying health status at pixel level. In this context, underwater image semantic segmentation plays a vital role by extracting discriminative visual features from complex underwater imaging scenes and enabling the automated classification and identification of different coral health status, based on expert-annotated labels. This semantic information can then be used to derive corresponding ecological indicators. While deep learning-based coral image segmentation methods have been proven effective for underwater coral remote sensing monitoring tasks, challenges remain regarding their generalization ability across diverse monitoring scenarios. These challenges stem from shifts in coral image data distributions and the inherent data-driven nature of deep learning models. In this study, we introduce causal learning into coral image segmentation for the first time and propose CDNet, a novel causal-driven semantic segmentation framework designed to robustly identify multiple coral health states — live, dead, and bleached — from imagery in complex and dynamic underwater environments. Specifically, we introduce a Causal Decorrelation Module to reduce spurious correlations within irrelevant features, ensuring that the network can focus on the intrinsic causal features of different coral health status. Additionally, an Enhanced Feature Aggregation Module is proposed to improve the model’s ability to capture multi-scale details and complex boundaries. Extensive experiments demonstrate that CDNet achieves consistently high segmentation performance, with an average mF1 score exceeding 60% across datasets from diverse temporal and spatial domains. Compare to state-of-the-art methods, its mIoU improves by 4.3% to 40%. 
Moreover, CDNet maintains accurate and consistent segmentation performance under simulated scenarios reflecting practical underwater coral remote sensing monitoring challenges (including internal geometric transformations, variations in external environments, and different contextual dependencies), as well as on diverse real-world underwater coral datasets. Our proposed method provides a reliable and scalable solution for accurate and rapid spatiotemporal monitoring of coral reefs, offering practical value for long-term conservation and climate resilience of coral reefs.</div></div>\",\"PeriodicalId\":50269,\"journal\":{\"name\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"volume\":\"229 \",\"pages\":\"Pages 78-91\"},\"PeriodicalIF\":12.2000,\"publicationDate\":\"2025-08-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0924271625003211\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"GEOGRAPHY, PHYSICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0924271625003211","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
Citations: 0

Abstract

Global warming is accelerating the degradation of coral reef ecosystems, making accurate monitoring of coral reef health status crucial for their protection and restoration. Traditional coral reef remote sensing monitoring primarily relies on satellite or aerial observations, which provide broad spatial coverage but lack the fine-grained capability needed to capture the detailed structure and health status of individual coral colonies. In contrast, underwater photography uses close-range, high-resolution image-based observation, which can be considered a non-traditional form of remote sensing, to enable fine-grained, pixel-level assessment of corals with varying health status. In this context, underwater image semantic segmentation plays a vital role by extracting discriminative visual features from complex underwater imaging scenes and enabling the automated classification and identification of different coral health statuses based on expert-annotated labels. This semantic information can then be used to derive corresponding ecological indicators. While deep learning-based coral image segmentation methods have proven effective for underwater coral remote sensing monitoring tasks, challenges remain regarding their generalization across diverse monitoring scenarios. These challenges stem from shifts in coral image data distributions and the inherently data-driven nature of deep learning models. In this study, we introduce causal learning into coral image segmentation for the first time and propose CDNet, a novel causal-driven semantic segmentation framework designed to robustly identify multiple coral health states (live, dead, and bleached) from imagery of complex and dynamic underwater environments. Specifically, we introduce a Causal Decorrelation Module to reduce spurious correlations within irrelevant features, ensuring that the network can focus on the intrinsic causal features of different coral health statuses. Additionally, an Enhanced Feature Aggregation Module is proposed to improve the model's ability to capture multi-scale details and complex boundaries. Extensive experiments demonstrate that CDNet achieves consistently high segmentation performance, with an average mF1 score exceeding 60% across datasets from diverse temporal and spatial domains. Compared to state-of-the-art methods, its mIoU improves by 4.3% to 40%. Moreover, CDNet maintains accurate and consistent segmentation performance under simulated scenarios reflecting practical underwater coral remote sensing monitoring challenges (including internal geometric transformations, variations in external environments, and different contextual dependencies), as well as on diverse real-world underwater coral datasets. Our proposed method provides a reliable and scalable solution for accurate and rapid spatiotemporal monitoring of coral reefs, offering practical value for the long-term conservation and climate resilience of coral reefs.
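The abstract describes a Causal Decorrelation Module that suppresses spurious correlations among irrelevant features but gives no implementation details. As a loose illustration of the general decorrelation idea used in stable/causal representation learning, the minimal PyTorch sketch below penalizes off-diagonal correlations between feature channels; the function name, tensor shapes, and loss weighting are assumptions for illustration, not CDNet's actual module.

```python
# Illustrative sketch (assumption, not CDNet's module): a generic penalty that
# discourages linear dependence between feature channels, one common way of
# approximating "spurious correlation reduction" in causal/stable learning.
import torch

def channel_decorrelation_loss(feat: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """feat: (B, C, H, W) feature map. Returns the mean squared off-diagonal
    channel correlation, averaged over the batch."""
    b, c, h, w = feat.shape
    x = feat.reshape(b, c, h * w)                     # treat pixels as samples
    x = x - x.mean(dim=2, keepdim=True)               # center each channel
    x = x / (x.std(dim=2, keepdim=True) + eps)        # unit variance per channel
    corr = torch.bmm(x, x.transpose(1, 2)) / (h * w)  # (B, C, C) correlation matrix
    off_diag = corr - torch.diag_embed(torch.diagonal(corr, dim1=1, dim2=2))
    return (off_diag ** 2).mean()

# Usage: add the penalty to the usual segmentation loss with a small weight.
feat = torch.randn(2, 8, 32, 32, requires_grad=True)
loss = channel_decorrelation_loss(feat)
loss.backward()
print(float(loss))
```

In training, a penalty of this kind would typically be added to the cross-entropy segmentation loss with a small weight, so it regularizes the representation without dominating the task objective.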
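The reported results are mean F1 (mF1) and mean intersection-over-union (mIoU) over the coral health classes. For reference, the sketch below shows one standard way these metrics are computed from pixel-wise label maps; the four-class set (background, live, dead, bleached) and the toy arrays are illustrative assumptions, not the paper's evaluation code.

```python
# Illustrative sketch: mIoU and mF1 for pixel-wise coral health segmentation.
# Class names follow the abstract; everything else is an assumption.
import numpy as np

CLASSES = ["background", "live", "dead", "bleached"]

def confusion_matrix(gt, pred, num_classes):
    """Accumulate a num_classes x num_classes confusion matrix from label maps."""
    mask = (gt >= 0) & (gt < num_classes)
    idx = num_classes * gt[mask].astype(int) + pred[mask].astype(int)
    return np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)

def miou_mf1(conf):
    """Per-class IoU/F1 from a confusion matrix, then unweighted means."""
    tp = np.diag(conf).astype(float)
    fp = conf.sum(axis=0) - tp   # predicted as class c but actually another class
    fn = conf.sum(axis=1) - tp   # actually class c but predicted as another class
    iou = tp / np.maximum(tp + fp + fn, 1e-9)
    f1 = 2 * tp / np.maximum(2 * tp + fp + fn, 1e-9)
    return iou.mean(), f1.mean(), dict(zip(CLASSES, zip(iou, f1)))

# Toy example: a 2x3 ground-truth mask and a prediction with one wrong pixel.
gt   = np.array([[0, 1, 1], [2, 3, 3]])
pred = np.array([[0, 1, 2], [2, 3, 3]])
miou, mf1, per_class = miou_mf1(confusion_matrix(gt, pred, len(CLASSES)))
print(f"mIoU={miou:.3f}, mF1={mf1:.3f}")
```

Per-class scores are averaged without class-frequency weighting, which is the usual convention behind mIoU and mF1 and matters when bleached or dead coral covers only a small fraction of the pixels.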
Source journal
ISPRS Journal of Photogrammetry and Remote Sensing (Engineering & Technology – Imaging Science & Photographic Technology)
CiteScore: 21.00
Self-citation rate: 6.30%
Articles per year: 273
Review time: 40 days
Journal description: The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) serves as the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It acts as a platform for scientists and professionals worldwide who are involved in the various disciplines that use photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate communication and dissemination of advancements in these disciplines, while also acting as a comprehensive source of reference and archive. P&RS endeavors to publish high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields. In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.