TCCFNet: a semantic segmentation method for mangrove remote sensing images based on two-channel cross-fusion networks

IF 2.8 · CAS Tier 2 (Biology) · JCR Q1 · Marine & Freshwater Biology
Lixiang Fu, Yaoru Wang, Shulei Wu, Jiasen Zhuang, Zhongqiang Wu, Jian Wu, Huandong Chen, Yukai Chen
{"title":"TCCFNet: a semantic segmentation method for mangrove remote sensing images based on two-channel cross-fusion networks","authors":"Lixiang Fu, Yaoru Wang, Shulei Wu, Jiasen Zhuang, Zhongqiang Wu, Jian Wu, Huandong Chen, Yukai Chen","doi":"10.3389/fmars.2025.1535917","DOIUrl":null,"url":null,"abstract":"Mangrove ecosystems play a crucial role in coastal environments. However, due to the complexity of mangrove distribution and the similarity among different categories in remote sensing images, traditional image segmentation methods struggle to accurately identify mangrove regions. Deep learning techniques, particularly those based on CNNs and Transformers, have demonstrated significant progress in remote sensing image analysis. This study proposes TCCFNet (Two-Channel Cross-Fusion Network) to enhance the accuracy and robustness of mangrove remote sensing image semantic segmentation. This study introduces a dual-backbone network architecture that combines ResNet for fine-grained local feature extraction and Swin Transformer for global context modeling. ResNet improves the identification of small targets, while Swin Transformer enhances the segmentation of large-scale features. Additionally, a Cross Integration Module (CIM) is incorporated to strengthen multi-scale feature fusion and enhance adaptability to complex scenarios. The dataset consists of 230 high-resolution remote sensing images, with 80% used for training and 20% for validation. The experimental setup employs the Adam optimizer with an initial learning rate of 0.0001 and a total of 450 training iterations, using cross-entropy loss for optimization. Experimental results demonstrate that TCCFNet outperforms existing methods in mangrove remote sensing image segmentation. Compared with state-of-the-art models such as MSFANet and DC-Swin, TCCFNet achieves superior performance with a Mean Intersection over Union (MIoU) of 88.34%, Pixel Accuracy (PA) of 97.35%, and F1-score of 93.55%. 
Particularly, the segmentation accuracy for mangrove categories reaches 99.04%. Furthermore, TCCFNet excels in distinguishing similar categories, handling complex backgrounds, and improving boundary detection. TCCFNet demonstrates outstanding performance in mangrove remote sensing image segmentation, primarily due to its dual-backbone design and CIM module. However, the model still has limitations in computational efficiency and small-target recognition. Future research could focus on developing lightweight Transformer architectures, optimizing data augmentation strategies, and expanding the dataset to diverse remote sensing scenarios to further enhance generalization capabilities. This study presents a novel mangrove remote sensing image segmentation approach—TCCFNet. By integrating ResNet and Swin Transformer with the Cross Integration Module (CIM), the model significantly improves segmentation accuracy, particularly in distinguishing complex categories and large-scale targets. TCCFNet serves as a valuable tool for mangrove remote sensing monitoring, providing more precise data support for ecological conservation efforts.","PeriodicalId":12479,"journal":{"name":"Frontiers in Marine Science","volume":"183 1","pages":""},"PeriodicalIF":2.8000,"publicationDate":"2025-04-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Marine Science","FirstCategoryId":"99","ListUrlMain":"https://doi.org/10.3389/fmars.2025.1535917","RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MARINE & FRESHWATER BIOLOGY","Score":null,"Total":0}
Citations: 0

Abstract

Mangrove ecosystems play a crucial role in coastal environments. However, due to the complexity of mangrove distribution and the similarity among different categories in remote sensing images, traditional image segmentation methods struggle to accurately identify mangrove regions. Deep learning techniques, particularly those based on CNNs and Transformers, have demonstrated significant progress in remote sensing image analysis. This study proposes TCCFNet (Two-Channel Cross-Fusion Network) to enhance the accuracy and robustness of mangrove remote sensing image semantic segmentation. This study introduces a dual-backbone network architecture that combines ResNet for fine-grained local feature extraction and Swin Transformer for global context modeling. ResNet improves the identification of small targets, while Swin Transformer enhances the segmentation of large-scale features. Additionally, a Cross Integration Module (CIM) is incorporated to strengthen multi-scale feature fusion and enhance adaptability to complex scenarios. The dataset consists of 230 high-resolution remote sensing images, with 80% used for training and 20% for validation. The experimental setup employs the Adam optimizer with an initial learning rate of 0.0001 and a total of 450 training iterations, using cross-entropy loss for optimization. Experimental results demonstrate that TCCFNet outperforms existing methods in mangrove remote sensing image segmentation. Compared with state-of-the-art models such as MSFANet and DC-Swin, TCCFNet achieves superior performance with a Mean Intersection over Union (MIoU) of 88.34%, Pixel Accuracy (PA) of 97.35%, and F1-score of 93.55%. In particular, the segmentation accuracy for mangrove categories reaches 99.04%. Furthermore, TCCFNet excels in distinguishing similar categories, handling complex backgrounds, and improving boundary detection.
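The dual-backbone design described above (a CNN branch for local detail, a Transformer branch for global context, joined by a Cross Integration Module) can be caricatured as two feature streams that mutually reweight each other before fusion. The sketch below is an illustrative toy only: the abstract does not specify the CIM internals, so the `cross_fuse` gating scheme here is an assumption, not the paper's actual module.

```python
# Toy illustration of two-channel cross fusion: each branch's features
# gate the other branch's features, and the gated streams are summed.
# This is a conceptual sketch, NOT the paper's actual CIM design.
import math

def sigmoid(x):
    """Logistic gate in [0, 1]."""
    return 1.0 / (1.0 + math.exp(-x))

def cross_fuse(cnn_feat, swin_feat):
    """Fuse two equal-length feature vectors by mutual gating.

    cnn_feat:  hypothetical local features from the CNN branch
    swin_feat: hypothetical global features from the Transformer branch
    """
    gated_cnn = [c * sigmoid(s) for c, s in zip(cnn_feat, swin_feat)]
    gated_swin = [s * sigmoid(c) for c, s in zip(cnn_feat, swin_feat)]
    return [a + b for a, b in zip(gated_cnn, gated_swin)]

# Usage: each branch contributes where the other is uncertain.
fused = cross_fuse([1.0, 0.0], [0.0, 1.0])  # → [0.5, 0.5]
```

The symmetric gating means neither branch dominates by construction; in the real network the gates would be learned, channel-wise tensors rather than scalar sigmoids.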
TCCFNet demonstrates outstanding performance in mangrove remote sensing image segmentation, primarily due to its dual-backbone design and CIM module. However, the model still has limitations in computational efficiency and small-target recognition. Future research could focus on developing lightweight Transformer architectures, optimizing data augmentation strategies, and expanding the dataset to diverse remote sensing scenarios to further enhance generalization capabilities. This study presents a novel mangrove remote sensing image segmentation approach—TCCFNet. By integrating ResNet and Swin Transformer with the Cross Integration Module (CIM), the model significantly improves segmentation accuracy, particularly in distinguishing complex categories and large-scale targets. TCCFNet serves as a valuable tool for mangrove remote sensing monitoring, providing more precise data support for ecological conservation efforts.
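The metrics reported in the abstract (MIoU, Pixel Accuracy, F1-score) are all derivable from a per-class confusion matrix. The sketch below uses the standard definitions of these metrics; the abstract does not spell out the exact averaging conventions used in the paper, so the unweighted class means here are an assumption.

```python
# Standard segmentation metrics from a confusion matrix: Pixel Accuracy,
# Mean IoU, and macro-averaged F1. Labels are flattened pixel arrays.
def confusion_matrix(y_true, y_pred, num_classes):
    """Accumulate a num_classes x num_classes confusion matrix."""
    cm = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        cm[t][p] += 1
    return cm

def pixel_accuracy(cm):
    """Fraction of pixels whose predicted class matches the label."""
    correct = sum(cm[i][i] for i in range(len(cm)))
    total = sum(sum(row) for row in cm)
    return correct / total

def mean_iou(cm):
    """Unweighted mean of per-class intersection-over-union."""
    n = len(cm)
    ious = []
    for i in range(n):
        tp = cm[i][i]
        fp = sum(cm[r][i] for r in range(n)) - tp
        fn = sum(cm[i]) - tp
        if tp + fp + fn:
            ious.append(tp / (tp + fp + fn))
    return sum(ious) / len(ious)

def mean_f1(cm):
    """Macro-averaged F1-score over classes."""
    n = len(cm)
    f1s = []
    for i in range(n):
        tp = cm[i][i]
        fp = sum(cm[r][i] for r in range(n)) - tp
        fn = sum(cm[i]) - tp
        if 2 * tp + fp + fn:
            f1s.append(2 * tp / (2 * tp + fp + fn))
    return sum(f1s) / len(f1s)

# Toy two-class example with flattened label maps (illustrative data).
y_true = [0, 0, 1, 1, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0]
cm = confusion_matrix(y_true, y_pred, 2)
```

On a real validation set, `y_true` and `y_pred` would be the flattened ground-truth and predicted class maps over all held-out images, with one row/column per land-cover category.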
Source journal: Frontiers in Marine Science (Agricultural and Biological Sciences - Aquatic Science)
CiteScore: 5.10
Self-citation rate: 16.20%
Annual article count: 2443
Review time: 14 weeks
Journal description: Frontiers in Marine Science publishes rigorously peer-reviewed research that advances our understanding of all aspects of the environment, biology, ecosystem functioning and human interactions with the oceans. Field Chief Editor Carlos M. Duarte at King Abdullah University of Science and Technology Thuwal is supported by an outstanding Editorial Board of international researchers. This multidisciplinary open-access journal is at the forefront of disseminating and communicating scientific knowledge and impactful discoveries to researchers, academics, policy makers and the public worldwide. With the human population predicted to reach 9 billion people by 2050, it is clear that traditional land resources will not suffice to meet the demand for food or energy, required to support high-quality livelihoods. As a result, the oceans are emerging as a source of untapped assets, with new innovative industries, such as aquaculture, marine biotechnology, marine energy and deep-sea mining growing rapidly under a new era characterized by rapid growth of a blue, ocean-based economy. The sustainability of the blue economy is closely dependent on our knowledge about how to mitigate the impacts of the multiple pressures on the ocean ecosystem associated with the increased scale and diversification of industry operations in the ocean and global human pressures on the environment. Therefore, Frontiers in Marine Science particularly welcomes the communication of research outcomes addressing ocean-based solutions for the emerging challenges, including improved forecasting and observational capacities, understanding biodiversity and ecosystem problems, locally and globally, effective management strategies to maintain ocean health, and an improved capacity to sustainably derive resources from the oceans.