Cross-layer context boundary guided network for crack segmentation

Impact Factor: 8.0 · CAS Region 1 (Engineering & Technology) · JCR Q1, CONSTRUCTION & BUILDING TECHNOLOGY
Hang Sun, Tianyu Zhang, Mei Yu, Shun Ren, Dong Wang
Journal: Construction and Building Materials, Volume 498, Article 143975
DOI: 10.1016/j.conbuildmat.2025.143975
Publication date: 2025-10-14
URL: https://www.sciencedirect.com/science/article/pii/S0950061825041261
Citations: 0

Abstract

Recently, Convolutional Neural Networks (CNNs) and Transformers have been extensively investigated for concrete crack segmentation, achieving remarkable performance. However, most CNN-Transformer-based crack segmentation methods overlook the exploration of contextual relationships between adjacent layers, which are critical for enhancing crack perception. Moreover, current algorithms fail to fully exploit the physical characteristics of cracks (e.g., geometric shape and boundary correlations), leading to reduced segmentation performance on low-contrast boundaries. To address these issues, we propose a Cross-Layer Context Boundary Guided Network (CCBG-Net) for Crack Segmentation. Specifically, a Bidirectional Cross-layer Context-aware (BCCA) module is introduced, which extracts multi-scale features from adjacent layers and performs bidirectional feature fusion with the current layer to obtain the contextual relationships of adjacent layers for enhanced crack feature representation, especially thin cracks. Furthermore, a Boundary-Object-Guided Interaction (BOGI) module is developed to decouple boundary information and guide crack features through global-channel interaction to optimize boundary contours, providing discrimination ability for crack boundaries. Experimental results on several challenging benchmark datasets demonstrate that our CCBG-Net outperforms state-of-the-art crack segmentation methods. The code is available at https://github.com/zty-acc/CCBG-Net.
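The bidirectional cross-layer fusion the abstract describes (resampling features from the adjacent shallower and deeper layers to the current layer's resolution, then fusing them) can be illustrated with a minimal NumPy sketch. This is an illustrative stand-in only, not the authors' BCCA module: the function names are hypothetical, and simple averaging replaces the learned multi-scale fusion; the actual implementation is in the linked repository.

```python
import numpy as np

def resize_nearest(x, h, w):
    """Nearest-neighbour resize of a (C, H, W) feature map to (C, h, w)."""
    _, H, W = x.shape
    rows = np.arange(h) * H // h
    cols = np.arange(w) * W // w
    return x[:, rows][:, :, cols]

def bidirectional_fuse(f_prev, f_cur, f_next):
    """Fuse the current layer's features with its shallower (higher-resolution)
    and deeper (lower-resolution) neighbours at the current resolution.
    Averaging stands in for the learned fusion of the real module."""
    _, h, w = f_cur.shape
    from_deep = resize_nearest(f_next, h, w)     # deeper layer upsampled
    from_shallow = resize_nearest(f_prev, h, w)  # shallower layer downsampled
    return (f_cur + from_deep + from_shallow) / 3.0

# Toy 3-level feature pyramid: 16x16 (shallow), 8x8 (current), 4x4 (deep).
rng = np.random.default_rng(0)
f_prev = rng.standard_normal((4, 16, 16))
f_cur = rng.standard_normal((4, 8, 8))
f_next = rng.standard_normal((4, 4, 4))

fused = bidirectional_fuse(f_prev, f_cur, f_next)
print(fused.shape)  # (4, 8, 8): same resolution as the current layer
```

The key design point carried over from the abstract is that context flows in both directions: fine spatial detail from the shallower layer (useful for thin cracks) and semantic context from the deeper layer are both aligned to the current layer before fusion.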
Source journal

Construction and Building Materials (Engineering & Technology — Materials Science: Multidisciplinary)

CiteScore: 13.80
Self-citation rate: 21.60%
Annual publications: 3632
Review time: 82 days
Journal description: Construction and Building Materials offers an international platform for sharing innovative and original research and development in the realm of construction and building materials, along with their practical applications in new projects and repair practices. The journal publishes a diverse array of pioneering research and application papers, detailing laboratory investigations and, to a limited extent, numerical analyses or reports on full-scale projects. Multi-part papers are discouraged. Additionally, Construction and Building Materials features comprehensive case studies and insightful review articles that contribute to new insights in the field. Our focus is on papers related to construction materials, excluding those on structural engineering, geotechnics, and unbound highway layers. Covered materials and technologies encompass cement, concrete reinforcement, bricks and mortars, additives, corrosion technology, ceramics, timber, steel, polymers, glass fibers, recycled materials, bamboo, rammed earth, non-conventional building materials, bituminous materials, and applications in railway materials.