Novel channel attention-based filter pruning methods for low-complexity semantic segmentation models

Impact Factor: 4.9
Md. Bipul Hossain, Na Gong, Mohamed Shaban
DOI: 10.1016/j.mlwa.2025.100725
Machine learning with applications, Volume 21, Article 100725 · Published 2025-08-16 · Journal Article
Citations: 0

Abstract

Semantic segmentation is the task of classifying each pixel in an image using a deep learning model. Widely used semantic segmentation models include U-Net and DeeplabV3+. While these models have been very successful in segmenting medical targets, including organs and diseases, in high-resolution images, their computational complexity is a burden for real-time application of the algorithms or deployment of the models on resource-constrained platforms. Until recently, few methods had been introduced for optimizing or pruning the parameters of semantic segmentation models. In this paper, we propose two novel channel attention-based filter pruning techniques, Sub-Sampling Channel Attention (SACA) and Self-Attention-Based Channel Attention (SBCA), that reduce the complexity of semantic segmentation models while maintaining high performance with respect to the benchmark models. This is realized by recognizing the contextual importance of the feature maps in each layer of the models and the significance of each filter to the final model performance. The proposed optimization methods have been validated on the U-Net and DeeplabV3+ models using both lung and skin lesion datasets. The proposed approaches achieved a pruned model performance (Dice coefficient) of up to 96%, as well as extensively reduced complexity (percentage of remaining parameters down to 1.1%, model size down to 1.22 MB, and GFLOPS down to 1.06), outperforming the benchmark magnitude-based (l1-norm and l2-norm) and attention-based (SE, ECA, and CBAM CA) filter pruning methods.
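The abstract does not detail the SACA/SBCA algorithms themselves, but the general idea of filter pruning — scoring each convolutional filter and discarding the lowest-ranked ones — can be illustrated with the l1-norm magnitude baseline the paper compares against. The following is a minimal numpy sketch of that baseline, not the paper's method; all function names here are illustrative:

```python
import numpy as np

def l1_filter_scores(weights):
    """Score each conv filter by the l1 norm of its weights.
    weights: array of shape (out_channels, in_channels, k, k)."""
    return np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)

def prune_filters(weights, keep_ratio):
    """Keep the top `keep_ratio` fraction of filters by l1 score.
    Returns the pruned weight tensor and the kept filter indices."""
    scores = l1_filter_scores(weights)
    n_keep = max(1, int(round(keep_ratio * len(scores))))
    kept = np.sort(np.argsort(scores)[::-1][:n_keep])  # indices of strongest filters
    return weights[kept], kept

# Toy example: 8 filters of a 3x3 conv over 4 input channels.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4, 3, 3))
pruned, kept = prune_filters(w, keep_ratio=0.5)
print(pruned.shape)  # (4, 4, 3, 3)
```

Attention-based pruning methods such as those proposed here replace the l1 score with an importance score derived from a learned channel-attention module, so that filters are ranked by their contextual contribution rather than raw weight magnitude.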
Source journal: Machine learning with applications
Subject areas: Management Science and Operations Research; Artificial Intelligence; Computer Science Applications
Self-citation rate: 0.00% · Review time: 98 days