[SG-UNet: a melanoma segmentation model enhanced with global attention and self-calibrated convolution].

Huanyu Ji, Rui Wang, Shengxiang Gao, Wengang Che
{"title":"[SG-UNet:一种利用全局关注和自校准卷积增强的黑色素瘤分割模型]。","authors":"Huanyu Ji, Rui Wang, Shengxiang Gao, Wengang Che","doi":"10.12122/j.issn.1673-4254.2025.06.21","DOIUrl":null,"url":null,"abstract":"<p><strong>Objectives: </strong>We propose a new melanoma segmentation model, SG-UNet, to enhance the precision of melanoma segmentation in dermascopy images to facilitate early melanoma detection.</p><p><strong>Methods: </strong>We utilized a U-shaped convolutional neural network, UNet, and made improvements to its backbone, skip connections, and downsampling pooling sections. In the backbone, with reference to the structure of VGG, we increased the number of convolutions from 10 to 13 in the downsampling part of UNet to achieve a deepened network hierarchy that allowed capture of more refined feature representations. To further enhance feature extraction and detail recognition, we replaced the traditional convolution the backbone section with self-calibrated convolution to enhance the model's ability to capture both spatial and channel dimensional features. In the pooling part, the original pooling layer was replaced by Haar wavelet downsampling to achieve more effective multi-scale feature fusion and reduce the spatial resolution of the feature map. The global attention mechanism was then incorporated into the skip connections at each layer to enhance the understanding of contextual information of the image.</p><p><strong>Results: </strong>The experimental results showed that the SG-UNet model achieved significantly improved segmentation accuracy on ISIC 2017 and ISIC 2018 datasets as compared with other current state-of-the-art segmentation models, with Dice reached 92.41% and 86.62% and IoU reaching 92.31% and 86.48% on the two datasets, respectively.</p><p><strong>Conclusions: </strong>The proposed model is capable of effective and accurate segmentation of melanoma from dermoscopy images.</p>","PeriodicalId":18962,"journal":{"name":"南方医科大学学报杂志","volume":"45 6","pages":"1317-1326"},"PeriodicalIF":0.0000,"publicationDate":"2025-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12204843/pdf/","citationCount":"0","resultStr":"{\"title\":\"[SG-UNet: a melanoma segmentation model enhanced with global attention and self-calibrated convolution].\",\"authors\":\"Huanyu Ji, Rui Wang, Shengxiang Gao, Wengang Che\",\"doi\":\"10.12122/j.issn.1673-4254.2025.06.21\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Objectives: </strong>We propose a new melanoma segmentation model, SG-UNet, to enhance the precision of melanoma segmentation in dermascopy images to facilitate early melanoma detection.</p><p><strong>Methods: </strong>We utilized a U-shaped convolutional neural network, UNet, and made improvements to its backbone, skip connections, and downsampling pooling sections. In the backbone, with reference to the structure of VGG, we increased the number of convolutions from 10 to 13 in the downsampling part of UNet to achieve a deepened network hierarchy that allowed capture of more refined feature representations. To further enhance feature extraction and detail recognition, we replaced the traditional convolution the backbone section with self-calibrated convolution to enhance the model's ability to capture both spatial and channel dimensional features. 
In the pooling part, the original pooling layer was replaced by Haar wavelet downsampling to achieve more effective multi-scale feature fusion and reduce the spatial resolution of the feature map. The global attention mechanism was then incorporated into the skip connections at each layer to enhance the understanding of contextual information of the image.</p><p><strong>Results: </strong>The experimental results showed that the SG-UNet model achieved significantly improved segmentation accuracy on ISIC 2017 and ISIC 2018 datasets as compared with other current state-of-the-art segmentation models, with Dice reached 92.41% and 86.62% and IoU reaching 92.31% and 86.48% on the two datasets, respectively.</p><p><strong>Conclusions: </strong>The proposed model is capable of effective and accurate segmentation of melanoma from dermoscopy images.</p>\",\"PeriodicalId\":18962,\"journal\":{\"name\":\"南方医科大学学报杂志\",\"volume\":\"45 6\",\"pages\":\"1317-1326\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-06-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12204843/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"南方医科大学学报杂志\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.12122/j.issn.1673-4254.2025.06.21\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"Medicine\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"南方医科大学学报杂志","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.12122/j.issn.1673-4254.2025.06.21","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"Medicine","Score":null,"Total":0}
Citations: 0

Abstract

Objectives: We propose a new melanoma segmentation model, SG-UNet, to enhance the precision of melanoma segmentation in dermoscopy images and thereby facilitate early melanoma detection.

Methods: We used the U-shaped convolutional neural network UNet and improved its backbone, skip connections, and downsampling (pooling) stages. In the backbone, following the structure of VGG, we increased the number of convolutions in the downsampling path of UNet from 10 to 13, deepening the network hierarchy so that it could capture more refined feature representations. To further enhance feature extraction and detail recognition, we replaced the traditional convolutions in the backbone with self-calibrated convolutions, strengthening the model's ability to capture features in both the spatial and channel dimensions. In the pooling stage, the original pooling layers were replaced with Haar wavelet downsampling to achieve more effective multi-scale feature fusion while reducing the spatial resolution of the feature maps. A global attention mechanism was then incorporated into the skip connections at each layer to improve the model's understanding of the image's contextual information.
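For illustration, the following is a minimal PyTorch sketch of a single-level Haar-wavelet downsampling module of the kind described in the Methods: the four Haar sub-bands are computed from each 2×2 block, concatenated along the channel axis, and fused back to the desired channel count. The module name, the 1×1 fusion convolution, and all parameter choices are assumptions made for this sketch, not the authors' implementation.

```python
# Minimal sketch of Haar-wavelet downsampling: a single-level 2D Haar
# transform halves the spatial resolution, and a 1x1 convolution fuses the
# four sub-bands back to the desired channel count (the fusion layer and
# module name are illustrative assumptions, not the authors' code).
import torch
import torch.nn as nn


class HaarWaveletDownsample(nn.Module):
    """Downsample by a single-level 2D Haar DWT, then fuse sub-bands with a 1x1 conv."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        # After the Haar transform the channel count grows 4x (one low-pass
        # sub-band plus three detail sub-bands).
        self.fuse = nn.Sequential(
            nn.Conv2d(4 * in_channels, out_channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Split each 2x2 block into its four pixels (H and W must be even).
        a = x[..., 0::2, 0::2]  # top-left
        b = x[..., 0::2, 1::2]  # top-right
        c = x[..., 1::2, 0::2]  # bottom-left
        d = x[..., 1::2, 1::2]  # bottom-right
        # Orthonormal single-level Haar sub-bands.
        ll = (a + b + c + d) / 2  # low-frequency approximation
        lh = (a + b - c - d) / 2  # detail: row differences
        hl = (a - b + c - d) / 2  # detail: column differences
        hh = (a - b - c + d) / 2  # detail: diagonal differences
        return self.fuse(torch.cat([ll, lh, hl, hh], dim=1))


if __name__ == "__main__":
    # Shape check: a 256x256 feature map is reduced to 128x128.
    x = torch.randn(1, 64, 256, 256)
    y = HaarWaveletDownsample(64, 128)(x)
    print(y.shape)  # torch.Size([1, 128, 128, 128])
```

Unlike max pooling, this transform halves the spatial resolution without discarding the high-frequency detail sub-bands, which is the property that supports the multi-scale feature fusion mentioned above.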

Results: The experimental results showed that SG-UNet achieved significantly higher segmentation accuracy on the ISIC 2017 and ISIC 2018 datasets than other current state-of-the-art segmentation models, with Dice scores of 92.41% and 86.62% and IoU scores of 92.31% and 86.48% on the two datasets, respectively.

Conclusions: The proposed model is capable of effective and accurate segmentation of melanoma from dermoscopy images.
