SIN-Seg: A joint spatial-spectral information fusion model for medical image segmentation

Impact Factor: 4.4 · CAS Zone 2 (Biology) · JCR Q2 · Biochemistry & Molecular Biology
Siyuan Dai, Kai Ye, Charlie Zhan, Haoteng Tang, Liang Zhan
DOI: 10.1016/j.csbj.2025.02.024
Journal: Computational and Structural Biotechnology Journal, Volume 27, Pages 744–752
Publication date: 2025-01-01 (Journal Article)
Citations: 0

Abstract

In recent years, the application of deep convolutional neural networks (DCNNs) to medical image segmentation has shown significant promise in computer-aided detection and diagnosis (CAD). Leveraging features from different spaces (i.e., Euclidean, non-Euclidean, and spectrum spaces) and multiple data modalities has the potential to increase the information available to a CAD system, enhancing both effectiveness and efficiency. However, directly acquiring data from different spaces across multiple modalities is often prohibitively expensive and time-consuming. Consequently, most current medical image segmentation techniques are confined to the spatial domain, i.e., to scanned images from MRI, CT, PET, etc. Here, we introduce a Joint Spatial-Spectral Information Fusion method that requires no additional data collection for CAD. We translate existing single-modality data into a new domain to extract features from an alternative space. Specifically, we apply the Discrete Cosine Transformation (DCT) to enter the spectrum domain, thereby accessing supplementary feature information from an alternate space. Recognizing that fusing information from different spaces typically necessitates complex alignment modules, we introduce a contrastive loss function that achieves feature alignment before synchronizing information across the different feature spaces. Our empirical results illustrate the effectiveness of our model in harnessing additional information from the spectrum-based space and affirm its superior performance against influential state-of-the-art segmentation baselines. The code is available at https://github.com/Auroradsy/SIN-Seg.
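The two ingredients the abstract describes, mapping a spatial image patch into the spectrum domain with a DCT and aligning the paired spatial/spectral features with a contrastive loss, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the InfoNCE-style form of the loss, and the temperature value are assumptions for illustration only (see the linked repository for the authors' actual code).

```python
import numpy as np
from scipy.fft import dctn, idctn

def to_spectral(patch):
    """Type-II 2D DCT with orthonormal scaling: one way to map a spatial
    image patch into the spectrum domain (the paper's exact preprocessing
    may differ)."""
    return dctn(patch, norm="ortho")

def contrastive_alignment_loss(spatial_feats, spectral_feats, temperature=0.1):
    """InfoNCE-style contrastive loss over paired (N, D) feature batches:
    the i-th spatial embedding should be most similar to its own spectral
    counterpart and dissimilar to the other N-1 (illustrative sketch, not
    the paper's exact loss)."""
    # L2-normalize so dot products are cosine similarities
    s = spatial_feats / np.linalg.norm(spatial_feats, axis=1, keepdims=True)
    f = spectral_feats / np.linalg.norm(spectral_feats, axis=1, keepdims=True)
    logits = s @ f.T / temperature               # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positive pairs sit on the diagonal (i-th spatial <-> i-th spectral)
    return -np.mean(np.diag(log_prob))
```

Because the orthonormal DCT is invertible (`idctn(..., norm="ortho")` recovers the patch exactly), the spectral branch adds a complementary view of the same data rather than replacing it, which is what makes the fusion possible without any extra data collection.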
Source journal
Computational and Structural Biotechnology Journal (Biochemistry, Genetics and Molecular Biology; Biophysics)
CiteScore: 9.30
Self-citation rate: 3.30%
Articles published per year: 540
Review time: 6 weeks
Journal overview: Computational and Structural Biotechnology Journal (CSBJ) is an online gold open access journal publishing research articles and reviews after full peer review. All articles are published, without barriers to access, immediately upon acceptance. The journal places a strong emphasis on functional and mechanistic understanding of how molecular components in a biological process work together through the application of computational methods. Structural data may provide such insights, but they are not a pre-requisite for publication in the journal. Specific areas of interest include, but are not limited to: structure and function of proteins, nucleic acids and other macromolecules; structure and function of multi-component complexes; protein folding, processing and degradation; enzymology; computational and structural studies of plant systems; microbial informatics; genomics; proteomics; metabolomics; algorithms and hypothesis in bioinformatics; mathematical and theoretical biology; computational chemistry and drug discovery; microscopy and molecular imaging; nanotechnology; systems and synthetic biology.