A Lightweight 3D Distillation Volumetric Transformer for 3D MRI Super-Resolution.

Impact Factor: 6.7 · CAS Tier 2 (Medicine) · JCR Q1 (Computer Science, Information Systems)
Jianwei Zhao, Tao Hong, Hao Qi, Zhenghua Zhou, Hai Wang
{"title":"A Lightweight 3D Distillation Volumetric Transformer for 3D MRI Super-Resolution.","authors":"Jianwei Zhao, Tao Hong, Hao Qi, Zhenghua Zhou, Hai Wang","doi":"10.1109/JBHI.2025.3555603","DOIUrl":null,"url":null,"abstract":"<p><p>Although existing 3D super-resolution methods for magnetic resonance imaging (MRI) volumetric data can provide better visual images than some traditional 2D methods, they should face challenge of increasing network's parameters and computing cost for getting higher reconstruction accuracy. To address this issue, a lightweight 3D multi scale distillation volumetric Transformer, named Transformer-based dual-attention feature distillation (TDAFD) network, is proposed for 3D MRI by utilizing 3D information hiding in images sufficiently. Our TDAFD network contains several proposed dual-attention feature distillation (DAFD) modules and two designed recursive volumetric Transformers (RVT). Concretely, the proposed DAFD module contains a multi-scale feature distillation (MSFD) block for extracting global features under different scales and a feature enhancement dual attention block (FEDAB) for concentrating on the key features better. In addition, our RVT develops 2D Transformer to 3D and save network's parameters via recursion operations for capturing long-term dependencies in volumetric images effectively. Therefore, our proposed TDAFD network can not only extract deeper features via multi scale feature distillation and Transformer, but also realize the balance of performances and network's parameters. Extensive experiments illustrate that our proposed method achieves superior reconstruction performances than some popular 3D MRI SR methods, and saves number of weights and FLOPs.</p>","PeriodicalId":13073,"journal":{"name":"IEEE Journal of Biomedical and Health Informatics","volume":"PP ","pages":""},"PeriodicalIF":6.7000,"publicationDate":"2025-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Biomedical and Health Informatics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1109/JBHI.2025.3555603","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Although existing 3D super-resolution (SR) methods for magnetic resonance imaging (MRI) volumetric data can produce better visual results than traditional 2D methods, they face the challenge of growing network parameters and computational cost as reconstruction accuracy increases. To address this issue, a lightweight 3D multi-scale distillation volumetric Transformer, named the Transformer-based dual-attention feature distillation (TDAFD) network, is proposed for 3D MRI SR; it fully exploits the 3D information hidden in the images. The TDAFD network consists of several proposed dual-attention feature distillation (DAFD) modules and two recursive volumetric Transformers (RVT). Concretely, each DAFD module contains a multi-scale feature distillation (MSFD) block that extracts global features at different scales and a feature enhancement dual attention block (FEDAB) that concentrates on the key features. In addition, the RVT extends the 2D Transformer to 3D and saves network parameters through recursion, effectively capturing long-range dependencies in volumetric images. As a result, the proposed TDAFD network not only extracts deeper features via multi-scale feature distillation and the Transformer, but also balances reconstruction performance against the number of network parameters. Extensive experiments show that the proposed method achieves better reconstruction performance than several popular 3D MRI SR methods while reducing the number of weights and FLOPs.
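The recursion mechanism behind the RVT can be illustrated with a short sketch: a single 3D self-attention block whose weights are reused over several recursion steps, so that the effective depth (and the receptive field over the volume) grows while the parameter count stays that of one block. The code below is a minimal, hypothetical PyTorch sketch of this idea only; the class name, hyper-parameters, and block layout are assumptions and do not reproduce the authors' actual TDAFD/RVT implementation.

```python
# Minimal sketch (not the authors' code) of a recursive volumetric Transformer:
# one shared 3D attention block reused for several recursion steps, so depth
# grows without adding parameters. Names and sizes are illustrative assumptions.
import torch
import torch.nn as nn


class RecursiveVolumetricTransformer(nn.Module):
    def __init__(self, channels=32, num_heads=4, recursions=3):
        super().__init__()
        self.recursions = recursions
        # A single attention/MLP block; reusing it `recursions` times keeps
        # only one block's worth of parameters.
        self.norm1 = nn.LayerNorm(channels)
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(channels)
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels * 2),
            nn.GELU(),
            nn.Linear(channels * 2, channels),
        )

    def forward(self, x):
        # x: (B, C, D, H, W) volumetric feature map
        b, c, d, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)  # (B, D*H*W, C): voxels as tokens
        for _ in range(self.recursions):       # shared weights across all steps
            q = self.norm1(tokens)
            a, _ = self.attn(q, q, q)
            tokens = tokens + a
            tokens = tokens + self.mlp(self.norm2(tokens))
        return tokens.transpose(1, 2).reshape(b, c, d, h, w)


if __name__ == "__main__":
    vol = torch.randn(1, 32, 8, 8, 8)                 # toy feature volume
    out = RecursiveVolumetricTransformer()(vol)
    print(out.shape)                                  # torch.Size([1, 32, 8, 8, 8])
```

A full model along the lines described in the abstract would also need the multi-scale feature distillation and dual-attention branches plus an upsampling head; the weight-sharing loop above only illustrates why the recursion keeps the Transformer part lightweight.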

Source journal
IEEE Journal of Biomedical and Health Informatics (Computer Science, Information Systems; Computer Science, Interdisciplinary Applications)
CiteScore: 13.60
Self-citation rate: 6.50%
Articles published: 1151
Journal description: IEEE Journal of Biomedical and Health Informatics publishes original papers presenting recent advances where information and communication technologies intersect with health, healthcare, life sciences, and biomedicine. Topics include acquisition, transmission, storage, retrieval, management, and analysis of biomedical and health information. The journal covers applications of information technologies in healthcare, patient monitoring, preventive care, early disease diagnosis, therapy discovery, and personalized treatment protocols. It explores electronic medical and health records, clinical information systems, decision support systems, medical and biological imaging informatics, wearable systems, body area/sensor networks, and more. Integration-related topics like interoperability, evidence-based medicine, and secure patient data are also addressed.