Multi-channel MRI reconstruction using cascaded Swinμ transformers with overlapped attention
Tahsin Rahman, Ali Bilgin, Sergio D Cabrera
Physics in Medicine and Biology, 70(7), published 2025-03-19. DOI: 10.1088/1361-6560/adb933
Citations: 0
Abstract
Objective. Deep neural networks have been shown to be very effective at artifact reduction tasks such as magnetic resonance imaging (MRI) reconstruction from undersampled k-space data. In recent years, attention-based vision transformer models have been shown to outperform purely convolutional models at a wide variety of tasks, including MRI reconstruction. Our objective is to investigate the use of different transformer architectures for multi-channel cascaded MRI reconstruction.

Approach. In this work, we explore the effective use of cascades of small transformers in multi-channel undersampled MRI reconstruction. We introduce overlapped attention and compare it to hybrid attention in shifted-window (Swin) transformers. We also investigate the impact of the number of Swin transformer layers in each architecture. The proposed methods are compared to state-of-the-art MRI reconstruction methods for undersampled reconstruction of standard 3T and low-field (0.3T) T1-weighted MRI images at multiple acceleration rates.

Main results. The models with overlapped attention achieve significantly higher or equivalent quantitative test metrics compared to state-of-the-art convolutional approaches. They also show more consistent reconstruction performance across different acceleration rates than their hybrid attention counterparts. We have also shown that transformer architectures with fewer layers can be as effective as those with more layers when used in cascaded MRI reconstruction problems.

Significance. The feasibility and effectiveness of cascades of small transformers with overlapped attention for MRI reconstruction are demonstrated without pre-training the transformer on ImageNet or other large-scale datasets.
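The abstract gives no implementation details, so the sketch below illustrates only the general problem setting the paper addresses, not the authors' method: retrospective Cartesian undersampling of an image's k-space at a given acceleration rate, and the aliased zero-filled reconstruction that a cascaded network would then refine. The sampling pattern (random columns plus a fully sampled low-frequency center), the parameters, and the synthetic phantom are all illustrative assumptions.

```python
import numpy as np

def undersample_kspace(image, acceleration=4, center_fraction=0.08, seed=0):
    """Retrospectively undersample the 2D k-space of an image.

    Generic illustration (not the paper's code): keep a fully sampled
    low-frequency center plus random phase-encode columns, so that
    roughly 1/acceleration of all columns are sampled.
    """
    rng = np.random.default_rng(seed)
    kspace = np.fft.fftshift(np.fft.fft2(image))
    ny, nx = kspace.shape
    mask = np.zeros(nx, dtype=bool)
    # Always keep the low-frequency center columns.
    n_center = max(1, int(center_fraction * nx))
    c0 = nx // 2 - n_center // 2
    mask[c0:c0 + n_center] = True
    # Randomly keep additional columns to reach ~nx/acceleration samples.
    n_total = nx // acceleration
    extra = rng.choice(np.flatnonzero(~mask),
                       size=max(0, n_total - n_center), replace=False)
    mask[extra] = True
    return kspace * mask[None, :], mask

def zero_filled_recon(masked_kspace):
    """Inverse FFT of the masked k-space: the aliased baseline image."""
    return np.abs(np.fft.ifft2(np.fft.ifftshift(masked_kspace)))

# Example on a synthetic 128x128 square phantom at 4x acceleration.
img = np.zeros((128, 128))
img[32:96, 32:96] = 1.0
masked_k, mask = undersample_kspace(img, acceleration=4)
recon = zero_filled_recon(masked_k)
```

The aliasing artifacts visible in `recon` relative to `img` are exactly what reconstruction networks, whether convolutional or transformer-based cascades as in this paper, are trained to remove.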
Journal overview:
The development and application of theoretical, computational and experimental physics to medicine, physiology and biology. Topics covered are: therapy physics (including ionizing and non-ionizing radiation); biomedical imaging (e.g. x-ray, magnetic resonance, ultrasound, optical and nuclear imaging); image-guided interventions; image reconstruction and analysis (including kinetic modelling); artificial intelligence in biomedical physics and analysis; nanoparticles in imaging and therapy; radiobiology; radiation protection and patient dose monitoring; radiation dosimetry.