Advancing Brain Tumor Segmentation in MRI Scans: Hybrid Attention-Residual UNET with Transformer Blocks

Sobha Xavier P, Sathish P K, Raju G
{"title":"推进磁共振成像扫描中的脑肿瘤分割:带有变压器块的混合注意力-惯性 UNET","authors":"Sobha Xavier P, Sathish P K, Raju G","doi":"10.3991/ijoe.v20i06.46979","DOIUrl":null,"url":null,"abstract":"Accurate segmentation of brain tumors is vital for effective treatment planning, disease diagnosis, and monitoring treatment outcomes. Post-surgical monitoring, particularly for recurring tumors, relies on MRI scans, presenting challenges in segmenting small residual tumors due to surgical artifacts. This emphasizes the need for a robust model with superior feature extraction capabilities for precise segmentation in both pre- and post-operative scenarios. The study introduces the Hybrid Attention-Residual UNET with Transformer Blocks (HART-UNet), enhancing the U-Net architecture with a spatial self-attention module, deep residual connections, and RESNET50 weights. Trained on BRATS’20 and validated on Kaggle LGG and BTC_ postop datasets, HART-UNet outperforms established models (UNET, Attention UNET, UNET++, and RESNET 50), achieving Dice Coefficients of 0.96, 0.97, and 0.88, respectively. These results underscore the model’s superior segmentation performance, marking a significant advancement in brain tumor analysis across pre- and post-operative MRI scans.","PeriodicalId":507997,"journal":{"name":"International Journal of Online and Biomedical Engineering (iJOE)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-04-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Advancing Brain Tumor Segmentation in MRI Scans: Hybrid Attention-Residual UNET with Transformer Blocks\",\"authors\":\"Sobha Xavier P, Sathish P K, Raju G\",\"doi\":\"10.3991/ijoe.v20i06.46979\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Accurate segmentation of brain tumors is vital for effective treatment planning, disease diagnosis, and monitoring treatment outcomes. Post-surgical monitoring, particularly for recurring tumors, relies on MRI scans, presenting challenges in segmenting small residual tumors due to surgical artifacts. This emphasizes the need for a robust model with superior feature extraction capabilities for precise segmentation in both pre- and post-operative scenarios. The study introduces the Hybrid Attention-Residual UNET with Transformer Blocks (HART-UNet), enhancing the U-Net architecture with a spatial self-attention module, deep residual connections, and RESNET50 weights. Trained on BRATS’20 and validated on Kaggle LGG and BTC_ postop datasets, HART-UNet outperforms established models (UNET, Attention UNET, UNET++, and RESNET 50), achieving Dice Coefficients of 0.96, 0.97, and 0.88, respectively. 
These results underscore the model’s superior segmentation performance, marking a significant advancement in brain tumor analysis across pre- and post-operative MRI scans.\",\"PeriodicalId\":507997,\"journal\":{\"name\":\"International Journal of Online and Biomedical Engineering (iJOE)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-04-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Online and Biomedical Engineering (iJOE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3991/ijoe.v20i06.46979\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Online and Biomedical Engineering (iJOE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3991/ijoe.v20i06.46979","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Accurate segmentation of brain tumors is vital for effective treatment planning, disease diagnosis, and monitoring of treatment outcomes. Post-surgical monitoring, particularly for recurrent tumors, relies on MRI scans, where surgical artifacts make small residual tumors difficult to segment. This underscores the need for a robust model with strong feature-extraction capabilities for precise segmentation in both pre- and post-operative scenarios. The study introduces the Hybrid Attention-Residual UNET with Transformer Blocks (HART-UNet), which enhances the U-Net architecture with a spatial self-attention module, deep residual connections, and pre-trained ResNet50 weights. Trained on BraTS'20 and validated on the Kaggle LGG and BTC_postop datasets, HART-UNet outperforms established models (UNET, Attention UNET, UNET++, and ResNet50), achieving Dice coefficients of 0.96, 0.97, and 0.88 on the three datasets, respectively. These results underscore the model's superior segmentation performance, marking a significant advance in brain tumor analysis across pre- and post-operative MRI scans.
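The abstract names three architectural ingredients: deep residual connections, a spatial self-attention module, and transformer blocks inside a U-Net-style encoder-decoder. Since the paper's code is not reproduced here, the following PyTorch sketch is purely illustrative: the class names (ResidualConvBlock, SpatialSelfAttention, TransformerBottleneck), layer sizes, and the exact arrangement are assumptions, not the authors' implementation. It only shows one plausible form each component could take; the ResNet50-initialized encoder the paper mentions could, for instance, come from torchvision's pre-trained weights.

```python
import torch
import torch.nn as nn


class ResidualConvBlock(nn.Module):
    """Two 3x3 convolutions with a skip connection (a deep residual connection)."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
        )
        # 1x1 projection so the skip path matches the output channel count
        self.skip = nn.Conv2d(in_ch, out_ch, 1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.body(x) + self.skip(x))


class SpatialSelfAttention(nn.Module):
    """Non-local-style spatial self-attention over an H*W feature map."""
    def __init__(self, ch: int):
        super().__init__()
        self.q = nn.Conv2d(ch, ch // 8, 1)
        self.k = nn.Conv2d(ch, ch // 8, 1)
        self.v = nn.Conv2d(ch, ch, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual gate

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.q(x).flatten(2).transpose(1, 2)   # (B, HW, C//8)
        k = self.k(x).flatten(2)                   # (B, C//8, HW)
        attn = torch.softmax(q @ k, dim=-1)        # (B, HW, HW) spatial affinities
        v = self.v(x).flatten(2).transpose(1, 2)   # (B, HW, C)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return self.gamma * out + x                # attention applied residually


class TransformerBottleneck(nn.Module):
    """Standard transformer encoder layers applied to flattened bottleneck tokens."""
    def __init__(self, ch: int, heads: int = 8):
        super().__init__()  # ch must be divisible by heads
        layer = nn.TransformerEncoderLayer(d_model=ch, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)      # (B, HW, C) token sequence
        tokens = self.encoder(tokens)
        return tokens.transpose(1, 2).reshape(b, c, h, w)


# Shape check: chaining the three modules preserves spatial resolution.
x = torch.randn(1, 64, 32, 32)
y = TransformerBottleneck(128)(SpatialSelfAttention(128)(ResidualConvBlock(64, 128)(x)))
print(y.shape)  # torch.Size([1, 128, 32, 32])
```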
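The reported numbers are Dice coefficients, which measure overlap between the predicted and ground-truth masks: Dice = 2|A ∩ B| / (|A| + |B|). A minimal sketch of the standard computation for binary masks follows; the smoothing constant `eps` is a common convention assumed here, not a detail taken from the paper.

```python
import torch


def dice_coefficient(pred: torch.Tensor, target: torch.Tensor,
                     eps: float = 1e-6) -> torch.Tensor:
    """Mean Dice over a batch of binary segmentation masks of shape (B, H, W)."""
    pred = pred.float().flatten(1)        # (B, H*W)
    target = target.float().flatten(1)
    intersection = (pred * target).sum(dim=1)
    union = pred.sum(dim=1) + target.sum(dim=1)
    # eps keeps the score defined when both masks are empty
    return ((2 * intersection + eps) / (union + eps)).mean()


# Example: perfect overlap yields a Dice score of 1.0.
mask = torch.ones(1, 64, 64)
print(dice_coefficient(mask, mask))  # tensor(1.)
```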