MotionTTT: 2D Test-Time-Training Motion Estimation for 3D Motion Corrected MRI

Tobit Klug, Kun Wang, Stefan Ruschke, Reinhard Heckel
{"title":"MotionTTT:三维运动校正磁共振成像的二维测试-时间-训练运动估计","authors":"Tobit Klug, Kun Wang, Stefan Ruschke, Reinhard Heckel","doi":"arxiv-2409.09370","DOIUrl":null,"url":null,"abstract":"A major challenge of the long measurement times in magnetic resonance imaging\n(MRI), an important medical imaging technology, is that patients may move\nduring data acquisition. This leads to severe motion artifacts in the\nreconstructed images and volumes. In this paper, we propose a deep\nlearning-based test-time-training method for accurate motion estimation. The\nkey idea is that a neural network trained for motion-free reconstruction has a\nsmall loss if there is no motion, thus optimizing over motion parameters passed\nthrough the reconstruction network enables accurate estimation of motion. The\nestimated motion parameters enable to correct for the motion and to reconstruct\naccurate motion-corrected images. Our method uses 2D reconstruction networks to\nestimate rigid motion in 3D, and constitutes the first deep learning based\nmethod for 3D rigid motion estimation towards 3D-motion-corrected MRI. We show\nthat our method can provably reconstruct motion parameters for a simple signal\nand neural network model. We demonstrate the effectiveness of our method for\nboth retrospectively simulated motion and prospectively collected real\nmotion-corrupted data.","PeriodicalId":501289,"journal":{"name":"arXiv - EE - Image and Video Processing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"MotionTTT: 2D Test-Time-Training Motion Estimation for 3D Motion Corrected MRI\",\"authors\":\"Tobit Klug, Kun Wang, Stefan Ruschke, Reinhard Heckel\",\"doi\":\"arxiv-2409.09370\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A major challenge of the long measurement times in magnetic resonance imaging\\n(MRI), an important medical imaging technology, is that patients may move\\nduring data acquisition. This leads to severe motion artifacts in the\\nreconstructed images and volumes. In this paper, we propose a deep\\nlearning-based test-time-training method for accurate motion estimation. The\\nkey idea is that a neural network trained for motion-free reconstruction has a\\nsmall loss if there is no motion, thus optimizing over motion parameters passed\\nthrough the reconstruction network enables accurate estimation of motion. The\\nestimated motion parameters enable to correct for the motion and to reconstruct\\naccurate motion-corrected images. Our method uses 2D reconstruction networks to\\nestimate rigid motion in 3D, and constitutes the first deep learning based\\nmethod for 3D rigid motion estimation towards 3D-motion-corrected MRI. We show\\nthat our method can provably reconstruct motion parameters for a simple signal\\nand neural network model. 
We demonstrate the effectiveness of our method for\\nboth retrospectively simulated motion and prospectively collected real\\nmotion-corrupted data.\",\"PeriodicalId\":501289,\"journal\":{\"name\":\"arXiv - EE - Image and Video Processing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - EE - Image and Video Processing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.09370\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - EE - Image and Video Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.09370","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

A major challenge of the long measurement times in magnetic resonance imaging (MRI), an important medical imaging technology, is that patients may move during data acquisition. This leads to severe motion artifacts in the reconstructed images and volumes. In this paper, we propose a deep-learning-based test-time-training method for accurate motion estimation. The key idea is that a neural network trained for motion-free reconstruction has a small loss if there is no motion; thus, optimizing over the motion parameters passed through the reconstruction network enables accurate estimation of the motion. The estimated motion parameters make it possible to correct for the motion and to reconstruct accurate motion-corrected images. Our method uses 2D reconstruction networks to estimate rigid motion in 3D, and constitutes the first deep-learning-based method for 3D rigid motion estimation towards 3D-motion-corrected MRI. We show that our method can provably reconstruct motion parameters for a simple signal and neural network model. We demonstrate the effectiveness of our method on both retrospectively simulated motion and prospectively collected real motion-corrupted data.
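The key idea described in the abstract, optimizing over motion parameters passed through a reconstruction network that was trained on motion-free data, and exploiting that the loss becomes small only when the motion estimate is correct, can be illustrated with a minimal sketch. The code below is purely illustrative and not the authors' implementation: it assumes a 2D, single-coil, translation-only setting, and all names (`recon_net`, `kspace`, `mask`, `apply_translation_kspace`) are hypothetical placeholders, whereas the paper estimates full 3D rigid motion using 2D reconstruction networks.

```python
import torch

def apply_translation_kspace(kspace, shifts):
    """Apply an in-plane translation as a linear phase ramp in k-space.
    This translation-only model is a simplified stand-in for a full 3D
    rigid-motion model (illustrative assumption, not the paper's model)."""
    ny, nx = kspace.shape[-2:]
    ky = torch.fft.fftfreq(ny, device=kspace.device).view(-1, 1)
    kx = torch.fft.fftfreq(nx, device=kspace.device).view(1, -1)
    phase = torch.exp(-2j * torch.pi * (ky * shifts[0] + kx * shifts[1]))
    return kspace * phase

def estimate_motion_ttt(recon_net, kspace, mask, n_steps=200, lr=1e-2):
    """Estimate motion parameters at test time by gradient descent, keeping
    the reconstruction network (pretrained on motion-free data) frozen."""
    shifts = torch.zeros(2, device=kspace.device, requires_grad=True)
    optimizer = torch.optim.Adam([shifts], lr=lr)
    for p in recon_net.parameters():      # only the motion parameters are optimized
        p.requires_grad_(False)
    for _ in range(n_steps):
        optimizer.zero_grad()
        # Undo the currently estimated motion before feeding the network.
        corrected = apply_translation_kspace(kspace, -shifts)
        zero_filled = torch.fft.ifft2(corrected * mask)
        recon = recon_net(zero_filled)    # assumed: complex image in, complex image out
        # Self-supervised data-consistency loss: it is small only when the
        # motion estimate matches the true motion, which is the property the
        # test-time-training idea exploits.
        loss = ((torch.fft.fft2(recon) - corrected) * mask).abs().mean()
        loss.backward()
        optimizer.step()
    return shifts.detach()
```

The essential design point, per the abstract, is that the network weights stay fixed and only the motion parameters are updated at test time; the sketch collapses the 3D rigid motion of the paper to a single 2D translation purely for brevity.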