Sparsity-based PET image reconstruction using MRI learned dictionaries

Jing Tang, Yanhua Wang, R. Yao, L. Ying
DOI: 10.1109/ISBI.2014.6868063
Published in: 2014 IEEE 11th International Symposium on Biomedical Imaging (ISBI), 2014-07-31
Citations: 17

Abstract

Incorporating anatomical information obtained by magnetic resonance (MR) imaging has shown promise for improving positron emission tomography (PET) image quality. In this paper, we propose a novel maximum a posteriori (MAP) PET image reconstruction technique using a sparse prior whose dictionary is learned from the corresponding MR images. Specifically, a PET image is divided into three-dimensional overlapping patches that are expected to be sparsely represented over a redundant dictionary. Under the assumption that the PET and MR images of a patient can be sparsified under a common dictionary, the dictionary is learned from the MR image so that anatomical measurements inform the PET reconstruction. The PET image and its sparse representation are updated alternately in the iterative reconstruction process. We evaluated the performance of the proposed method quantitatively using a realistic simulation with BrainWeb database phantoms. Images reconstructed with the proposed method show a noticeable improvement in the noise-versus-bias tradeoff compared with those from the conventional smoothness MAP method.
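The core idea of the abstract, learning a patch dictionary from the MR image and sparse-coding PET patches over that shared dictionary, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it uses 2D patches instead of the paper's 3D patches, synthetic stand-in images rather than BrainWeb phantoms, and scikit-learn's `MiniBatchDictionaryLearning` with OMP sparse coding as a generic dictionary-learning stand-in; patch size, dictionary size, and sparsity level are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

rng = np.random.default_rng(0)

# Synthetic "MR" image with piecewise-constant anatomy (2D for brevity;
# the paper uses three-dimensional overlapping patches).
mr = np.zeros((64, 64))
mr[16:48, 16:48] = 1.0
mr += 0.05 * rng.standard_normal(mr.shape)

# Extract overlapping patches and learn a redundant dictionary from them.
patch_size = (6, 6)
patches = extract_patches_2d(mr, patch_size)          # (3481, 6, 6)
X = patches.reshape(patches.shape[0], -1)             # one row per patch
X = X - X.mean(axis=1, keepdims=True)                 # remove patch DC offset

dico = MiniBatchDictionaryLearning(
    n_components=100,                 # redundant: 100 atoms for 36-dim patches
    transform_algorithm="omp",        # sparse coding by orthogonal matching pursuit
    transform_n_nonzero_coefs=5,      # sparsity level per patch
    random_state=0,
)
dico.fit(X)

# Sparse-code patches of a noisy "PET" surrogate under the MR-learned
# dictionary, mirroring the assumption that both modalities are
# sparsifiable under a common dictionary.
pet = mr + 0.2 * rng.standard_normal(mr.shape)
pet_patches = extract_patches_2d(pet, patch_size).reshape(-1, X.shape[1])
means = pet_patches.mean(axis=1, keepdims=True)
codes = dico.transform(pet_patches - means)
recon = codes @ dico.components_ + means              # denoised patch estimates

print(codes.shape)
```

In the paper this sparse-coding step is interleaved with the tomographic update: the PET image estimate and its patch-wise sparse representation are refreshed alternately within the MAP iterations, whereas the sketch above only shows the dictionary-learning and coding side.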