Unsupervised Photoacoustic Tomography Image Reconstruction from Limited-View Unpaired Data using an Improved CycleGAN

Hesam Hakimnejad, Z. Azimifar, Mohammad Sadegh Nazemi
{"title":"Unsupervised Photoacoustic Tomography Image Reconstruction from Limited-View Unpaired Data using an Improved CycleGAN","authors":"Hesam Hakimnejad, Z. Azimifar, Mohammad Sadegh Nazemi","doi":"10.1109/CSICC58665.2023.10105363","DOIUrl":null,"url":null,"abstract":"Photoacoustic tomography (PAT) is a hybrid imaging method with great applications in preclinical research and clinical applications. However, due to the limited-view issue, it is often hard to cover the desired tissue completely, thus resulting in severe artifacts in reconstructed images. Enhancing a reconstructed image to become artifact-free could be considered an image-to-image translation task which is addressed easily by the well-known Pix2Pix generative adversarial network (GAN). Training Pix2Pix usually requires a large paired dataset. Preparing such datasets can be difficult or even in some cases impossible. In this paper, we propose an improved unsupervised reconstruction method based on cycle-consistent adversarial networks (CycleGAN), to overcome the need for paired datasets. CycleGAN can learn image-to-image translation tasks from an unpaired dataset without the need for one-to-one matching between low-quality and high-quality images. Experimental results demonstrate that the proposed architecture outperforms the original CycleGAN in terms of image similarity metrics including PSNR and SSIM.","PeriodicalId":127277,"journal":{"name":"2023 28th International Computer Conference, Computer Society of Iran (CSICC)","volume":"33 6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 28th International Computer Conference, Computer Society of Iran (CSICC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CSICC58665.2023.10105363","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Photoacoustic tomography (PAT) is a hybrid imaging method with broad applications in preclinical research and clinical practice. However, due to the limited-view problem, it is often hard to cover the target tissue completely, resulting in severe artifacts in the reconstructed images. Enhancing a reconstructed image to make it artifact-free can be framed as an image-to-image translation task, which the well-known Pix2Pix generative adversarial network (GAN) handles well. Training Pix2Pix, however, usually requires a large paired dataset, and preparing such datasets can be difficult or, in some cases, impossible. In this paper, we propose an improved unsupervised reconstruction method based on cycle-consistent adversarial networks (CycleGAN) to remove the need for paired datasets. CycleGAN learns image-to-image translation from an unpaired dataset, with no one-to-one matching between low-quality and high-quality images. Experimental results demonstrate that the proposed architecture outperforms the original CycleGAN on image similarity metrics, including PSNR and SSIM.
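
The paper itself provides no code; as a rough illustration of the cycle-consistency idea the method builds on, here is a minimal PyTorch sketch. The generators `G` (limited-view to artifact-free) and `F` (the reverse mapping), the variable names, and the loss weight are assumptions drawn from the original CycleGAN formulation, not the authors' implementation.

```python
# Minimal sketch of a CycleGAN-style cycle-consistency term.
# G: translates domain A (limited-view reconstructions) -> domain B (artifact-free)
# F: translates domain B -> domain A
# Both are hypothetical nn.Module instances; lam=10.0 follows the original
# CycleGAN paper's default weighting, not necessarily this paper's choice.
import torch
import torch.nn as nn

l1 = nn.L1Loss()

def cycle_consistency_loss(G, F, real_a, real_b, lam=10.0):
    """real_a: batch of limited-view images (domain A),
    real_b: batch of artifact-free images (domain B)."""
    fake_b = G(real_a)   # translate A -> B
    rec_a = F(fake_b)    # translate back B -> A
    fake_a = F(real_b)   # translate B -> A
    rec_b = G(fake_a)    # translate back A -> B
    # Enforcing F(G(a)) ~= a and G(F(b)) ~= b is what lets CycleGAN train
    # without paired (a, b) examples: the round trip must preserve content.
    return lam * (l1(rec_a, real_a) + l1(rec_b, real_b))
```

This term is added to the usual adversarial losses for both generators; the unpaired batches `real_a` and `real_b` need not correspond to the same tissue.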
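On the evaluation side, PSNR and SSIM both compare a reconstruction against a reference image. A minimal sketch using scikit-image follows; the function and array names are illustrative, and the `data_range=1.0` assumption presumes images normalized to [0, 1].

```python
# Hedged sketch of PSNR/SSIM evaluation as named in the abstract.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate(recon: np.ndarray, reference: np.ndarray):
    """Return (PSNR, SSIM) for a reconstruction vs. its reference,
    assuming both are 2D float arrays scaled to [0, 1]."""
    psnr = peak_signal_noise_ratio(reference, recon, data_range=1.0)
    ssim = structural_similarity(reference, recon, data_range=1.0)
    return psnr, ssim
```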