A reconstruction method for ptychography based on residual dense network.

IF 1.7 · CAS Tier 3 (Medicine) · JCR Q3, Instruments & Instrumentation
Mengnan Liu, Yu Han, Xiaoqi Xi, Lei Li, Zijian Xu, Xiangzhi Zhang, Linlin Zhu, Bin Yan
{"title":"A reconstruction method for ptychography based on residual dense network.","authors":"Mengnan Liu, Yu Han, Xiaoqi Xi, Lei Li, Zijian Xu, Xiangzhi Zhang, Linlin Zhu, Bin Yan","doi":"10.3233/XST-240114","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Coherent diffraction imaging (CDI) is an important lens-free imaging method. As a variant of CDI, ptychography enables the imaging of objects with arbitrary lateral sizes. However, traditional phase retrieval methods are time-consuming for ptychographic imaging of large-size objects, e.g., integrated circuits (IC). Especially when ptychography is combined with computed tomography (CT) or computed laminography (CL), time consumption increases greatly.</p><p><strong>Objective: </strong>In this work, we aim to propose a new deep learning-based approach to implement a quick and robust reconstruction of ptychography.</p><p><strong>Methods: </strong>Inspired by the strong advantages of the residual dense network for computer vision tasks, we propose a dense residual two-branch network (RDenPtycho) based on the ptychography two-branch reconstruction architecture for the fast and robust reconstruction of ptychography. The network relies on the residual dense block to construct mappings from diffraction patterns to amplitudes and phases. In addition, we integrate the physical processes of ptychography into the training of the network to further improve the performance.</p><p><strong>Results: </strong>The proposed RDenPtycho is evaluated using the publicly available ptychography dataset from the Advanced Photon Source. The results show that the proposed method can faithfully and robustly recover the detailed information of the objects. Ablation experiments demonstrate the effectiveness of the components in the proposed method for performance enhancement.</p><p><strong>Significance: </strong>The proposed method enables fast, accurate, and robust reconstruction of ptychography, and is of potential significance for 3D ptychography. The proposed method and experiments can resolve similar problems in other fields.</p>","PeriodicalId":49948,"journal":{"name":"Journal of X-Ray Science and Technology","volume":" ","pages":"1505-1519"},"PeriodicalIF":1.7000,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of X-Ray Science and Technology","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.3233/XST-240114","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"INSTRUMENTS & INSTRUMENTATION","Score":null,"Total":0}

Abstract

Background: Coherent diffraction imaging (CDI) is an important lens-free imaging method. As a variant of CDI, ptychography enables imaging of objects with arbitrary lateral size. However, traditional phase-retrieval methods are time-consuming for ptychographic imaging of large objects such as integrated circuits (ICs), and the cost grows further when ptychography is combined with computed tomography (CT) or computed laminography (CL).
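To make the bottleneck concrete: the detector records only diffraction intensities, so reconstruction must recover the lost phase, typically by iterating a forward model over every scan position for many passes. Below is a minimal NumPy sketch of the standard far-field ptychography forward model; the function and variable names are illustrative, not taken from the paper.

    import numpy as np

    def forward_diffraction(obj, probe, scan_positions):
        """Simulate far-field diffraction intensities at each scan position.
        obj: complex 2D object transmission function
        probe: complex 2D illumination, smaller than obj
        scan_positions: list of (row, col) top-left probe positions on obj
        """
        h, w = probe.shape
        patterns = []
        for r, c in scan_positions:
            exit_wave = probe * obj[r:r + h, c:c + w]             # probe-object interaction
            far_field = np.fft.fftshift(np.fft.fft2(exit_wave))   # Fraunhofer propagation
            patterns.append(np.abs(far_field) ** 2)               # detector records intensity only
        return np.array(patterns)

Phase retrieval must invert this intensity-only mapping, usually by alternating between real and reciprocal space constraints across hundreds of iterations, which is what makes large scans slow.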

Objective: In this work, we propose a new deep-learning-based approach for fast and robust ptychographic reconstruction.

Methods: Motivated by the strength of residual dense networks in computer-vision tasks, we propose RDenPtycho, a residual dense two-branch network built on the two-branch ptychographic reconstruction architecture, for fast and robust reconstruction. The network relies on residual dense blocks to learn mappings from diffraction patterns to amplitudes and phases. In addition, we integrate the physical process of ptychography into network training to further improve performance.
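The abstract names two ingredients: residual dense blocks feeding two branches that map diffraction patterns to amplitude and phase, and a physics term during training. The PyTorch sketch below shows one plausible way these pieces fit together; the block depth, channel widths, head design, and loss form are my assumptions, not the paper's published architecture.

    import torch
    import torch.nn as nn

    class ResidualDenseBlock(nn.Module):
        """Dense connections inside the block, a 1x1 local feature fusion,
        and a local residual connection (the RDN building block)."""
        def __init__(self, channels=64, growth=32, layers=4):
            super().__init__()
            self.convs = nn.ModuleList()
            c = channels
            for _ in range(layers):
                self.convs.append(nn.Sequential(
                    nn.Conv2d(c, growth, 3, padding=1), nn.ReLU(inplace=True)))
                c += growth  # each layer sees all preceding feature maps
            self.fuse = nn.Conv2d(c, channels, 1)  # local feature fusion

        def forward(self, x):
            feats = [x]
            for conv in self.convs:
                feats.append(conv(torch.cat(feats, dim=1)))
            return x + self.fuse(torch.cat(feats, dim=1))  # local residual

    class TwoBranchNet(nn.Module):
        """Shared residual-dense trunk with separate amplitude and phase heads."""
        def __init__(self, channels=64, blocks=3):
            super().__init__()
            self.stem = nn.Conv2d(1, channels, 3, padding=1)
            self.trunk = nn.Sequential(
                *[ResidualDenseBlock(channels) for _ in range(blocks)])
            self.amp_head = nn.Conv2d(channels, 1, 3, padding=1)
            self.phase_head = nn.Conv2d(channels, 1, 3, padding=1)

        def forward(self, pattern):            # pattern: (B, 1, H, W) intensities
            f = self.trunk(self.stem(pattern))
            return self.amp_head(f), self.phase_head(f)

    def physics_consistency_loss(amp, phase, probe, measured_intensity):
        """Push |FFT(probe * predicted exit wave)|^2 toward the measurement,
        one way of folding the ptychography physics into training."""
        wave = probe * (amp * torch.exp(1j * phase))
        pred = torch.abs(torch.fft.fft2(wave)) ** 2
        return nn.functional.mse_loss(pred, measured_intensity)

A training objective would then typically combine supervised amplitude and phase losses with the physics term, e.g. loss = l_amp + l_phase + lam * physics_consistency_loss(...); the paper's exact formulation and weighting may differ.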

Results: RDenPtycho is evaluated on the publicly available ptychography dataset from the Advanced Photon Source. The results show that the method recovers fine object detail faithfully and robustly. Ablation experiments confirm that each proposed component contributes to the performance gains.

Significance: The proposed method enables fast, accurate, and robust ptychographic reconstruction and is potentially significant for 3D ptychography. The method and experimental design may also transfer to similar problems in other fields.

Source journal: Journal of X-Ray Science and Technology
CiteScore: 4.90 | Self-citation rate: 23.30% | Annual publications: 150 | Review time: 3 months
Journal description: Research areas within the scope of the journal include:
- Interaction of x-rays with matter: x-ray phenomena, biological effects of radiation, radiation safety, and optical constants
- X-ray sources: x-rays from synchrotrons, x-ray lasers, plasmas, and other sources, conventional or unconventional
- Optical elements: grazing incidence optics, multilayer mirrors, zone plates, gratings, other diffraction optics
- Optical instruments: interferometers, spectrometers, microscopes, telescopes, microprobes