The R2D2 Deep Neural Network Series Paradigm for Fast Precision Imaging in Radio Astronomy

Amir Aghabiglou, Chung San Chu, Arwa Dabbech and Yves Wiaux
{"title":"用于射电天文学快速精确成像的 R2D2 深度神经网络系列范例","authors":"Amir Aghabiglou, Chung San Chu, Arwa Dabbech and Yves Wiaux","doi":"10.3847/1538-4365/ad46f5","DOIUrl":null,"url":null,"abstract":"Radio-interferometric imaging entails solving high-resolution high-dynamic-range inverse problems from large data volumes. Recent image reconstruction techniques grounded in optimization theory have demonstrated remarkable capability for imaging precision, well beyond CLEAN’s capability. These range from advanced proximal algorithms propelled by handcrafted regularization operators, such as the SARA family, to hybrid plug-and-play (PnP) algorithms propelled by learned regularization denoisers, such as AIRI. Optimization and PnP structures are however highly iterative, which hinders their ability to handle the extreme data sizes expected from future instruments. To address this scalability challenge, we introduce a novel deep-learning approach, dubbed “Residual-to-Residual DNN series for high-Dynamic-range imaging” or in short R2D2. R2D2's reconstruction is formed as a series of residual images, iteratively estimated as outputs of deep neural networks (DNNs) taking the previous iteration’s image estimate and associated data residual as inputs. It thus takes a hybrid structure between a PnP algorithm and a learned version of the matching pursuit algorithm that underpins CLEAN. We present a comprehensive study of our approach, featuring its multiple incarnations distinguished by their DNN architectures. We provide a detailed description of its training process, targeting a telescope-specific approach. R2D2's capability to deliver high precision is demonstrated in simulation, across a variety of image and observation settings using the Very Large Array. Its reconstruction speed is also demonstrated: with only a few iterations required to clean data residuals at dynamic ranges up to 105, R2D2 opens the door to fast precision imaging. R2D2 codes are available in the BASPLib (https://basp-group.github.io/BASPLib/) library on GitHub.","PeriodicalId":22368,"journal":{"name":"The Astrophysical Journal Supplement Series","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The R2D2 Deep Neural Network Series Paradigm for Fast Precision Imaging in Radio Astronomy\",\"authors\":\"Amir Aghabiglou, Chung San Chu, Arwa Dabbech and Yves Wiaux\",\"doi\":\"10.3847/1538-4365/ad46f5\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Radio-interferometric imaging entails solving high-resolution high-dynamic-range inverse problems from large data volumes. Recent image reconstruction techniques grounded in optimization theory have demonstrated remarkable capability for imaging precision, well beyond CLEAN’s capability. These range from advanced proximal algorithms propelled by handcrafted regularization operators, such as the SARA family, to hybrid plug-and-play (PnP) algorithms propelled by learned regularization denoisers, such as AIRI. Optimization and PnP structures are however highly iterative, which hinders their ability to handle the extreme data sizes expected from future instruments. To address this scalability challenge, we introduce a novel deep-learning approach, dubbed “Residual-to-Residual DNN series for high-Dynamic-range imaging” or in short R2D2. 
R2D2's reconstruction is formed as a series of residual images, iteratively estimated as outputs of deep neural networks (DNNs) taking the previous iteration’s image estimate and associated data residual as inputs. It thus takes a hybrid structure between a PnP algorithm and a learned version of the matching pursuit algorithm that underpins CLEAN. We present a comprehensive study of our approach, featuring its multiple incarnations distinguished by their DNN architectures. We provide a detailed description of its training process, targeting a telescope-specific approach. R2D2's capability to deliver high precision is demonstrated in simulation, across a variety of image and observation settings using the Very Large Array. Its reconstruction speed is also demonstrated: with only a few iterations required to clean data residuals at dynamic ranges up to 105, R2D2 opens the door to fast precision imaging. R2D2 codes are available in the BASPLib (https://basp-group.github.io/BASPLib/) library on GitHub.\",\"PeriodicalId\":22368,\"journal\":{\"name\":\"The Astrophysical Journal Supplement Series\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-06-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The Astrophysical Journal Supplement Series\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3847/1538-4365/ad46f5\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Astrophysical Journal Supplement Series","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3847/1538-4365/ad46f5","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Radio-interferometric imaging entails solving high-resolution, high-dynamic-range inverse problems from large data volumes. Recent image reconstruction techniques grounded in optimization theory have demonstrated remarkable imaging precision, well beyond CLEAN's capability. These range from advanced proximal algorithms propelled by handcrafted regularization operators, such as the SARA family, to hybrid plug-and-play (PnP) algorithms propelled by learned regularization denoisers, such as AIRI. Optimization and PnP structures are, however, highly iterative, which hinders their ability to handle the extreme data sizes expected from future instruments. To address this scalability challenge, we introduce a novel deep-learning approach, dubbed "Residual-to-Residual DNN series for high-Dynamic-range imaging," or R2D2 for short. R2D2's reconstruction is formed as a series of residual images, iteratively estimated as outputs of deep neural networks (DNNs) taking the previous iteration's image estimate and associated data residual as inputs. It thus takes a hybrid structure between a PnP algorithm and a learned version of the matching pursuit algorithm that underpins CLEAN. We present a comprehensive study of our approach, featuring its multiple incarnations distinguished by their DNN architectures. We provide a detailed description of its training process, targeting a telescope-specific approach. R2D2's capability to deliver high precision is demonstrated in simulation, across a variety of image and observation settings using the Very Large Array. Its reconstruction speed is also demonstrated: with only a few iterations required to clean data residuals at dynamic ranges up to 10^5, R2D2 opens the door to fast precision imaging. R2D2 codes are available in the BASPLib library (https://basp-group.github.io/BASPLib/) on GitHub.
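The abstract describes the R2D2 iteration structure: each DNN in the series takes the previous image estimate and its back-projected data residual as inputs and outputs a residual image that updates the estimate. The sketch below illustrates that loop only; it is not the authors' implementation (the official code lives in BASPLib). The measurement operator, its adjoint, and the "network series" are hypothetical placeholders introduced purely for illustration.

```python
import numpy as np

def forward_op(x):
    """Toy stand-in for the measurement operator (5-point smoothing).

    Hypothetical placeholder: the real radio-interferometric operator maps an
    image to visibilities via a non-uniform Fourier transform.
    """
    return 0.2 * (x
                  + np.roll(x, 1, axis=0) + np.roll(x, -1, axis=0)
                  + np.roll(x, 1, axis=1) + np.roll(x, -1, axis=1))

# The toy operator is symmetric, so it serves as its own adjoint here.
adjoint_op = forward_op


def r2d2_reconstruct(dirty, dnn_series):
    """Form the reconstruction as a series of residual images.

    Each "network" takes the previous image estimate and the associated
    back-projected data residual as inputs and returns a residual image
    added to the running estimate -- the structure described in the
    abstract; the trained DNNs themselves are not reproduced here.
    """
    x = np.zeros_like(dirty)   # initial image estimate
    residual = dirty           # initial data residual = dirty image
    for net in dnn_series:
        x = x + net(x, residual)                      # add predicted residual image
        residual = dirty - adjoint_op(forward_op(x))  # update back-projected data residual
    return x


if __name__ == "__main__":
    # Tiny synthetic test: two point sources on a 64x64 grid.
    truth = np.zeros((64, 64))
    truth[20, 20], truth[40, 45] = 1.0, 0.5
    dirty = adjoint_op(forward_op(truth))  # noiseless dirty image

    # Placeholder "network series": each element returns a damped residual,
    # reducing the scheme to plain gradient-style updates for illustration.
    series = [lambda x, r: 0.5 * r] * 5

    x_hat = r2d2_reconstruct(dirty, series)
    final_residual = dirty - adjoint_op(forward_op(x_hat))
    print("final residual norm:", np.linalg.norm(final_residual))
```

With trained, iteration-specific networks in place of the damped-residual placeholders, only a few such passes are needed to clean the data residuals, which is what makes the series structure fast compared with highly iterative optimization or PnP schemes.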