Bayesian view on the training of invertible residual networks for solving linear inverse problems *

Impact Factor: 2.0 · CAS Region 2 (Mathematics) · JCR Q1 (Mathematics, Applied)
Clemens Arndt, Sören Dittmer, Nick Heilenkötter, Meira Iske, Tobias Kluth, Judith Nickel
{"title":"Bayesian view on the training of invertible residual networks for solving linear inverse problems *","authors":"Clemens Arndt, Sören Dittmer, Nick Heilenkötter, Meira Iske, Tobias Kluth, Judith Nickel","doi":"10.1088/1361-6420/ad2aaa","DOIUrl":null,"url":null,"abstract":"Learning-based methods for inverse problems, adapting to the data’s inherent structure, have become ubiquitous in the last decade. Besides empirical investigations of their often remarkable performance, an increasing number of works address the issue of theoretical guarantees. Recently, Arndt <italic toggle=\"yes\">et al</italic> (2023 <italic toggle=\"yes\">Inverse Problems</italic>\n<bold>39</bold> 125018) exploited invertible residual networks (iResNets) to learn provably convergent regularizations given reasonable assumptions. They enforced these guarantees by approximating the linear forward operator with an iResNet. Supervised training on relevant samples introduces data dependency into the approach. An open question in this context is to which extent the data’s inherent structure influences the training outcome, i.e. the learned reconstruction scheme. Here, we address this delicate interplay of training design and data dependency from a Bayesian perspective and shed light on opportunities and limitations. We resolve these limitations by analyzing reconstruction-based training of the inverses of iResNets, where we show that this optimization strategy introduces a level of data-dependency that cannot be achieved by approximation training. We further provide and discuss a series of numerical experiments underpinning and extending the theoretical findings.","PeriodicalId":50275,"journal":{"name":"Inverse Problems","volume":null,"pages":null},"PeriodicalIF":2.0000,"publicationDate":"2024-03-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Inverse Problems","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1088/1361-6420/ad2aaa","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
引用次数: 0

Abstract

Learning-based methods for inverse problems, adapting to the data’s inherent structure, have become ubiquitous in the last decade. Besides empirical investigations of their often remarkable performance, an increasing number of works address the issue of theoretical guarantees. Recently, Arndt et al (2023 Inverse Problems 39 125018) exploited invertible residual networks (iResNets) to learn provably convergent regularizations given reasonable assumptions. They enforced these guarantees by approximating the linear forward operator with an iResNet. Supervised training on relevant samples introduces data dependency into the approach. An open question in this context is to what extent the data’s inherent structure influences the training outcome, i.e. the learned reconstruction scheme. Here, we address this delicate interplay of training design and data dependency from a Bayesian perspective and shed light on opportunities and limitations. We resolve these limitations by analyzing reconstruction-based training of the inverses of iResNets, where we show that this optimization strategy introduces a level of data dependency that cannot be achieved by approximation training. We further provide and discuss a series of numerical experiments underpinning and extending the theoretical findings.
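To make the two training regimes mentioned in the abstract concrete, the following is a minimal PyTorch sketch (not the authors' implementation; the names `iResNetLayer`, `approximation_loss`, and `reconstruction_loss` are illustrative) of a single invertible residual layer f(x) = x + c·g(x). Spectral normalization together with the factor c < 1 keeps the residual block contractive, so the layer can be inverted by a Banach fixed-point iteration. Approximation training fits the network to the linear forward operator A, whereas reconstruction training fits the network's inverse to recover x from noisy measurements y = Ax + η.

```python
# Minimal illustrative sketch of an invertible residual layer and the two
# training objectives (approximation vs. reconstruction). Not the paper's code.

import torch
import torch.nn as nn


class iResNetLayer(nn.Module):
    """Residual layer x + c * g(x); spectral norm keeps Lip(c * g) <= c < 1."""

    def __init__(self, dim: int, hidden: int = 64, c: float = 0.9):
        super().__init__()
        self.c = c
        self.g = nn.Sequential(
            nn.utils.spectral_norm(nn.Linear(dim, hidden)),
            nn.ELU(),  # 1-Lipschitz activation
            nn.utils.spectral_norm(nn.Linear(hidden, dim)),
        )

    def forward(self, x):
        return x + self.c * self.g(x)

    def inverse(self, y, n_iter: int = 50):
        # Banach fixed-point iteration x_{k+1} = y - c * g(x_k);
        # converges because the residual part is a contraction (c < 1).
        x = y.clone()
        for _ in range(n_iter):
            x = y - self.c * self.g(x)
        return x


def approximation_loss(net, x, A):
    """Approximation training: fit the network to the forward operator A."""
    return ((net(x) - x @ A.T) ** 2).mean()


def reconstruction_loss(net, x, A, noise_std=0.01):
    """Reconstruction training: fit the inverse to recover x from y = A x + eta."""
    y = x @ A.T + noise_std * torch.randn_like(x)
    return ((net.inverse(y) - x) ** 2).mean()


if __name__ == "__main__":
    dim = 8
    A = 0.5 * torch.eye(dim)      # toy linear forward operator
    net = iResNetLayer(dim)
    x = torch.randn(32, dim)
    print(approximation_loss(net, x, A).item(),
          reconstruction_loss(net, x, A).item())
```

The contraction constant c < 1 is what makes the fixed-point inversion well defined; in the convergence analyses of Arndt et al it also controls the regularization strength, while the choice between the two losses above determines how strongly the learned reconstruction depends on the training data.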
Source journal: Inverse Problems (Mathematics – Physics: Mathematical Physics)
CiteScore: 4.40
Self-citation rate: 14.30%
Articles per year: 115
Review time: 2.3 months
Journal description: An interdisciplinary journal combining mathematical and experimental papers on inverse problems with theoretical, numerical and practical approaches to their solution. As well as applied mathematicians, physical scientists and engineers, the readership includes those working in geophysics, radar, optics, biology, acoustics, communication theory, signal processing and imaging, among others. The emphasis is on publishing original contributions to methods of solving mathematical, physical and applied problems. To be publishable in this journal, papers must meet the highest standards of scientific quality, contain significant and original new science and should present substantial advancement in the field. Due to the broad scope of the journal, we require that authors provide sufficient introductory material to appeal to the wide readership and that articles which are not explicitly applied include a discussion of possible applications.