Bayesian solution to the inverse problem and its relation to Backus–Gilbert methods

IF 4.8 · CAS Zone 2 (Physics and Astronomy) · JCR Q2 (Physics, Particles & Fields)
Luigi Del Debbio, Alessandro Lupo, Marco Panero, Nazario Tantalo
{"title":"逆问题的贝叶斯解及其与Backus-Gilbert方法的关系","authors":"Luigi Del Debbio,&nbsp;Alessandro Lupo,&nbsp;Marco Panero,&nbsp;Nazario Tantalo","doi":"10.1140/epjc/s10052-025-13885-9","DOIUrl":null,"url":null,"abstract":"<div><p>The problem of obtaining spectral densities from lattice data has been receiving great attention due to its importance in our understanding of scattering processes in Quantum Field Theory, with applications both in the Standard Model and beyond. The problem is notoriously difficult as it amounts to performing an inverse Laplace transform, starting from a finite set of noisy data. Several strategies are now available to tackle this inverse problem. In this work, we discuss how Backus–Gilbert methods, in particular the variation introduced by some of the authors, relate to the solution based on Gaussian Processes. Both methods allow computing spectral densities smearing with a kernel whose features depend on the detail of the algorithm. We will discuss such kernel, and show how Backus–Gilbert methods can be understood in a Bayesian fashion. As a consequence of this correspondence, we are able to interpret the algorithmic parameters of Backus–Gilbert methods as hyperparameters in the Bayesian language, which can be chosen by maximising a likelihood function. By performing a comparative study on lattice data, we show that, when both frameworks are set to compute the same quantity, the results are generally in agreement. Finally, we adopt a strategy to systematically validate both methodologies against pseudo-data, using covariance matrices measured from lattice simulations. In our setup, we find that the determination of the algorithmic parameters based on a stability analysis provides results that are, on average, more conservative than those based on the maximisation of a likelihood function.</p></div>","PeriodicalId":788,"journal":{"name":"The European Physical Journal C","volume":"85 2","pages":""},"PeriodicalIF":4.8000,"publicationDate":"2025-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1140/epjc/s10052-025-13885-9.pdf","citationCount":"0","resultStr":"{\"title\":\"Bayesian solution to the inverse problem and its relation to Backus–Gilbert methods\",\"authors\":\"Luigi Del Debbio,&nbsp;Alessandro Lupo,&nbsp;Marco Panero,&nbsp;Nazario Tantalo\",\"doi\":\"10.1140/epjc/s10052-025-13885-9\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>The problem of obtaining spectral densities from lattice data has been receiving great attention due to its importance in our understanding of scattering processes in Quantum Field Theory, with applications both in the Standard Model and beyond. The problem is notoriously difficult as it amounts to performing an inverse Laplace transform, starting from a finite set of noisy data. Several strategies are now available to tackle this inverse problem. In this work, we discuss how Backus–Gilbert methods, in particular the variation introduced by some of the authors, relate to the solution based on Gaussian Processes. Both methods allow computing spectral densities smearing with a kernel whose features depend on the detail of the algorithm. We will discuss such kernel, and show how Backus–Gilbert methods can be understood in a Bayesian fashion. 
As a consequence of this correspondence, we are able to interpret the algorithmic parameters of Backus–Gilbert methods as hyperparameters in the Bayesian language, which can be chosen by maximising a likelihood function. By performing a comparative study on lattice data, we show that, when both frameworks are set to compute the same quantity, the results are generally in agreement. Finally, we adopt a strategy to systematically validate both methodologies against pseudo-data, using covariance matrices measured from lattice simulations. In our setup, we find that the determination of the algorithmic parameters based on a stability analysis provides results that are, on average, more conservative than those based on the maximisation of a likelihood function.</p></div>\",\"PeriodicalId\":788,\"journal\":{\"name\":\"The European Physical Journal C\",\"volume\":\"85 2\",\"pages\":\"\"},\"PeriodicalIF\":4.8000,\"publicationDate\":\"2025-02-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://link.springer.com/content/pdf/10.1140/epjc/s10052-025-13885-9.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The European Physical Journal C\",\"FirstCategoryId\":\"4\",\"ListUrlMain\":\"https://link.springer.com/article/10.1140/epjc/s10052-025-13885-9\",\"RegionNum\":2,\"RegionCategory\":\"物理与天体物理\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"PHYSICS, PARTICLES & FIELDS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The European Physical Journal C","FirstCategoryId":"4","ListUrlMain":"https://link.springer.com/article/10.1140/epjc/s10052-025-13885-9","RegionNum":2,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"PHYSICS, PARTICLES & FIELDS","Score":null,"Total":0}
Citations: 0

Abstract


The problem of obtaining spectral densities from lattice data has been receiving great attention due to its importance for our understanding of scattering processes in Quantum Field Theory, with applications both in the Standard Model and beyond. The problem is notoriously difficult, as it amounts to performing an inverse Laplace transform starting from a finite set of noisy data. Several strategies are now available to tackle this inverse problem. In this work, we discuss how Backus–Gilbert methods, in particular the variation introduced by some of the authors, relate to the solution based on Gaussian Processes. Both methods yield spectral densities smeared with a kernel whose features depend on the details of the algorithm. We discuss this kernel and show how Backus–Gilbert methods can be understood in a Bayesian fashion. As a consequence of this correspondence, we are able to interpret the algorithmic parameters of Backus–Gilbert methods as hyperparameters in the Bayesian language, which can be chosen by maximising a likelihood function. By performing a comparative study on lattice data, we show that, when both frameworks are set up to compute the same quantity, the results are generally in agreement. Finally, we adopt a strategy to systematically validate both methodologies against pseudo-data, using covariance matrices measured from lattice simulations. In our setup, we find that the determination of the algorithmic parameters based on a stability analysis provides results that are, on average, more conservative than those based on the maximisation of a likelihood function.
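The Backus–Gilbert side of this correspondence can be made concrete with a short numerical sketch. In these methods the smeared spectral density at a reference energy E* is written as a linear combination of the correlator data, rho_smeared(E*) = sum_t g_t C(t), with the coefficients chosen so that sum_t g_t exp(-t E) approximates a target smearing kernel, while a covariance-dependent term keeps the statistical error under control. The code below is a minimal illustration of this idea under toy assumptions, not the authors' implementation: the names bg_coefficients and lambda_reg, the Gaussian target kernel, and the mock correlator are all choices made for this sketch, with lambda_reg standing in for the kind of algorithmic parameter discussed in the abstract.

```python
# Minimal sketch of a Backus-Gilbert-type smeared reconstruction (toy example;
# all names and parameter choices are illustrative, not the paper's code).
import numpy as np

def bg_coefficients(times, cov, e_star, sigma, lambda_reg, e_grid):
    """Coefficients g_t such that sum_t g_t * C(t) estimates the spectral
    density smeared with a Gaussian of width sigma centred at e_star."""
    de = e_grid[1] - e_grid[0]
    basis = np.exp(-np.outer(times, e_grid))           # b_t(E) = exp(-t E)
    target = np.exp(-0.5 * ((e_grid - e_star) / sigma) ** 2)
    target /= np.sqrt(2.0 * np.pi) * sigma             # normalised Gaussian kernel
    # "Systematic" part: distance between sum_t g_t b_t(E) and the target kernel.
    A = basis @ basis.T * de
    f = basis @ target * de
    # "Statistical" part: g^T Cov g, weighted against A by the trade-off parameter.
    W = (1.0 - lambda_reg) * A + lambda_reg * cov / cov[0, 0]
    return np.linalg.solve(W, (1.0 - lambda_reg) * f)

# Toy data: a correlator generated from a single narrow peak, plus uncorrelated noise.
rng = np.random.default_rng(0)
times = np.arange(1, 16)
e_grid = np.linspace(0.0, 5.0, 500)
de = e_grid[1] - e_grid[0]
true_rho = np.exp(-0.5 * ((e_grid - 1.2) / 0.1) ** 2)
corr_exact = np.exp(-np.outer(times, e_grid)) @ true_rho * de
cov = np.diag((1e-3 * corr_exact) ** 2)                # 0.1% diagonal errors
corr = corr_exact + rng.multivariate_normal(np.zeros(len(times)), cov)

g = bg_coefficients(times, cov, e_star=1.2, sigma=0.2, lambda_reg=0.1, e_grid=e_grid)
rho_smeared = g @ corr                                 # smeared estimate at E* = 1.2
err = np.sqrt(g @ cov @ g)                             # propagated statistical error
print(f"smeared spectral density at E*=1.2: {rho_smeared:.4f} +/- {err:.4f}")
```

Scanning lambda_reg and checking that the reconstructed value reaches a plateau is, in rough terms, the kind of stability analysis mentioned at the end of the abstract.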

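The Bayesian and Gaussian-process side can be sketched in the same toy setting. A Gaussian-process prior on the spectral density induces a Gaussian distribution for the correlator, whose covariance is the prior kernel pushed through the Laplace transform plus the data covariance, and the prior hyperparameters can then be selected by maximising the resulting marginal likelihood. The snippet below is only a minimal example of that selection step under simplifying assumptions: the squared-exponential prior kernel, the hypothetical amplitude and length hyperparameters, and the grid discretisation are choices of this sketch rather than the procedure of the paper.

```python
# Minimal sketch of hyperparameter selection by marginal-likelihood (evidence)
# maximisation for a Gaussian-process prior on the spectral density.  Again a
# toy example: kernel choice, hyperparameters and grids are assumptions of
# this sketch, not the procedure used in the paper.
import numpy as np
from scipy.optimize import minimize

def data_space_kernel(times, e_grid, amplitude, length):
    """Push a squared-exponential prior on rho(E) through the Laplace kernel:
    K_{t,t'} = int dE dE' exp(-t E) k(E, E') exp(-t' E')."""
    de = e_grid[1] - e_grid[0]
    basis = np.exp(-np.outer(times, e_grid))
    k_rho = amplitude ** 2 * np.exp(
        -0.5 * ((e_grid[:, None] - e_grid[None, :]) / length) ** 2)
    return basis @ k_rho @ basis.T * de ** 2

def neg_log_evidence(log_params, times, e_grid, corr, cov):
    """Negative Gaussian log marginal likelihood of the data (up to a constant)."""
    amplitude, length = np.exp(log_params)
    K = data_space_kernel(times, e_grid, amplitude, length) + cov
    K += 1e-10 * np.trace(K) / len(times) * np.eye(len(times))  # jitter for stability
    try:
        chol = np.linalg.cholesky(K)
    except np.linalg.LinAlgError:
        return np.inf                    # reject numerically singular proposals
    alpha = np.linalg.solve(chol.T, np.linalg.solve(chol, corr))
    return 0.5 * corr @ alpha + np.sum(np.log(np.diag(chol)))

# Same toy data as in the previous sketch.
rng = np.random.default_rng(0)
times = np.arange(1, 16)
e_grid = np.linspace(0.0, 5.0, 500)
de = e_grid[1] - e_grid[0]
true_rho = np.exp(-0.5 * ((e_grid - 1.2) / 0.1) ** 2)
corr_exact = np.exp(-np.outer(times, e_grid)) @ true_rho * de
cov = np.diag((1e-3 * corr_exact) ** 2)
corr = corr_exact + rng.multivariate_normal(np.zeros(len(times)), cov)

res = minimize(neg_log_evidence, x0=np.log([1.0, 0.5]),
               args=(times, e_grid, corr, cov), method="Nelder-Mead")
amp_opt, len_opt = np.exp(res.x)
print(f"evidence-maximising hyperparameters: amplitude = {amp_opt:.3g}, "
      f"correlation length = {len_opt:.3g}")
```

In the correspondence discussed in the abstract, hyperparameters of this kind play the role of the Backus–Gilbert algorithmic parameters, which is what allows the two selection criteria, evidence maximisation and stability analysis, to be compared on the same quantity.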
Source journal
The European Physical Journal C (Physics: Particles & Fields)
CiteScore: 8.10
Self-citation rate: 15.90%
Articles per year: 1008
Review time: 2-4 weeks
Journal scope:
Experimental Physics I: Accelerator Based High-Energy Physics. Hadron and lepton collider physics; lepton-nucleon scattering; high-energy nuclear reactions; standard model precision tests; search for new physics beyond the standard model; heavy flavour physics; neutrino properties; particle detector developments; computational methods and analysis tools.
Experimental Physics II: Astroparticle Physics. Dark matter searches; high-energy cosmic rays; double beta decay; long baseline neutrino experiments; neutrino astronomy; axions and other weakly interacting light particles; gravitational waves and observational cosmology; particle detector developments; computational methods and analysis tools.
Theoretical Physics I: Phenomenology of the Standard Model and Beyond. Electroweak interactions; quantum chromodynamics; heavy quark physics and quark flavour mixing; neutrino physics; phenomenology of astro- and cosmoparticle physics; meson spectroscopy and non-perturbative QCD; low-energy effective field theories; lattice field theory; high temperature QCD and heavy ion physics; phenomenology of supersymmetric extensions of the SM; phenomenology of non-supersymmetric extensions of the SM; model building and alternative models of electroweak symmetry breaking; flavour physics beyond the SM; computational algorithms and tools; ...etc.