Tuning of Prior Covariance in Generalized Least Squares

Impact Factor: 1.0 · Zone 4 (Mathematics)
W. Menke
{"title":"广义最小二乘中先验协方差的调整","authors":"W. Menke","doi":"10.4236/AM.2021.123011","DOIUrl":null,"url":null,"abstract":"Generalized Least Squares (least squares with prior information) requires the correct assignment of two prior covariance matrices: one associated with the uncertainty of measurements; the other with the uncertainty of prior information. These assignments often are very subjective, especially when correlations among data or among prior information are believed to occur. However, in cases in which the general form of these matrices can be anticipated up to a set of poorly-known parameters, the data and prior information may be used to better-determine (or “tune”) the parameters in a manner that is faithful to the underlying Bayesian foundation of GLS. We identify an objective function, the minimization of which leads to the best-estimate of the parameters and provide explicit and computationally-efficient formula for calculating the derivatives needed to implement the minimization with a gradient descent method. Furthermore, the problem is organized so that the minimization need be performed only over the space of covariance parameters, and not over the combined space of model and covariance parameters. We show that the use of trade-off curves to select the relative weight given to observations and prior information is not a form of tuning, because it does not, in general maximize the posterior probability of the model parameters, and can lead to a different weighting than the procedure described here. We also provide several examples that demonstrate the viability, and discuss both the advantages and limitations of the method.","PeriodicalId":55568,"journal":{"name":"Applied Mathematics-A Journal of Chinese Universities Series B","volume":null,"pages":null},"PeriodicalIF":1.0000,"publicationDate":"2021-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Tuning of Prior Covariance in Generalized Least Squares\",\"authors\":\"W. Menke\",\"doi\":\"10.4236/AM.2021.123011\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Generalized Least Squares (least squares with prior information) requires the correct assignment of two prior covariance matrices: one associated with the uncertainty of measurements; the other with the uncertainty of prior information. These assignments often are very subjective, especially when correlations among data or among prior information are believed to occur. However, in cases in which the general form of these matrices can be anticipated up to a set of poorly-known parameters, the data and prior information may be used to better-determine (or “tune”) the parameters in a manner that is faithful to the underlying Bayesian foundation of GLS. We identify an objective function, the minimization of which leads to the best-estimate of the parameters and provide explicit and computationally-efficient formula for calculating the derivatives needed to implement the minimization with a gradient descent method. Furthermore, the problem is organized so that the minimization need be performed only over the space of covariance parameters, and not over the combined space of model and covariance parameters. 
We show that the use of trade-off curves to select the relative weight given to observations and prior information is not a form of tuning, because it does not, in general maximize the posterior probability of the model parameters, and can lead to a different weighting than the procedure described here. We also provide several examples that demonstrate the viability, and discuss both the advantages and limitations of the method.\",\"PeriodicalId\":55568,\"journal\":{\"name\":\"Applied Mathematics-A Journal of Chinese Universities Series B\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.0000,\"publicationDate\":\"2021-03-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Mathematics-A Journal of Chinese Universities Series B\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.4236/AM.2021.123011\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Mathematics-A Journal of Chinese Universities Series B","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.4236/AM.2021.123011","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3

Abstract

Generalized Least Squares (least squares with prior information) requires the correct assignment of two prior covariance matrices: one associated with the uncertainty of the measurements, the other with the uncertainty of the prior information. These assignments are often very subjective, especially when correlations among the data or among the prior information are believed to occur. However, in cases in which the general form of these matrices can be anticipated up to a set of poorly known parameters, the data and prior information may be used to better determine (or “tune”) those parameters in a manner that is faithful to the underlying Bayesian foundation of GLS. We identify an objective function whose minimization leads to the best estimate of the parameters, and we provide explicit and computationally efficient formulas for calculating the derivatives needed to implement the minimization with a gradient descent method. Furthermore, the problem is organized so that the minimization need be performed only over the space of covariance parameters, and not over the combined space of model and covariance parameters. We show that the use of trade-off curves to select the relative weight given to observations and prior information is not a form of tuning, because it does not, in general, maximize the posterior probability of the model parameters, and it can lead to a different weighting than the procedure described here. We also provide several examples that demonstrate the viability of the method, and we discuss both its advantages and its limitations.
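To make the setup concrete, here is a minimal numerical sketch of GLS with prior information and of the "tuning" idea. It is illustrative only: the linear problem, the prior, and the single covariance scale parameter q are hypothetical, and the tuning objective used below is the standard Gaussian negative log marginal likelihood, which follows the Bayesian reasoning of the abstract but may differ in detail from the objective function derived in the paper; the paper's analytic derivatives and gradient descent are replaced here by a bracketed one-dimensional search.

```python
# Sketch of Generalized Least Squares (least squares with prior information)
# with a simple covariance-parameter "tuning" loop. Illustrative only: the
# data kernel G, data d, prior m_prior, and the scale parameter q are all
# hypothetical, and the tuning objective is the standard Gaussian negative
# log marginal likelihood, not necessarily the paper's objective function.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Hypothetical linear problem d = G m + noise.
M, N = 10, 50                      # model parameters, observations
G = rng.standard_normal((N, M))    # data kernel
m_true = np.linspace(0.0, 1.0, M)  # "true" model, used to make synthetic data
sigma_d = 0.1                      # assumed measurement standard deviation
d = G @ m_true + sigma_d * rng.standard_normal(N)

m_prior = np.zeros(M)              # prior mean of the model parameters
C_d = sigma_d**2 * np.eye(N)       # prior data covariance (known here)

def C_p(q):
    """Prior model covariance with one poorly known scale parameter q."""
    return q**2 * np.eye(M)

def gls_estimate(q):
    """Closed-form GLS solution minimizing
    (d - G m)^T C_d^{-1} (d - G m) + (m - m_prior)^T C_p^{-1} (m - m_prior)."""
    A = G.T @ np.linalg.solve(C_d, G) + np.linalg.inv(C_p(q))
    b = G.T @ np.linalg.solve(C_d, d - G @ m_prior)
    return m_prior + np.linalg.solve(A, b)

def neg_log_evidence(log_q):
    """Negative log marginal likelihood of d under
    d ~ N(G m_prior, G C_p G^T + C_d); used here as the tuning objective."""
    q = np.exp(log_q)
    C = G @ C_p(q) @ G.T + C_d
    r = d - G @ m_prior
    sign, logdet = np.linalg.slogdet(C)
    return 0.5 * (logdet + r @ np.linalg.solve(C, r))

# Tune q by a bounded 1-D search over log(q).
res = minimize_scalar(neg_log_evidence, bounds=(-5.0, 5.0), method="bounded")
q_best = np.exp(res.x)
m_est = gls_estimate(q_best)
print(f"tuned q = {q_best:.3f}, rms model error = "
      f"{np.sqrt(np.mean((m_est - m_true)**2)):.3f}")
```

Note that gls_estimate solves for the model parameters in closed form, so the search runs only over the covariance parameter q; this mirrors the abstract's point that the minimization need be performed only over the space of covariance parameters, not over the combined space of model and covariance parameters.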
Source Journal
Self-citation rate: 10.00%
Articles published per year: 33
Journal description: Applied Mathematics promotes the integration of mathematics with other scientific disciplines, expanding its fields of study and promoting the development of relevant interdisciplinary subjects. The journal mainly publishes original research papers that apply mathematical concepts, theories, and methods to other subjects such as physics, chemistry, biology, information science, energy, environmental science, economics, and finance. In addition, it reports the latest developments and trends in the interaction of mathematics with other disciplines. Readers include professors and students, professionals in applied mathematics, and engineers at research institutes and in industry. Applied Mathematics - A Journal of Chinese Universities has been an English-language quarterly since 1993. The English edition, abbreviated as Series B, differs in content from the Chinese edition, Series A.