Numerical algorithm for estimating a conditioned symmetric positive definite matrix under constraints

IF 1.8 · CAS Tier 3 (Mathematics) · JCR Q1 (MATHEMATICS)
Oren E. Livne, Katherine E. Castellano, Dan F. McCaffrey
{"title":"在约束条件下估计有条件对称正定矩阵的数值算法","authors":"Oren E. Livne, Katherine E. Castellano, Dan F. McCaffrey","doi":"10.1002/nla.2559","DOIUrl":null,"url":null,"abstract":"SummaryWe present RCO (regularized Cholesky optimization): a numerical algorithm for finding a symmetric positive definite (PD) matrix with a bounded condition number that minimizes an objective function. This task arises when estimating a covariance matrix from noisy data or due to model constraints, which can cause spurious small negative eigenvalues. A special case is the problem of finding the nearest well‐conditioned PD matrix to a given matrix. RCO explicitly optimizes the entries of the Cholesky factor. This requires solving a regularized non‐linear, non‐convex optimization problem, for which we apply Newton‐CG and exploit the Hessian's sparsity. The regularization parameter is determined via numerical continuation with an accuracy‐conditioning trade‐off criterion. We apply RCO to our motivating educational measurement application of estimating the covariance matrix of an empirical best linear prediction (EBLP) of school growth scores. We present numerical results for two empirical datasets, state and urban. RCO outperforms general‐purpose near‐PD algorithms, obtaining ‐smaller matrix reconstruction bias and smaller EBLP estimator mean‐squared error. It is in fact the only algorithm that solves the right minimization problem, which strikes a balance between the objective function and the condition number. RCO can be similarly applied to the stable estimation of other posterior means. For the task of finding the nearest PD matrix, RCO yields similar condition numbers to near‐PD methods, but provides a better overall near‐null space.","PeriodicalId":49731,"journal":{"name":"Numerical Linear Algebra with Applications","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2024-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Numerical algorithm for estimating a conditioned symmetric positive definite matrix under constraints\",\"authors\":\"Oren E. Livne, Katherine E. Castellano, Dan F. McCaffrey\",\"doi\":\"10.1002/nla.2559\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"SummaryWe present RCO (regularized Cholesky optimization): a numerical algorithm for finding a symmetric positive definite (PD) matrix with a bounded condition number that minimizes an objective function. This task arises when estimating a covariance matrix from noisy data or due to model constraints, which can cause spurious small negative eigenvalues. A special case is the problem of finding the nearest well‐conditioned PD matrix to a given matrix. RCO explicitly optimizes the entries of the Cholesky factor. This requires solving a regularized non‐linear, non‐convex optimization problem, for which we apply Newton‐CG and exploit the Hessian's sparsity. The regularization parameter is determined via numerical continuation with an accuracy‐conditioning trade‐off criterion. We apply RCO to our motivating educational measurement application of estimating the covariance matrix of an empirical best linear prediction (EBLP) of school growth scores. We present numerical results for two empirical datasets, state and urban. RCO outperforms general‐purpose near‐PD algorithms, obtaining ‐smaller matrix reconstruction bias and smaller EBLP estimator mean‐squared error. 
It is in fact the only algorithm that solves the right minimization problem, which strikes a balance between the objective function and the condition number. RCO can be similarly applied to the stable estimation of other posterior means. For the task of finding the nearest PD matrix, RCO yields similar condition numbers to near‐PD methods, but provides a better overall near‐null space.\",\"PeriodicalId\":49731,\"journal\":{\"name\":\"Numerical Linear Algebra with Applications\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2024-05-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Numerical Linear Algebra with Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1002/nla.2559\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Numerical Linear Algebra with Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1002/nla.2559","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0

Abstract

We present RCO (regularized Cholesky optimization): a numerical algorithm for finding a symmetric positive definite (PD) matrix with a bounded condition number that minimizes an objective function. This task arises when estimating a covariance matrix from noisy data or under model constraints, which can cause spurious small negative eigenvalues. A special case is the problem of finding the nearest well-conditioned PD matrix to a given matrix. RCO explicitly optimizes the entries of the Cholesky factor. This requires solving a regularized non-linear, non-convex optimization problem, for which we apply Newton-CG and exploit the Hessian's sparsity. The regularization parameter is determined via numerical continuation with an accuracy-conditioning trade-off criterion. We apply RCO to our motivating educational measurement application of estimating the covariance matrix of an empirical best linear prediction (EBLP) of school growth scores. We present numerical results for two empirical datasets, state and urban. RCO outperforms general-purpose near-PD algorithms, obtaining smaller matrix reconstruction bias and smaller EBLP estimator mean-squared error. It is in fact the only algorithm that solves the right minimization problem, which strikes a balance between the objective function and the condition number. RCO can be similarly applied to the stable estimation of other posterior means. For the task of finding the nearest PD matrix, RCO yields similar condition numbers to near-PD methods, but provides a better overall near-null space.
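The core idea in the abstract is to parameterize the estimate as X = L Lᵀ and optimize the entries of the lower-triangular Cholesky factor L, with a regularization term controlling conditioning. The sketch below illustrates this idea for the special case of finding a near-PD matrix close to a given symmetric A, using SciPy's Newton-CG. It is a minimal illustration, not the paper's RCO implementation: the function name `nearest_pd_sketch`, the barrier penalty `mu * sum(1/L[i,i]**2)`, and the fixed weight `mu` are illustrative assumptions, whereas RCO uses its own regularizer and selects the weight by numerical continuation with an accuracy-conditioning trade-off.

```python
import numpy as np
from scipy.optimize import minimize


def nearest_pd_sketch(A, mu=1e-3):
    """Toy Cholesky-factor optimization for the nearest-PD problem.

    Parameterizes X = L @ L.T with L lower triangular and minimizes
    ||X - A||_F^2 + mu * sum_i 1/L[i, i]^2.  The barrier term keeps the
    diagonal of L away from zero, a crude proxy for bounding cond(X);
    it stands in for RCO's regularizer, which is not reproduced here.
    """
    n = A.shape[0]
    tril = np.tril_indices(n)

    def unpack(x):
        # Rebuild the lower-triangular factor from the free variables.
        L = np.zeros((n, n))
        L[tril] = x
        return L

    def objective(x):
        L = unpack(x)
        R = L @ L.T - A
        return np.sum(R * R) + mu * np.sum(1.0 / np.diag(L) ** 2)

    def gradient(x):
        L = unpack(x)
        # For symmetric A: d/dL ||L L^T - A||_F^2 = 4 (L L^T - A) L.
        G = 4.0 * (L @ L.T - A) @ L
        # Barrier term: d/dL_ii of mu / L_ii^2 is -2 mu / L_ii^3.
        G[np.diag_indices(n)] -= 2.0 * mu / np.diag(L) ** 3
        return G[tril]

    # Warm start: Cholesky factor of A shifted just enough to be safely PD.
    shift = max(0.0, 1e-3 - np.linalg.eigvalsh(A).min())
    L0 = np.linalg.cholesky(A + shift * np.eye(n))
    # Without an explicit Hessian, SciPy's Newton-CG approximates
    # Hessian-vector products by finite differences of the gradient.
    res = minimize(objective, L0[tril], jac=gradient, method="Newton-CG")
    L = unpack(res.x)
    return L @ L.T


# Example: repair a symmetric matrix with a spurious small negative eigenvalue.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2
A -= (np.linalg.eigvalsh(A).min() + 1e-4) * np.eye(5)  # smallest eigenvalue is now -1e-4
X = nearest_pd_sketch(A)
print(np.linalg.eigvalsh(X).min(), np.linalg.cond(X))
```

Optimizing the Cholesky factor directly guarantees the returned X = L Lᵀ is at least positive semidefinite by construction, and the barrier on diag(L) pushes it strictly PD; sweeping `mu` over a decreasing sequence with warm starts would mimic, very loosely, the continuation strategy the abstract describes.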
Source journal: Numerical Linear Algebra with Applications
CiteScore: 3.40
Self-citation rate: 2.30%
Articles per year: 50
Review time: 12 months
Journal description: Manuscripts submitted to Numerical Linear Algebra with Applications should include large-scale broad-interest applications in which challenging computational results are integral to the approach investigated and analysed. Manuscripts that, in the Editor's view, do not satisfy these conditions will not be accepted for review. Numerical Linear Algebra with Applications receives submissions in areas that address developing, analysing and applying linear algebra algorithms for solving problems arising in multilinear (tensor) algebra, in statistics, such as Markov Chains, as well as in deterministic and stochastic modelling of large-scale networks, algorithm development, performance analysis or related computational aspects. Topics covered include: Standard and Generalized Conjugate Gradients, Multigrid and Other Iterative Methods; Preconditioning Methods; Direct Solution Methods; Numerical Methods for Eigenproblems; Newton-like Methods for Nonlinear Equations; Parallel and Vectorizable Algorithms in Numerical Linear Algebra; Application of Methods of Numerical Linear Algebra in Science, Engineering and Economics.