Lifting the curse of dimensionality: a random matrix-theoretic approach

T. Marzetta
{"title":"解除维数的诅咒:一种随机矩阵理论方法","authors":"T. Marzetta","doi":"10.1109/WIOPT.2009.5291553","DOIUrl":null,"url":null,"abstract":"The ubiquity of inexpensive sensors implies that we can measure vector-valued data of ever increasing dimension. But the number of independent measurements of the data vector is limited so the sample covariance matrix is usually singular. The traditional remedy for singularity is diagonal loading - the addition of a small identity matrix to make the covariance estimate invertible. An alternative to diagonal loading is to reduce the dimension of the data vectors to be smaller than the number of independent observations through an ensemble of isotropically random (Haar measure) unitary matrices. For every member of the unitary ensemble, the shortened data vectors yield a statistically meaningful, invertible covariance estimate from which we can compute an estimate for the ultimate desired quantity. The final step is to take the expectation of this estimate with respect to the unitary ensemble. For a class of applications that includes adaptive spectral estimation, the design of a linear estimator, and supervised learning the random matrix approach results in an estimate for the inverse covariance matrix which preserves the eigenvectors of the sample covariance matrix, but alters the eigenvalues in a nontrivial manner. A closed-form expression for the expectation over the unitary ensemble eludes us, but we have obtained a tractable asymptotic expression. Preliminary numerical results indicate considerable promise for this approach.","PeriodicalId":6630,"journal":{"name":"2017 15th International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt)","volume":"70 3 1","pages":"1-2"},"PeriodicalIF":0.0000,"publicationDate":"2009-06-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Lifting the curse of dimensionality: a random matrix-theoretic approach\",\"authors\":\"T. Marzetta\",\"doi\":\"10.1109/WIOPT.2009.5291553\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The ubiquity of inexpensive sensors implies that we can measure vector-valued data of ever increasing dimension. But the number of independent measurements of the data vector is limited so the sample covariance matrix is usually singular. The traditional remedy for singularity is diagonal loading - the addition of a small identity matrix to make the covariance estimate invertible. An alternative to diagonal loading is to reduce the dimension of the data vectors to be smaller than the number of independent observations through an ensemble of isotropically random (Haar measure) unitary matrices. For every member of the unitary ensemble, the shortened data vectors yield a statistically meaningful, invertible covariance estimate from which we can compute an estimate for the ultimate desired quantity. The final step is to take the expectation of this estimate with respect to the unitary ensemble. For a class of applications that includes adaptive spectral estimation, the design of a linear estimator, and supervised learning the random matrix approach results in an estimate for the inverse covariance matrix which preserves the eigenvectors of the sample covariance matrix, but alters the eigenvalues in a nontrivial manner. A closed-form expression for the expectation over the unitary ensemble eludes us, but we have obtained a tractable asymptotic expression. 
Preliminary numerical results indicate considerable promise for this approach.\",\"PeriodicalId\":6630,\"journal\":{\"name\":\"2017 15th International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt)\",\"volume\":\"70 3 1\",\"pages\":\"1-2\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2009-06-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 15th International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/WIOPT.2009.5291553\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 15th International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WIOPT.2009.5291553","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

The ubiquity of inexpensive sensors implies that we can measure vector-valued data of ever-increasing dimension. But the number of independent measurements of the data vector is limited, so the sample covariance matrix is usually singular. The traditional remedy for singularity is diagonal loading: the addition of a small identity matrix to make the covariance estimate invertible. An alternative to diagonal loading is to reduce the dimension of the data vectors to be smaller than the number of independent observations through an ensemble of isotropically random (Haar measure) unitary matrices. For every member of the unitary ensemble, the shortened data vectors yield a statistically meaningful, invertible covariance estimate from which we can compute an estimate for the ultimate desired quantity. The final step is to take the expectation of this estimate with respect to the unitary ensemble. For a class of applications that includes adaptive spectral estimation, the design of a linear estimator, and supervised learning, the random matrix approach results in an estimate for the inverse covariance matrix which preserves the eigenvectors of the sample covariance matrix but alters the eigenvalues in a nontrivial manner. A closed-form expression for the expectation over the unitary ensemble eludes us, but we have obtained a tractable asymptotic expression. Preliminary numerical results indicate considerable promise for this approach.
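
The abstract describes the procedure only at a high level, so the following is a minimal NumPy sketch of the idea: project the data with the first k rows of a Haar-random unitary matrix, invert the resulting k-by-k sample covariance, lift the inverse back to the full dimension, and average over the ensemble. The function names (haar_unitary, ensemble_inverse_covariance), the number of draws, and the use of a Monte Carlo average in place of the paper's asymptotic expression are illustrative assumptions, not the author's implementation.

    import numpy as np

    def haar_unitary(n, rng):
        # Draw an n x n unitary matrix from the Haar measure via QR of a complex Gaussian,
        # rescaling each column so the corresponding diagonal entry of R is positive.
        z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2.0)
        q, r = np.linalg.qr(z)
        d = np.diag(r)
        return q * (d / np.abs(d))

    def ensemble_inverse_covariance(X, k, num_draws=200, seed=0):
        # X: (n, m) array holding m independent observations of an n-dimensional data vector.
        # k: reduced dimension, chosen smaller than m so each reduced sample covariance is invertible.
        # Returns a Monte Carlo average over the unitary ensemble of the lifted inverse covariances;
        # this average is an illustrative stand-in for the expectation discussed in the abstract.
        n, m = X.shape
        rng = np.random.default_rng(seed)
        acc = np.zeros((n, n), dtype=complex)
        for _ in range(num_draws):
            P = haar_unitary(n, rng)[:k, :]            # k x n isotropically random projection
            Y = P @ X                                  # shortened data vectors
            C = (Y @ Y.conj().T) / m                   # invertible k x k sample covariance
            acc += P.conj().T @ np.linalg.inv(C) @ P   # lift the k x k inverse back to n x n
        return acc / num_draws

    # Example: n = 100 dimensions but only m = 20 observations, reduced to k = 10.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 20)) + 1j * rng.standard_normal((100, 20))
    R_inv_est = ensemble_inverse_covariance(X, k=10)

Because the projection is isotropically random, the ensemble average is diagonal in the eigenbasis of the sample covariance X X^H / m, which is consistent with the abstract's claim that the eigenvectors are preserved while the eigenvalues are altered.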