Compressed Sensing via Collaboratively Learned Dictionaries

Kai Guo, Xijun Liang, Weizhi Lu
DOI: 10.1109/ISPA52656.2021.9552065
Published in: 2021 12th International Symposium on Image and Signal Processing and Analysis (ISPA), 2021-09-13
Citations: 0

Abstract

In compressed sensing, the recovery error of a high-dimensional signal can be approximately modeled by a multivariate Gaussian distribution N(µ, σ²I). The zero and nonzero elements of the mean vector µ correspond, respectively, to small dense errors caused by system noise and to large sparse errors caused by discarding relatively small coefficients during sparse recovery. To suppress the small zero-mean errors, one common solution is to average the recovery results of multiple dictionaries. Averaging K results decreases the error variance σ² linearly (to σ²/K), so the error takes values near zero with high probability. Unfortunately, averaging cannot be guaranteed to reduce the large errors, which have nonzero means. Moreover, in practice the large errors of distinct dictionaries tend to occur at the same coordinates with the same signs, because independently learned dictionaries tend to converge to nearby points and thus yield similar large errors in sparse recovery. This property prevents the large errors from being reduced by averaging. In this paper, we prove that the average performance can be improved if the large errors of distinct dictionaries have disjoint supports. To obtain such dictionaries, we propose a collaborative dictionary learning model, implemented with a block coordinate descent method. The resulting dictionaries show the desired experimental performance. A full version of the paper is accessible at https://drive.google.com/file/d/1_wy455PuKit1yf6QmXJxt81Y-ZZ5gq0s/view?usp=sharing
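The abstract's two claims about averaging can be illustrated with a small synthetic simulation: averaging shrinks the zero-mean dense errors by roughly 1/K in variance, but leaves shared large errors untouched, whereas large errors with disjoint supports shrink by a factor of K. The sketch below (NumPy) uses purely illustrative parameter values, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 1000, 4          # signal dimension, number of dictionaries
sigma = 0.1             # std of small dense (zero-mean) errors
big = 5.0               # magnitude of large sparse errors
s = 20                  # number of large-error coordinates per dictionary

# Case A: large errors of all dictionaries share the same support and signs
# (what tends to happen when dictionaries are learned independently).
shared_support = rng.choice(n, size=s, replace=False)
errs_shared = [sigma * rng.standard_normal(n) for _ in range(K)]
for e in errs_shared:
    e[shared_support] += big
avg_shared = np.mean(errs_shared, axis=0)

# Case B: large errors have pairwise disjoint supports
# (the property the collaborative learning model aims for).
perm = rng.permutation(n)
errs_disjoint = [sigma * rng.standard_normal(n) for _ in range(K)]
for k, e in enumerate(errs_disjoint):
    e[perm[k * s:(k + 1) * s]] += big
avg_disjoint = np.mean(errs_disjoint, axis=0)

# Averaging cuts the dense-error variance by roughly a factor of K ...
dense = np.setdiff1d(np.arange(n), shared_support)
var_single = np.var(errs_shared[0][dense])
var_avg = np.var(avg_shared[dense])

# ... but shared large errors survive at full magnitude (about `big`),
# while disjoint large errors shrink to about big/K after averaging.
peak_shared = np.abs(avg_shared[shared_support]).mean()
peak_disjoint = np.abs(avg_disjoint[perm[:K * s]]).mean()

print(f"dense-error variance: single {var_single:.4f}, averaged {var_avg:.4f}")
print(f"mean large-error magnitude: shared {peak_shared:.2f}, disjoint {peak_disjoint:.2f}")
```

With these settings the averaged dense-error variance drops near σ²/K, the shared large errors stay near their original magnitude, and the disjoint large errors drop by roughly a factor of K, matching the abstract's argument.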