{"title":"基于协作学习字典的压缩感知","authors":"Kai Guo, Xijun Liang, Weizhi Lu","doi":"10.1109/ISPA52656.2021.9552065","DOIUrl":null,"url":null,"abstract":"In compressed sensing, the recovery error of a high dimensional signal can be approximately modeled by a multivariate Gaussian distribution N (µ, σ2I). The mean vector µ has its zero and nonzero elements correspond respectively to small dense errors caused by system noise, and large sparse errors caused by discarding relatively small coefficients in sparse recovery. To suppress small errors with zero mean, one major solution is to average the recovery results of multiple dictionaries. This will linearly decrease the error's variance σ2, and then enable the error taking zero value with high probability. Unfortunately, the averaging method cannot promise to decrease large errors with nonzero means. Moreover, in practice, large errors of distinct dictionaries tend to occur at the same coordinates with the same value signs, because the dictionaries learned independently tend to converge to the points close to each other and thus yield similar large errors in sparse recovery. This property prevents large errors from being decreased by average. In the paper, we prove that the average performance could be improved, if large errors of distinct dictionaries have disjoint supports. To obtain such dictionaries, we propose a collaborative dictionary learning model, which is implemented with a block coordinate decent method. The resulting dictionaries present desired experimental performance. A full version of the paper is accessible at https://drive.google.com/file/d/1_wy455PuKit1yf6QmXJxt81Y-ZZ5gq0s/view?usp=sharing","PeriodicalId":131088,"journal":{"name":"2021 12th International Symposium on Image and Signal Processing and Analysis (ISPA)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Compressed Sensing via Collaboratively Learned Dictionaries\",\"authors\":\"Kai Guo, Xijun Liang, Weizhi Lu\",\"doi\":\"10.1109/ISPA52656.2021.9552065\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In compressed sensing, the recovery error of a high dimensional signal can be approximately modeled by a multivariate Gaussian distribution N (µ, σ2I). The mean vector µ has its zero and nonzero elements correspond respectively to small dense errors caused by system noise, and large sparse errors caused by discarding relatively small coefficients in sparse recovery. To suppress small errors with zero mean, one major solution is to average the recovery results of multiple dictionaries. This will linearly decrease the error's variance σ2, and then enable the error taking zero value with high probability. Unfortunately, the averaging method cannot promise to decrease large errors with nonzero means. Moreover, in practice, large errors of distinct dictionaries tend to occur at the same coordinates with the same value signs, because the dictionaries learned independently tend to converge to the points close to each other and thus yield similar large errors in sparse recovery. This property prevents large errors from being decreased by average. In the paper, we prove that the average performance could be improved, if large errors of distinct dictionaries have disjoint supports. 
To obtain such dictionaries, we propose a collaborative dictionary learning model, which is implemented with a block coordinate decent method. The resulting dictionaries present desired experimental performance. A full version of the paper is accessible at https://drive.google.com/file/d/1_wy455PuKit1yf6QmXJxt81Y-ZZ5gq0s/view?usp=sharing\",\"PeriodicalId\":131088,\"journal\":{\"name\":\"2021 12th International Symposium on Image and Signal Processing and Analysis (ISPA)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-09-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 12th International Symposium on Image and Signal Processing and Analysis (ISPA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISPA52656.2021.9552065\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 12th International Symposium on Image and Signal Processing and Analysis (ISPA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISPA52656.2021.9552065","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Compressed Sensing via Collaboratively Learned Dictionaries
In compressed sensing, the recovery error of a high-dimensional signal can be approximately modeled by a multivariate Gaussian distribution N(µ, σ²I). The zero and nonzero elements of the mean vector µ correspond, respectively, to small dense errors caused by system noise and to large sparse errors caused by discarding relatively small coefficients during sparse recovery. To suppress the small zero-mean errors, one major approach is to average the recovery results of multiple dictionaries. Averaging decreases the error variance σ² linearly with the number of dictionaries, so that the error takes values near zero with high probability. Unfortunately, averaging cannot be guaranteed to reduce the large errors with nonzero means. Moreover, in practice the large errors of distinct dictionaries tend to occur at the same coordinates with the same signs, because independently learned dictionaries tend to converge to points close to each other and thus yield similar large errors in sparse recovery. This prevents the large errors from being reduced by averaging. In this paper, we prove that the performance of averaging can be improved if the large errors of distinct dictionaries have disjoint supports. To obtain such dictionaries, we propose a collaborative dictionary learning model, which is implemented with a block coordinate descent method. The resulting dictionaries achieve the desired experimental performance. A full version of the paper is accessible at https://drive.google.com/file/d/1_wy455PuKit1yf6QmXJxt81Y-ZZ5gq0s/view?usp=sharing
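As a minimal, hedged illustration of the averaging argument in the abstract, the Python sketch below reconstructs a signal from compressed measurements using several different dictionaries and averages the reconstructions. The random dictionaries, the problem sizes, and the plain orthogonal matching pursuit (OMP) solver are illustrative assumptions standing in for the collaboratively learned dictionaries; this is not the learning model proposed in the paper.

```python
# Sketch (illustrative assumptions, not the authors' algorithm): each per-dictionary
# reconstruction is roughly x + e_k, and averaging K reconstructions shrinks the
# zero-mean part of the error; the benefit persists only when the large error
# components of the dictionaries do not coincide.
import numpy as np

rng = np.random.default_rng(0)

def omp(A, y, k):
    """Greedy OMP: find a k-sparse c with y ~= A @ c."""
    residual, support = y.copy(), []
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    c = np.zeros(A.shape[1])
    c[support] = coef
    return c

n, m, p, k, K = 64, 32, 128, 5, 8                 # signal dim, measurements, atoms, sparsity, #dictionaries
x = rng.standard_normal(n)                        # test signal
Phi = rng.standard_normal((m, n)) / np.sqrt(m)    # sensing matrix
y = Phi @ x + 0.01 * rng.standard_normal(m)       # noisy compressed measurements

# Random dictionaries stand in here for K learned dictionaries.
recoveries = []
for _ in range(K):
    D = rng.standard_normal((n, p))
    D = D / np.linalg.norm(D, axis=0)             # unit-norm atoms
    c = omp(Phi @ D, y, k)                        # sparse coding in the effective dictionary Phi @ D
    recoveries.append(D @ c)                      # reconstruction of x from this dictionary

single_err = np.mean([np.linalg.norm(r - x) for r in recoveries])
avg_err = np.linalg.norm(np.mean(recoveries, axis=0) - x)
print(f"mean single-dictionary error: {single_err:.3f}, averaged error: {avg_err:.3f}")
```

Because the per-dictionary errors in this toy setup are largely uncorrelated, the averaged reconstruction error is typically noticeably smaller than the single-dictionary error; this is the effect the paper's collaborative learning model aims to enforce for learned dictionaries by driving their large errors onto disjoint supports.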