{"title":"基于亚线性稀疏压缩感知的广义近似消息传递","authors":"Keigo Takeuchi","doi":"10.1109/TIT.2025.3560070","DOIUrl":null,"url":null,"abstract":"This paper addresses the reconstruction of an unknown signal vector with sublinear sparsity from generalized linear measurements. Generalized approximate message-passing (GAMP) is proposed via state evolution in the sublinear sparsity limit, where the signal dimension <italic>N</i>, measurement dimension <italic>M</i>, and signal sparsity <italic>k</i> satisfy <inline-formula> <tex-math>$\\log k/\\log N\\to \\gamma \\in [0, 1$ </tex-math></inline-formula>) and <inline-formula> <tex-math>$M/\\{k\\log (N/k)\\}\\to \\delta $ </tex-math></inline-formula> as <italic>N</i> and <italic>k</i> tend to infinity. While the overall flow in state evolution is the same as that for linear sparsity, each proof step for inner denoising requires stronger assumptions than those for linear sparsity. The required new assumptions are proved for Bayesian inner denoising. When Bayesian outer and inner denoisers are used in GAMP, the obtained state evolution recursion is utilized to evaluate the prefactor <inline-formula> <tex-math>$\\delta $ </tex-math></inline-formula> in the sample complexity, called reconstruction threshold. If and only if <inline-formula> <tex-math>$\\delta $ </tex-math></inline-formula> is larger than the reconstruction threshold, Bayesian GAMP can achieve asymptotically exact signal reconstruction. In particular, the reconstruction threshold is finite for noisy linear measurements when the support of non-zero signal elements does not include a neighborhood of zero. As numerical examples, this paper considers linear measurements and 1-bit compressed sensing. Numerical simulations for both cases show that Bayesian GAMP outperforms existing algorithms for sublinear sparsity in terms of the sample complexity.","PeriodicalId":13494,"journal":{"name":"IEEE Transactions on Information Theory","volume":"71 6","pages":"4602-4636"},"PeriodicalIF":2.2000,"publicationDate":"2025-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10963836","citationCount":"0","resultStr":"{\"title\":\"Generalized Approximate Message-Passing for Compressed Sensing With Sublinear Sparsity\",\"authors\":\"Keigo Takeuchi\",\"doi\":\"10.1109/TIT.2025.3560070\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper addresses the reconstruction of an unknown signal vector with sublinear sparsity from generalized linear measurements. Generalized approximate message-passing (GAMP) is proposed via state evolution in the sublinear sparsity limit, where the signal dimension <italic>N</i>, measurement dimension <italic>M</i>, and signal sparsity <italic>k</i> satisfy <inline-formula> <tex-math>$\\\\log k/\\\\log N\\\\to \\\\gamma \\\\in [0, 1$ </tex-math></inline-formula>) and <inline-formula> <tex-math>$M/\\\\{k\\\\log (N/k)\\\\}\\\\to \\\\delta $ </tex-math></inline-formula> as <italic>N</i> and <italic>k</i> tend to infinity. While the overall flow in state evolution is the same as that for linear sparsity, each proof step for inner denoising requires stronger assumptions than those for linear sparsity. The required new assumptions are proved for Bayesian inner denoising. 
When Bayesian outer and inner denoisers are used in GAMP, the obtained state evolution recursion is utilized to evaluate the prefactor <inline-formula> <tex-math>$\\\\delta $ </tex-math></inline-formula> in the sample complexity, called reconstruction threshold. If and only if <inline-formula> <tex-math>$\\\\delta $ </tex-math></inline-formula> is larger than the reconstruction threshold, Bayesian GAMP can achieve asymptotically exact signal reconstruction. In particular, the reconstruction threshold is finite for noisy linear measurements when the support of non-zero signal elements does not include a neighborhood of zero. As numerical examples, this paper considers linear measurements and 1-bit compressed sensing. Numerical simulations for both cases show that Bayesian GAMP outperforms existing algorithms for sublinear sparsity in terms of the sample complexity.\",\"PeriodicalId\":13494,\"journal\":{\"name\":\"IEEE Transactions on Information Theory\",\"volume\":\"71 6\",\"pages\":\"4602-4636\"},\"PeriodicalIF\":2.2000,\"publicationDate\":\"2025-04-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10963836\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Information Theory\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10963836/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Information Theory","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10963836/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Generalized Approximate Message-Passing for Compressed Sensing With Sublinear Sparsity
This paper addresses the reconstruction of an unknown signal vector with sublinear sparsity from generalized linear measurements. Generalized approximate message-passing (GAMP) is proposed and analyzed via state evolution in the sublinear sparsity limit, where the signal dimension N, measurement dimension M, and signal sparsity k satisfy $\log k/\log N \to \gamma \in [0, 1)$ and $M/\{k\log(N/k)\} \to \delta$ as N and k tend to infinity. While the overall flow of the state evolution analysis is the same as that for linear sparsity, each proof step for inner denoising requires stronger assumptions than those for linear sparsity. The required new assumptions are proved for Bayesian inner denoising. When Bayesian outer and inner denoisers are used in GAMP, the obtained state evolution recursion is utilized to evaluate the critical value of the prefactor $\delta$ in the sample complexity, called the reconstruction threshold. Bayesian GAMP can achieve asymptotically exact signal reconstruction if and only if $\delta$ is larger than the reconstruction threshold. In particular, the reconstruction threshold is finite for noisy linear measurements when the support of non-zero signal elements does not include a neighborhood of zero. As numerical examples, this paper considers linear measurements and 1-bit compressed sensing. Numerical simulations for both cases show that Bayesian GAMP outperforms existing algorithms for sublinear sparsity in terms of the sample complexity.
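The abstract describes GAMP with Bayesian outer and inner denoisers applied to noisy linear measurements in the sublinear-sparsity regime $k = N^{\gamma}$, $M \approx \delta k \log(N/k)$. As a rough, self-contained illustration of that setting, the following NumPy sketch runs a standard sum-product GAMP recursion with an additive-white-Gaussian-noise output channel and a Bayesian inner denoiser for a three-point prior whose nonzero values are ±1, so the nonzero support excludes a neighborhood of zero, as in the paper's finite-threshold case. All concrete choices here (N, gamma, delta, sigma2, the ±1 prior, 50 iterations, and the helper denoise_in) are illustrative assumptions, not the paper's implementation or its large-system analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) problem sizes following the sublinear-sparsity scaling:
# k = N^gamma nonzero entries and M = delta * k * log(N / k) measurements.
N, gamma, delta, sigma2 = 4000, 0.5, 3.0, 0.01
k = int(round(N ** gamma))
M = int(round(delta * k * np.log(N / k)))
rho = k / N  # sparsity rate assumed known to the Bayesian inner denoiser

# Signal whose nonzero elements are +/-1, so their support excludes a neighborhood of zero.
x = np.zeros(N)
support = rng.choice(N, size=k, replace=False)
x[support] = rng.choice([-1.0, 1.0], size=k)

# i.i.d. Gaussian sensing matrix and noisy linear measurements y = A x + w.
A = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))
y = A @ x + rng.normal(0.0, np.sqrt(sigma2), size=M)
S = A ** 2  # elementwise squared entries, used for the variance updates


def denoise_in(r, tau):
    """Posterior mean/variance of x under the three-point prior, given r = x + N(0, tau)."""
    lp = np.log(rho / 2) - (r - 1.0) ** 2 / (2 * tau)   # log-weight of x = +1
    lm = np.log(rho / 2) - (r + 1.0) ** 2 / (2 * tau)   # log-weight of x = -1
    l0 = np.log(1.0 - rho) - r ** 2 / (2 * tau)         # log-weight of x = 0
    c = np.max(np.stack([lp, lm, l0]), axis=0)
    wp, wm, w0 = np.exp(lp - c), np.exp(lm - c), np.exp(l0 - c)
    z = wp + wm + w0
    mean = (wp - wm) / z
    var = np.maximum((wp + wm) / z - mean ** 2, 1e-12)
    return mean, var


# Sum-product GAMP recursion with an AWGN output channel,
# for which the outer denoiser is g_out(p, y, tau_p) = (y - p) / (tau_p + sigma2).
x_hat, tau_x, s = np.zeros(N), np.full(N, rho), np.zeros(M)
for _ in range(50):
    tau_p = S @ tau_x
    p = A @ x_hat - tau_p * s
    s = (y - p) / (tau_p + sigma2)        # Bayesian outer denoiser (AWGN channel)
    tau_s = 1.0 / (tau_p + sigma2)
    tau_r = 1.0 / (S.T @ tau_s)
    r = x_hat + tau_r * (A.T @ s)
    x_hat, tau_x = denoise_in(r, tau_r)   # Bayesian inner denoiser

print("mean-squared error:", np.mean((x_hat - x) ** 2))
```

In this sketch the per-element variances tau_x and tau_s are finite-size counterparts of the scalar quantities tracked analytically by state evolution; in the setting of the paper, the reconstruction threshold is the critical value of delta below which such a recursion cannot drive the reconstruction error to zero.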
About the Journal:
The IEEE Transactions on Information Theory is a journal that publishes theoretical and experimental papers concerned with the transmission, processing, and utilization of information. The boundaries of acceptable subject matter are intentionally not sharply delimited. Rather, it is hoped that as the focus of research activity changes, a flexible policy will permit this Transactions to follow suit. Current appropriate topics are best reflected by recent Tables of Contents; they are summarized in the titles of editorial areas that appear on the inside front cover.