{"title":"近似稀疏信号的压缩感知","authors":"M. Stojnic, Weiyu Xu, B. Hassibi","doi":"10.1109/ISIT.2008.4595377","DOIUrl":null,"url":null,"abstract":"It is well known that compressed sensing problems reduce to solving large under-determined systems of equations. If we choose the compressed measurement matrix according to some appropriate distribution and the signal is sparse enough the l1 optimization can exactly recover the ideally sparse signal with overwhelming probability by Candes, E. and Tao, T., [2], [1]. In the current paper, we will consider the case of the so-called approximately sparse signals. These signals are a generalized version of the ideally sparse signals. Letting the zero valued components of the ideally sparse signals to take the values of certain small magnitude one can construct the approximately sparse signals. Using a different but simple proof technique we show that the claims similar to those of [2] and [1] related to the proportionality of the number of large components of the signals to the number of measurements, hold for approximately sparse signals as well. Furthermore, using the same technique we compute the explicit values of what this proportionality can be if the compressed measurement matrix A has a rotationally invariant distribution of the null-space. We also give the quantitative tradeoff between the signal sparsity and the recovery robustness of the l1 minimization. As it will turn out in an asymptotic case of the number of measurements the threshold result of [1] corresponds to a special case of our result.","PeriodicalId":194674,"journal":{"name":"2008 IEEE International Symposium on Information Theory","volume":"69 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2008-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"30","resultStr":"{\"title\":\"Compressed sensing of approximately sparse signals\",\"authors\":\"M. Stojnic, Weiyu Xu, B. Hassibi\",\"doi\":\"10.1109/ISIT.2008.4595377\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"It is well known that compressed sensing problems reduce to solving large under-determined systems of equations. If we choose the compressed measurement matrix according to some appropriate distribution and the signal is sparse enough the l1 optimization can exactly recover the ideally sparse signal with overwhelming probability by Candes, E. and Tao, T., [2], [1]. In the current paper, we will consider the case of the so-called approximately sparse signals. These signals are a generalized version of the ideally sparse signals. Letting the zero valued components of the ideally sparse signals to take the values of certain small magnitude one can construct the approximately sparse signals. Using a different but simple proof technique we show that the claims similar to those of [2] and [1] related to the proportionality of the number of large components of the signals to the number of measurements, hold for approximately sparse signals as well. Furthermore, using the same technique we compute the explicit values of what this proportionality can be if the compressed measurement matrix A has a rotationally invariant distribution of the null-space. We also give the quantitative tradeoff between the signal sparsity and the recovery robustness of the l1 minimization. 
As it will turn out in an asymptotic case of the number of measurements the threshold result of [1] corresponds to a special case of our result.\",\"PeriodicalId\":194674,\"journal\":{\"name\":\"2008 IEEE International Symposium on Information Theory\",\"volume\":\"69 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2008-07-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"30\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2008 IEEE International Symposium on Information Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISIT.2008.4595377\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2008 IEEE International Symposium on Information Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIT.2008.4595377","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
It is well known that compressed sensing problems reduce to solving large underdetermined systems of linear equations. If the measurement matrix is chosen according to an appropriate distribution and the signal is sparse enough, then l1 optimization can exactly recover the ideally sparse signal with overwhelming probability, as shown by Candes and Tao [1], [2]. In this paper we consider the case of so-called approximately sparse signals. These signals are a generalization of ideally sparse signals: allowing the zero-valued components of an ideally sparse signal to take nonzero values of small magnitude yields an approximately sparse signal. Using a different but simple proof technique, we show that claims similar to those of [1] and [2], relating the number of large components of the signal proportionally to the number of measurements, hold for approximately sparse signals as well. Furthermore, using the same technique, we compute explicit values of this proportionality constant when the measurement matrix A has a rotationally invariant null-space distribution. We also give a quantitative tradeoff between the signal sparsity and the recovery robustness of l1 minimization. It turns out that, in the asymptotic regime of the number of measurements, the threshold result of [1] corresponds to a special case of our result.
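To make the setup the abstract describes concrete, below is a minimal sketch, not the authors' code: it builds an approximately sparse signal, takes measurements with a Gaussian matrix (whose null space has a rotationally invariant distribution), and recovers via l1 minimization (basis pursuit) cast as a linear program. The parameters n, m, k and eps are illustrative assumptions.

```python
# Minimal sketch of l1 recovery of an approximately sparse signal.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 200, 80, 10      # signal length, measurements, large components (assumed values)
eps = 1e-3                 # magnitude scale of the "approximately zero" entries

# Approximately sparse signal: k large components, the rest of small magnitude.
x = eps * rng.uniform(-1.0, 1.0, n)
support = rng.choice(n, k, replace=False)
x[support] = rng.standard_normal(k)

# Gaussian measurement matrix A; its null space is rotationally invariant.
A = rng.standard_normal((m, n))
y = A @ x

# Basis pursuit:  min ||z||_1  s.t.  A z = y,
# as an LP with z = u - v, u >= 0, v >= 0, objective sum(u) + sum(v).
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
z = res.x[:n] - res.x[n:]

# Exact recovery is impossible in general for approximately sparse signals;
# robustness results of this kind bound the error by the l1 norm of the small part.
print("relative l2 error:", np.linalg.norm(z - x) / np.linalg.norm(x))
```

With the parameters above, the number of large components is a small fraction of the number of measurements, so the recovered z should match x up to an error on the order of the small entries, illustrating the sparsity-robustness tradeoff the paper quantifies.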