{"title":"通过𝓁1-minimization获取平滑度类的采样数","authors":"Thomas Jahn, T. Ullrich, Felix Voigtländer","doi":"10.48550/arXiv.2212.00445","DOIUrl":null,"url":null,"abstract":"Using techniques developed recently in the field of compressed sensing we prove new upper bounds for general (nonlinear) sampling numbers of (quasi-)Banach smoothness spaces in $L^2$. In particular, we show that in relevant cases such as mixed and isotropic weighted Wiener classes or Sobolev spaces with mixed smoothness, sampling numbers in $L^2$ can be upper bounded by best $n$-term trigonometric widths in $L^\\infty$. We describe a recovery procedure from $m$ function values based on $\\ell^1$-minimization (basis pursuit denoising). With this method, a significant gain in the rate of convergence compared to recently developed linear recovery methods is achieved. In this deterministic worst-case setting we see an additional speed-up of $m^{-1/2}$ (up to log factors) compared to linear methods in case of weighted Wiener spaces. For their quasi-Banach counterparts even arbitrary polynomial speed-up is possible. Surprisingly, our approach allows to recover mixed smoothness Sobolev functions belonging to $S^r_pW(\\mathbb{T}^d)$ on the $d$-torus with a logarithmically better rate of convergence than any linear method can achieve when $1","PeriodicalId":15442,"journal":{"name":"Journal of complex networks","volume":"21 1","pages":"101786"},"PeriodicalIF":2.2000,"publicationDate":"2022-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Sampling numbers of smoothness classes via 𝓁1-minimization\",\"authors\":\"Thomas Jahn, T. Ullrich, Felix Voigtländer\",\"doi\":\"10.48550/arXiv.2212.00445\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Using techniques developed recently in the field of compressed sensing we prove new upper bounds for general (nonlinear) sampling numbers of (quasi-)Banach smoothness spaces in $L^2$. In particular, we show that in relevant cases such as mixed and isotropic weighted Wiener classes or Sobolev spaces with mixed smoothness, sampling numbers in $L^2$ can be upper bounded by best $n$-term trigonometric widths in $L^\\\\infty$. We describe a recovery procedure from $m$ function values based on $\\\\ell^1$-minimization (basis pursuit denoising). With this method, a significant gain in the rate of convergence compared to recently developed linear recovery methods is achieved. In this deterministic worst-case setting we see an additional speed-up of $m^{-1/2}$ (up to log factors) compared to linear methods in case of weighted Wiener spaces. For their quasi-Banach counterparts even arbitrary polynomial speed-up is possible. 
Surprisingly, our approach allows to recover mixed smoothness Sobolev functions belonging to $S^r_pW(\\\\mathbb{T}^d)$ on the $d$-torus with a logarithmically better rate of convergence than any linear method can achieve when $1\",\"PeriodicalId\":15442,\"journal\":{\"name\":\"Journal of complex networks\",\"volume\":\"21 1\",\"pages\":\"101786\"},\"PeriodicalIF\":2.2000,\"publicationDate\":\"2022-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of complex networks\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.48550/arXiv.2212.00445\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of complex networks","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.48550/arXiv.2212.00445","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
引用次数: 8
摘要
利用压缩感知领域最新发展的技术,我们证明了$L^2$中(拟-)Banach平滑空间的一般(非线性)采样数的新上界。特别地,我们证明了在相关的情况下,如混合和各向同性加权Wiener类或具有混合平滑性的Sobolev空间中,$L^2$中的采样数可以被$L^ inty $中的最佳$n$项三角宽度的上界。我们描述了基于$\ well ^1$最小化(基追求去噪)的$m$函数值的恢复过程。与最近开发的线性恢复方法相比,这种方法的收敛速度有了显著的提高。在这种确定的最坏情况设置中,我们看到与加权Wiener空间的线性方法相比,$m^{-1/2}$(高达对数因子)的额外加速。对于它们的拟巴拿赫对应物,甚至任意多项式加速是可能的。令人惊讶的是,我们的方法允许在$d$-环面上恢复属于$S^r_pW(\mathbb{T}^d)$的混合平滑Sobolev函数,其收敛速度比任何线性方法在$1时都要高
Sampling numbers of smoothness classes via 𝓁1-minimization
Using techniques developed recently in the field of compressed sensing, we prove new upper bounds for general (nonlinear) sampling numbers of (quasi-)Banach smoothness spaces in $L^2$. In particular, we show that in relevant cases such as mixed and isotropic weighted Wiener classes or Sobolev spaces with mixed smoothness, sampling numbers in $L^2$ can be upper bounded by best $n$-term trigonometric widths in $L^\infty$. We describe a recovery procedure from $m$ function values based on $\ell^1$-minimization (basis pursuit denoising). With this method, a significant gain in the rate of convergence compared to recently developed linear recovery methods is achieved. In this deterministic worst-case setting we see an additional speed-up of $m^{-1/2}$ (up to log factors) compared to linear methods in the case of weighted Wiener spaces. For their quasi-Banach counterparts, even arbitrary polynomial speed-up is possible. Surprisingly, our approach allows us to recover mixed smoothness Sobolev functions belonging to $S^r_pW(\mathbb{T}^d)$ on the $d$-torus with a logarithmically better rate of convergence than any linear method can achieve when $1
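The recovery procedure mentioned in the abstract is based on basis pursuit denoising, i.e. minimizing the $\ell^1$ norm of the coefficient vector subject to a data-fidelity constraint on the $m$ function values. The following Python sketch is only meant to make that optimization problem concrete; it is not the paper's specific construction. The frequency set size, the number of random samples, the noise tolerance, the cosine measurement system, and the support-detection threshold are all illustrative assumptions, and the solver is cvxpy.

```python
# Minimal sketch: recover a function that is sparse in a trigonometric (cosine)
# system from m point samples via basis pursuit denoising,
#     minimize ||x||_1  subject to  ||A x - y||_2 <= eta.
# All sizes and constants below are illustrative, not the paper's choices.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

N = 128     # candidate cosine frequencies 0, ..., N-1 (illustrative size)
s = 5       # number of active frequencies in the test function
m = 60      # number of random point samples, m << N
eta = 1e-3  # noise tolerance in the data-fidelity constraint

# Ground-truth coefficient vector that is s-sparse in the cosine system on [0, 2*pi).
coeffs = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
coeffs[support] = rng.standard_normal(s)

# Random sampling points and measurement matrix A[j, k] = cos(k * t_j),
# so that (A @ coeffs)[j] is the value of the target function at t_j.
t = rng.uniform(0.0, 2.0 * np.pi, size=m)
A = np.cos(np.outer(t, np.arange(N)))

noise = rng.standard_normal(m)
noise *= 0.5 * eta / np.linalg.norm(noise)  # keep the true vector strictly feasible
y = A @ coeffs + noise

# Basis pursuit denoising as a second-order cone program.
x = cp.Variable(N)
problem = cp.Problem(cp.Minimize(cp.norm(x, 1)),
                     [cp.norm(A @ x - y, 2) <= eta])
problem.solve()

x_hat = x.value
print("true support:     ", np.sort(support))
print("recovered support:", np.sort(np.flatnonzero(np.abs(x_hat) > 1e-2)))
print("coefficient error:", np.linalg.norm(x_hat - coeffs))
```

In the paper, the frequency index sets, weights, and number of samples are dictated by the theory (and the error is measured in the worst case over a smoothness class); here they are ad hoc choices whose only purpose is to show the shape of the $\ell^1$-minimization problem solved from the $m$ samples.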
Journal introduction:
Journal of Complex Networks publishes original articles and reviews that make a significant contribution to the analysis and understanding of complex networks and their applications in diverse fields. Complex networks are loosely defined as networks with nontrivial topology and dynamics, which appear as the skeletons of complex systems in the real world. The journal covers everything from the basic mathematical, physical and computational principles needed for studying complex networks to their applications leading to predictive models in molecular, biological, ecological, informational, engineering, social, technological and other systems. It includes, but is not limited to, the following topics:

- Mathematical and numerical analysis of networks
- Network theory and computer sciences
- Structural analysis of networks
- Dynamics on networks
- Physical models on networks
- Networks and epidemiology
- Social, socio-economic and political networks
- Ecological networks
- Technological and infrastructural networks
- Brain and tissue networks
- Biological and molecular networks
- Spatial networks
- Techno-social networks, i.e. online social networks, social networking sites, social media
- Other applications of networks
- Evolving networks
- Multilayer networks
- Game theory on networks
- Biomedicine related networks
- Animal social networks
- Climate networks
- Cognitive, language and informational networks