{"title":"Bivariate splines in piecewise constant tension","authors":"Kunimitsu Takahashi, M. Kamada","doi":"10.1109/SAMPTA.2015.7148901","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148901","url":null,"abstract":"An extension of the bivariate cubic spline on the uniform grid is derived in this paper to have different tensions in different square cells of the grid. The resulting function can be interpreted also as a bivariate extension of the univariate spline in piecewise constant tension which was applied to adaptive interpolation of digital images for their magnification and rotation. The bivariate function will hopefully make it possible to magnify and rotate images better and even to deform images into any shapes. A locally supported basis, which is crucial for the practical use of the bivariate functions, has not been constructed at the moment and its construction is left for the next step of study.","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114810189","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On boundedness inequalities of some semi-discrete operators in connection with sampling operators","authors":"A. Kivinukk, Tarmo Metsmagi","doi":"10.1109/SAMPTA.2015.7148848","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148848","url":null,"abstract":"The main aim of this paper is to study some boundedness inequalities of certain semi-discrete operators. These operators allow to unify some inequalities for both the Shannon sampling operators and Kantorovich-type operators.","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114863766","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Phase retrieval without small-ball probability assumptions: Recovery guarantees for phaselift","authors":"F. Krahmer, Yi-Kai Liu","doi":"10.1109/SAMPTA.2015.7148966","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148966","url":null,"abstract":"We study the problem of recovering an unknown vector x ε R<sup>n</sup> from measurements of the form y<sub>i</sub> = |a<sup>T</sup><sub>i</sub> x|<sup>2</sup> (for i = 1,..., m), where the vectors a<sub>i</sub> ε R<sup>n</sup> are chosen independently at random, with each coordinate a<sub>ij</sub> ε R being chosen independently from a fixed sub-Gaussian distribution D. However, without making additional assumptions on the random variables a<sub>ij</sub> - for example on the behavior of their small ball probabilities - it may happen some vectors x cannot be uniquely recovered. We show that for any sub-Gaussian distribution V, with no additional assumptions, it is still possible to recover most vectors x. More precisely, one can recover those vectors x that are not too peaky in the sense that at most a constant fraction of their mass is concentrated on any one coordinate. The recovery guarantees in this paper are for the PhaseLift algorithm, a tractable convex program based on a matrix formulation of the problem. We prove uniform recovery of all not too peaky vectors from m = 0(n) measurements, in the presence of noise. This extends previous work on PhaseLift by Candès and Li [8].","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124940169","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On a lower bound for a periodic uncertainty constant","authors":"E. Lebedeva","doi":"10.1109/SAMPTA.2015.7148964","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148964","url":null,"abstract":"An inequality refining the lower bound for a periodic (Breitenberger) uncertainty constant is proved for a wide class of functions. A formula connecting uncertainty constants for periodic and non-periodic functions is extended to this class.","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"27 5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122127159","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Nonuniform sparse recovery with random convolutions","authors":"David James, H. Rauhut","doi":"10.1109/SAMPTA.2015.7148845","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148845","url":null,"abstract":"We discuss the use of random convolutions for Compressed Sensing applications. In particular, we will show that after convolving an N-dimensional, s-sparse signal with a Rademacher or Steinhaus sequence, it can be recovered via l1-minimization using only m ≳ s log(N/ε) arbitrary chosen samples with probability at least 1 - ε.","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122132755","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A novel geometric multiscale approach to structured dictionary learning on high dimensional data","authors":"Guangliang Chen","doi":"10.1109/SAMPTA.2015.7148961","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148961","url":null,"abstract":"Adaptive dictionary learning has become a hot-topic research field during the past decade. Though several algorithms have been proposed and achieved impressive results, they are all computationally intensive due to the lack of structure in their output dictionaries. In this paper we build upon our previous work and take a geometric approach to develop better, more efficient algorithms that can learn adaptive structured dictionaries. While inheriting many of the advantages in the previous construction, the new algorithm better utilizes the geometry of data and effectively removes translational invariances from the data, thus able to produce smaller, more robust dictionaries. We demonstrate the performance of the new algorithm on two data sets, and conclude the paper by a discussion of future work.","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125568177","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"From paley graphs to deterministic sensing matrices with real-valued Gramians","authors":"A. Amini, Hamed Bagh-Sheikhi, F. Marvasti","doi":"10.1109/SAMPTA.2015.7148915","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148915","url":null,"abstract":"The performance guarantees in recovery of a sparse vector in a compressed sensing scenario, besides the reconstruction technique, depends on the choice of the sensing matrix. The so-called restricted isometry property (RIP) is one of the well-used tools to determine and compare the performance of various sensing matrices. It is a standard result that random (Gaussian) matrices satisfy RIP with high probability. However, the design of deterministic matrices that satisfy RIP has been a great challenge for many years now. The common design technique is through the coherence value (maximum modulus correlation between the columns). In this paper, based on the Paley graphs, we introduce deterministic matrices of size q+1/2 × q with q a prime power, such that the corresponding Gram matrix is real-valued. We show that the coherence of these matrices are less than twice the Welch bound, which is a lower bound valid for general matrices. It should be mentioned that the introduced matrix differs from the equiangular tight frame (ETF) of size q-1/2 × q arising from the Paley difference set.","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129334422","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Compressibility of symmetric-α-stable processes","authors":"J. P. Ward, J. Fageot, M. Unser","doi":"10.1109/SAMPTA.2015.7148887","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148887","url":null,"abstract":"Within a deterministic framework, it is well known that n-term wavelet approximation rates of functions can be deduced from their Besov regularity. We use this principle to determine approximation rates for symmetric-α-stable (SαS) stochastic processes. First, we characterize the Besov regularity of SαS processes. Then the n-term approximation rates follow. To capture the local smoothness behavior, we consider sparse processes defined on the circle that are solutions of stochastic differential equations.","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123764416","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Blind compressed sensing using sparsifying transforms","authors":"S. Ravishankar, Y. Bresler","doi":"10.1109/SAMPTA.2015.7148944","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148944","url":null,"abstract":"Compressed sensing exploits the sparsity of images or image patches in a transform domain or synthesis dictionary to reconstruct images from undersampled measurements. In this work, we focus on blind compressed sensing, where the underlying sparsifying transform is a priori unknown, and propose a framework to simultaneously reconstruct both the image and the transform from highly undersampled measurements. The proposed block coordinate descent type algorithm involves efficient updates. Importantly, we prove that although the proposed formulation is highly nonconvex, our algorithm converges to the set of critical points of the objective defining the formulation. We illustrate the promise of the proposed framework for magnetic resonance image reconstruction from highly undersampled k-space measurements. As compared to previous methods involving fixed sparsifying transforms, or adaptive synthesis dictionaries, our approach is much faster, while also providing promising image reconstructions.","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"99 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127346729","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On random and deterministic compressed sensing and the Restricted Isometry Property in levels","authors":"Alexander Bastounis, A. Hansen","doi":"10.1109/SAMPTA.2015.7148900","DOIUrl":"https://doi.org/10.1109/SAMPTA.2015.7148900","url":null,"abstract":"Compressed sensing (CS) is one of the great successes of computational mathematics in the past decade. There are a collection of tools which aim to mathematically describe compressed sensing when the sampling pattern is taken in a random or deterministic way. Unfortunately, there are many practical applications where the well studied concepts of uniform recovery and the Restricted Isometry Property (RIP) can be shown to be insufficient explanations for the success of compressed sensing. This occurs both when the sampling pattern is taken using a deterministic or a non-deterministic method. We shall study this phenomenon and explain why the RIP is absent, and then propose an adaptation which we term `the RIP in levels' which aims to solve the issues surrounding the RIP. The paper ends by conjecturing that the RIP in levels could provide a collection of results for deterministic sampling patterns.","PeriodicalId":311830,"journal":{"name":"2015 International Conference on Sampling Theory and Applications (SampTA)","volume":"189 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133865485","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}