{"title":"Randomized learning of the second-moment matrix of a smooth function","authors":"Armin Eftekhari, M. Wakin, Ping Li, P. Constantine","doi":"10.3934/fods.2019015","DOIUrl":"https://doi.org/10.3934/fods.2019015","url":null,"abstract":"Consider an open set $mathbb{D}subseteqmathbb{R}^n$, equipped with a probability measure $mu$. An important characteristic of a smooth function $f:mathbb{D}rightarrowmathbb{R}$ is its emph{second-moment matrix} $Sigma_{mu}:=int nabla f(x) nabla f(x)^* mu(dx) inmathbb{R}^{ntimes n}$, where $nabla f(x)inmathbb{R}^n$ is the gradient of $f(cdot)$ at $xinmathbb{D}$ and $*$ stands for transpose. For instance, the span of the leading $r$ eigenvectors of $Sigma_{mu}$ forms an emph{active subspace} of $f(cdot)$, which contains the directions along which $f(cdot)$ changes the most and is of particular interest in emph{ridge approximation}. In this work, we propose a simple algorithm for estimating $Sigma_{mu}$ from random point evaluations of $f(cdot)$ emph{without} imposing any structural assumptions on $Sigma_{mu}$. Theoretical guarantees for this algorithm are established with the aid of the same technical tools that have proved valuable in the context of covariance matrix estimation from partial measurements.","PeriodicalId":73054,"journal":{"name":"Foundations of data science (Springfield, Mo.)","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2016-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70247801","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Geometric adaptive Monte Carlo in random environment","authors":"T. Papamarkou, Alexey Lindo, E. Ford","doi":"10.3934/FODS.2021014","DOIUrl":"https://doi.org/10.3934/FODS.2021014","url":null,"abstract":"Manifold Markov chain Monte Carlo algorithms have been introduced to sample more effectively from challenging target densities exhibiting multiple modes or strong correlations. Such algorithms exploit the local geometry of the parameter space, thus enabling chains to achieve a faster convergence rate when measured in number of steps. However, acquiring local geometric information can often increase computational complexity per step to the extent that sampling from high-dimensional targets becomes inefficient in terms of total computational time. This paper analyzes the computational complexity of manifold Langevin Monte Carlo and proposes a geometric adaptive Monte Carlo sampler aimed at balancing the benefits of exploiting local geometry with computational cost to achieve a high effective sample size for a given computational cost. The suggested sampler is a discrete-time stochastic process in random environment. The random environment allows to switch between local geometric and adaptive proposal kernels with the help of a schedule. An exponential schedule is put forward that enables more frequent use of geometric information in early transient phases of the chain, while saving computational time in late stationary phases. The average complexity can be manually set depending on the need for geometric exploitation posed by the underlying model.","PeriodicalId":73054,"journal":{"name":"Foundations of data science (Springfield, Mo.)","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2016-08-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70248343","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Consistent manifold representation for topological data analysis","authors":"Tyrus Berry, T. Sauer","doi":"10.3934/FODS.2019001","DOIUrl":"https://doi.org/10.3934/FODS.2019001","url":null,"abstract":"For data sampled from an arbitrary density on a manifold embedded in Euclidean space, the Continuous k-Nearest Neighbors (CkNN) graph construction is introduced. It is shown that CkNN is geometrically consistent in the sense that under certain conditions, the unnormalized graph Laplacian converges to the Laplace-Beltrami operator, spectrally as well as pointwise. It is proved for compact (and conjectured for noncompact) manifolds that CkNN is the unique unweighted construction that yields a geometry consistent with the connected components of the underlying manifold in the limit of large data. Thus CkNN produces a single graph that captures all topological features simultaneously, in contrast to persistent homology, which represents each homology generator at a separate scale. As applications we derive a new fast clustering algorithm and a method to identify patterns in natural images topologically. Finally, we conjecture that CkNN is topologically consistent, meaning that the homology of the Vietoris-Rips complex (implied by the graph Laplacian) converges to the homology of the underlying manifold (implied by the Laplace-de Rham operators) in the limit of large data.","PeriodicalId":73054,"journal":{"name":"Foundations of data science (Springfield, Mo.)","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2016-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70247699","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Flexible online multivariate regression with variational Bayes and the matrix-variate Dirichlet process","authors":"Meng Hwee Victor Ong, D. Nott, A. Jasra","doi":"10.3934/FODS.2019006","DOIUrl":"https://doi.org/10.3934/FODS.2019006","url":null,"abstract":"Flexible regression methods where interest centres on the way that the whole distribution of a response vector changes with covariates are very useful in some applications. A recently developed technique in this regard uses the matrix-variate Dirichlet process as a prior for a mixing distribution on a coefficient in a multivariate linear regression model. The method is attractive, particularly in the multivariate setting, for the convenient way that it allows for borrowing strength across different component regressions and for its computational simplicity and tractability. The purpose of the present article is to develop fast online variational Bayes approaches to fitting this model and to investigate how they perform compared to MCMC and batch variational methods in a number of scenarios.","PeriodicalId":73054,"journal":{"name":"Foundations of data science (Springfield, Mo.)","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2016-02-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70247750","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Accelerating Metropolis-Hastings algorithms by Delayed Acceptance","authors":"Marco Banterle, C. Grazian, Anthony Lee, C. Robert","doi":"10.3934/FODS.2019005","DOIUrl":"https://doi.org/10.3934/FODS.2019005","url":null,"abstract":"MCMC algorithms such as Metropolis-Hastings algorithms are slowed down by the computation of complex target distributions as exemplified by huge datasets. We offer in this paper a useful generalisation of the Delayed Acceptance approach, devised to reduce the computational costs of such algorithms by a simple and universal divide-and-conquer strategy. The idea behind the generic acceleration is to divide the acceptance step into several parts, aiming at a major reduction in computing time that out-ranks the corresponding reduction in acceptance probability. Each of the components can be sequentially compared with a uniform variate, the first rejection signalling that the proposed value is considered no further. We develop moreover theoretical bounds for the variance of associated estimators with respect to the variance of the standard Metropolis-Hastings and detail some results on optimal scaling and general optimisation of the procedure. We illustrate those accelerating features on a series of examples","PeriodicalId":73054,"journal":{"name":"Foundations of data science (Springfield, Mo.)","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2015-03-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70247738","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}