Latest Articles in arXiv: Computation

Double Happiness: Enhancing the Coupled Gains of L-lag Coupling via Control Variates
arXiv: Computation Pub Date: 2020-08-28 DOI: 10.5705/SS.202020.0461
Radu V. Craiu, X. Meng
{"title":"Double Happiness: Enhancing the Coupled Gains of L-lag Coupling via Control Variates.","authors":"Radu V. Craiu, X. Meng","doi":"10.5705/SS.202020.0461","DOIUrl":"https://doi.org/10.5705/SS.202020.0461","url":null,"abstract":"The recently proposed L-lag coupling for unbiased MCMC citep{biswas2019estimating, jacob2020unbiased} calls for a joint celebration by MCMC practitioners and theoreticians. For practitioners, it circumvents the thorny issue of deciding the burn-in period or when to terminate an MCMC iteration, and opens the door for safe parallel implementation. For theoreticians, it provides a powerful tool to establish elegant and easily estimable bounds on the exact error of MCMC approximation at any finite number of iteration. A serendipitous observation about the bias correcting term led us to introduce naturally available control variates into the L-lag coupling estimators. In turn, this extension enhances the coupled gains of L-lag coupling, because it results in more efficient unbiased estimators as well as a better bound on the total variation error of MCMC iterations, albeit the gains diminish with the numerical value of L. Specifically, the new bound is theoretically guaranteed to never exceed the one given previously. We also argue that L-lag coupling represents a long sought after coupling for the future, breaking a logjam of the coupling-from-the-past type of perfect sampling, by reducing the generally un-achievable requirement of being textit{perfect} to being textit{unbiased}, a worthwhile trade-off for ease of implementation in most practical situations. The theoretical analysis is supported by numerical experiments that show tighter bounds and a gain in efficiency when control variates are introduced.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2020-08-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84090470","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
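The estimator at the heart of this paper pairs a chain with a lagged copy and adds a bias-correction term that vanishes once the two chains meet. The following is a minimal sketch of the standard L-lag unbiased estimator (without the paper's control-variate extension) for a toy one-dimensional Gaussian target, using maximally coupled random-walk Metropolis proposals; the target, proposal scale, k, and L are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Toy target: standard normal (illustrative assumption).
    return -0.5 * x * x

def max_coupling_normals(mu1, mu2, sigma):
    """Sample (X, Y) from a maximal coupling of N(mu1, sigma^2) and N(mu2, sigma^2)."""
    def logpdf(x, mu):
        return -0.5 * ((x - mu) / sigma) ** 2
    x = rng.normal(mu1, sigma)
    if np.log(rng.uniform()) + logpdf(x, mu1) <= logpdf(x, mu2):
        return x, x
    while True:
        y = rng.normal(mu2, sigma)
        if np.log(rng.uniform()) + logpdf(y, mu2) > logpdf(y, mu1):
            return x, y

def mh_step(x, sigma=1.0):
    prop = rng.normal(x, sigma)
    return prop if np.log(rng.uniform()) < log_target(prop) - log_target(x) else x

def coupled_mh_step(x, y, sigma=1.0):
    """One coupled Metropolis step: maximally coupled proposals, common uniform."""
    px, py = max_coupling_normals(x, y, sigma)
    log_u = np.log(rng.uniform())
    x_new = px if log_u < log_target(px) - log_target(x) else x
    y_new = py if log_u < log_target(py) - log_target(y) else y
    return x_new, y_new

def l_lag_unbiased(h, k=20, L=3, max_iter=100_000):
    """Unbiased estimator H_k from one L-lag coupled pair (single k, no time averaging)."""
    x = rng.normal()          # both chains start from the same initial law
    y = rng.normal()
    X = [x]
    for _ in range(L):        # advance the X chain L steps ahead of Y
        x = mh_step(x)
        X.append(x)
    Y = [y]
    t, tau = L, None
    while tau is None or len(X) <= k:
        x, y = coupled_mh_step(x, y)
        X.append(x); Y.append(y)
        t += 1
        if tau is None and x == y:
            tau = t           # meeting time: X_tau == Y_{tau - L}
        if t > max_iter:
            raise RuntimeError("chains did not meet")
    est = h(X[k])
    j = 1
    while k + j * L < tau:    # bias-correction terms; zero once the chains have met
        est += h(X[k + j * L]) - h(Y[k + (j - 1) * L])
        j += 1
    return est

# Each replicate is unbiased for E[h(X)] under the target; average many of them.
print(np.mean([l_lag_unbiased(h=lambda x: x) for _ in range(200)]))
```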
ScoreDrivenModels.jl: A Julia Package for Generalized Autoregressive Score Models
arXiv: Computation Pub Date: 2020-08-12 DOI: 10.17771/pucrio.acad.57291
Guilherme Bodin, Raphael Saavedra, C. Fernandes, A. Street
{"title":"SCOREDRIVENMODELS.JL: A JULIA PACKAGE FOR GENERALIZED AUTOREGRESSIVE SCORE MODELS","authors":"Guilherme Bodin, Raphael Saavedra, C. Fernandes, A. Street","doi":"10.17771/pucrio.acad.57291","DOIUrl":"https://doi.org/10.17771/pucrio.acad.57291","url":null,"abstract":"Score-driven models, also known as generalized autoregressive score models, represent a class of observation-driven time series models. They possess powerful properties, such as the ability to model different conditional distributions and to consider time-varying parameters within a flexible framework. In this paper, we present ScoreDrivenModels.jl, an open-source Julia package for modeling, forecasting, and simulating time series using the framework of score-driven models. The package is flexible with respect to model definition, allowing the user to specify the lag structure and which parameters are time-varying or constant. It is also possible to consider several distributions, including Beta, Exponential, Gamma, Lognormal, Normal, Poisson, Student's t, and Weibull. The provided interface is flexible, allowing interested users to implement any desired distribution and parametrization.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2020-08-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90496658","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
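Score-driven models update a time-varying parameter with the scaled score of the conditional log-density. The sketch below illustrates the generic recursion f_{t+1} = omega + A*s_t + B*f_t for a Gaussian observation density with time-varying mean and unit variance, where the score reduces to y_t - f_t. It is a language-neutral illustration written in Python, not the ScoreDrivenModels.jl API; omega, A, B, and the simulated data are placeholder assumptions.

```python
import numpy as np

def gas_filter_gaussian_mean(y, omega=0.0, A=0.3, B=0.9, f0=0.0):
    """Score-driven recursion f_{t+1} = omega + A * s_t + B * f_t for
    observations y_t ~ N(f_t, 1); the (identity-scaled) score is s_t = y_t - f_t."""
    f = np.empty(len(y) + 1)
    f[0] = f0
    for t, yt in enumerate(y):
        score = yt - f[t]                 # d/df log N(y_t | f_t, 1)
        f[t + 1] = omega + A * score + B * f[t]
    return f

# Simulate a slowly drifting mean and filter it back out.
rng = np.random.default_rng(0)
true_mean = np.cumsum(rng.normal(0.0, 0.05, size=500))
y = true_mean + rng.normal(size=500)
f = gas_filter_gaussian_mean(y)
print("RMSE of filtered mean:", np.sqrt(np.mean((f[:-1] - true_mean) ** 2)))
```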
Simple conditions for convergence of sequential Monte Carlo genealogies with applications
arXiv: Computation Pub Date: 2020-06-30 DOI: 10.1214/20-ejp561
Suzie Brown, P. A. Jenkins, A. M. Johansen, Jere Koskela
{"title":"Simple conditions for convergence of sequential Monte Carlo genealogies with applications","authors":"Suzie Brown, P. A. Jenkins, A. M. Johansen, Jere Koskela","doi":"10.1214/20-ejp561","DOIUrl":"https://doi.org/10.1214/20-ejp561","url":null,"abstract":"Sequential Monte Carlo algorithms are popular methods for approximating integrals in problems such as non-linear filtering and smoothing. Their performance depends strongly on the properties of an induced genealogical process. We present simple conditions under which the limiting process, as the number of particles grows, is a time-rescaled Kingman coalescent. We establish these conditions for standard sequential Monte Carlo with a broad class of low-variance resampling schemes, as well as for conditional sequential Monte Carlo with multinomial resampling.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2020-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73418886","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
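The genealogy studied in this paper is generated by the resampling step: every particle chooses an ancestor from the previous generation, and the resulting lineages coalesce as the algorithm runs. Below is a hedged sketch of a bootstrap particle filter with multinomial resampling on a toy AR(1) state-space model that records the ancestor indices and counts how many distinct time-0 ancestors survive; the model and parameters are illustrative assumptions, and no claim is made about the paper's conditions or resampling schemes.

```python
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_filter(y, n_particles=200, phi=0.9, sigma_x=1.0, sigma_y=1.0):
    """Bootstrap particle filter for an AR(1) state with Gaussian observations,
    using multinomial resampling; returns particles and ancestor indices
    (the induced genealogy)."""
    T = len(y)
    particles = np.empty((T, n_particles))
    ancestors = np.empty((T, n_particles), dtype=int)
    for t in range(T):
        if t == 0:
            ancestors[0] = np.arange(n_particles)
            x = rng.normal(0.0, sigma_x, size=n_particles)
        else:
            a = rng.choice(n_particles, size=n_particles, p=w)  # multinomial resampling
            ancestors[t] = a
            x = phi * x[a] + rng.normal(0.0, sigma_x, size=n_particles)
        logw = -0.5 * ((y[t] - x) / sigma_y) ** 2                # weight by observation density
        w = np.exp(logw - logw.max())
        w /= w.sum()
        particles[t] = x
    return particles, ancestors

def time_zero_ancestor(ancestors, i):
    """Trace particle i at the final time back to its time-0 ancestor."""
    for t in range(ancestors.shape[0] - 1, 0, -1):
        i = ancestors[t, i]
    return i

# Simulate data from the same toy model and look at how the lineages coalesce.
T, phi = 50, 0.9
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = phi * x_true[t - 1] + rng.normal()
y = x_true + rng.normal(size=T)
_, anc = bootstrap_filter(y)
roots = {time_zero_ancestor(anc, i) for i in range(anc.shape[1])}
print("distinct time-0 ancestors of the", anc.shape[1], "final particles:", len(roots))
```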
Increasing the efficiency of Sequential Monte Carlo samplers through the use of approximately optimal L-kernels
arXiv: Computation Pub Date: 2020-04-24 DOI: 10.1016/j.ymssp.2021.108028
P. L. Green, Robert E. Moore, Ryan J Jackson, Jinglai Li, S. Maskell
{"title":"Increasing the efficiency of Sequential Monte Carlo samplers through the use of approximately optimal L-kernels","authors":"P. L. Green, Robert E. Moore, Ryan J Jackson, Jinglai Li, S. Maskell","doi":"10.1016/j.ymssp.2021.108028","DOIUrl":"https://doi.org/10.1016/j.ymssp.2021.108028","url":null,"abstract":"","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2020-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76690681","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
Particle Methods for Stochastic Differential Equation Mixed Effects Models
arXiv: Computation Pub Date: 2019-07-25 DOI: 10.1214/20-ba1216
Imke Botha, R. Kohn, C. Drovandi
{"title":"Particle Methods for Stochastic Differential Equation Mixed Effects Models","authors":"Imke Botha, R. Kohn, C. Drovandi","doi":"10.1214/20-ba1216","DOIUrl":"https://doi.org/10.1214/20-ba1216","url":null,"abstract":"Parameter inference for stochastic differential equation mixed effects models (SDEMEMs) is a challenging problem. Analytical solutions for these models are rarely available, which means that the likelihood is also intractable. In this case, exact inference is possible using the pseudo-marginal method, where the intractable likelihood is replaced by its nonnegative unbiased estimate. A useful application of this idea is particle MCMC, which uses a particle filter estimate of the likelihood. While the exact posterior is targeted by these methods, a naive implementation for SDEMEMs can be highly inefficient. We develop three extensions to the naive approach which exploits specific aspects of SDEMEMs and other advances such as correlated pseudo-marginal methods. We compare these methods on real and simulated data from a tumour xenography study on mice.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78647733","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 17
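Pseudo-marginal MCMC replaces the intractable likelihood in the Metropolis-Hastings ratio with a nonnegative unbiased estimate and still targets the exact posterior, provided the estimate attached to the current state is retained rather than refreshed. The sketch below shows that accept/reject mechanism for a toy one-level random-effects model with an importance-sampling likelihood estimator, not an SDEMEM or a particle filter; the model, prior, proposal scale, and number of latent draws are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy random-effects model (illustrative): u_i ~ N(theta, 1), y_i | u_i ~ N(u_i, 1).
theta_true = 2.0
y = rng.normal(theta_true, 1.0, size=30) + rng.normal(size=30)

def log_lik_hat(theta, n_samples=64):
    """Log of a nonnegative unbiased estimate of prod_i p(y_i | theta), where
    p(y_i | theta) = ∫ N(y_i | u, 1) N(u | theta, 1) du, via one latent sample set per i."""
    u = rng.normal(theta, 1.0, size=(n_samples, y.size))        # independent latent draws
    log_dens = -0.5 * (y - u) ** 2 - 0.5 * np.log(2 * np.pi)    # N(y_i | u_mi, 1)
    log_p_i = np.logaddexp.reduce(log_dens, axis=0) - np.log(n_samples)
    return log_p_i.sum()     # product of independent unbiased per-observation estimates

def log_prior(theta):
    return -0.5 * theta ** 2 / 100.0      # N(0, 10^2) prior (illustrative)

def pseudo_marginal_mh(n_iter=5000, step=0.3):
    theta = 0.0
    log_post_hat = log_prior(theta) + log_lik_hat(theta)   # keep the estimate with the state
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        log_post_prop = log_prior(prop) + log_lik_hat(prop)
        if np.log(rng.uniform()) < log_post_prop - log_post_hat:
            theta, log_post_hat = prop, log_post_prop       # never re-estimate at the current state
        chain[i] = theta
    return chain

chain = pseudo_marginal_mh()
print("posterior mean estimate:", chain[1000:].mean())
```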
Nested sampling on non-trivial geometries
arXiv: Computation Pub Date: 2019-05-22 DOI: 10.5281/zenodo.3653182
K. Javid
{"title":"Nested sampling on non-trivial geometries","authors":"K. Javid","doi":"10.5281/zenodo.3653182","DOIUrl":"https://doi.org/10.5281/zenodo.3653182","url":null,"abstract":"Metropolis nested sampling evolves a Markov chain from a current livepoint and accepts new points along the chain according to a version of the Metropolis acceptance ratio modified to satisfy the likelihood constraint, characteristic of nested sampling algorithms. The geometric nested sampling algorithm we present here is a based on the Metropolis method, but treats parameters as though they represent points on certain geometric objects, namely circles, tori and spheres. For parameters which represent points on a circle or torus, the trial distribution is `wrapped' around the domain of the posterior distribution such that samples cannot be rejected automatically when evaluating the Metropolis ratio due to being outside the sampling domain. Furthermore, this enhances the mobility of the sampler. For parameters which represent coordinates on the surface of a sphere, the algorithm transforms the parameters into a Cartesian coordinate system before sampling which again makes sure no samples are automatically rejected, and provides a physically intutive way of the sampling the parameter space. \u0000We apply the geometric nested sampler to two types of toy model which include circular, toroidal and spherical parameters. We find that the geometric nested sampler generally outperforms textsc{MultiNest} in both cases. %We also apply the algorithm to a gravitational wave detection model which includes circular and spherical parameters, and find that the geometric nested sampler and textsc{MultiNest} appear to perform equally well as one another. Our implementation of the algorithm can be found at url{this https URL}.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75725631","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
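For a circular parameter, wrapping the proposal around the domain means a trial point can never fall outside it, so it is never rejected for that reason alone. The sketch below shows one constrained Metropolis update of that flavour for an angle on [0, 2*pi): a Gaussian step taken modulo 2*pi, accepted only when the hard likelihood constraint of nested sampling is satisfied. The toy likelihood, constraint level, and step size are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
TWO_PI = 2.0 * np.pi

def log_like(theta):
    # Toy von Mises-style likelihood peaked at pi (illustrative assumption).
    return 4.0 * np.cos(theta - np.pi)

def wrapped_constrained_step(theta, log_l_star, step=0.5):
    """One Metropolis move for a circular parameter under the nested-sampling hard
    constraint log_like(prop) > log_l_star. The proposal is wrapped modulo 2*pi,
    so it always lands inside the domain and is never auto-rejected for leaving it."""
    prop = (theta + step * rng.normal()) % TWO_PI      # wrap instead of reject
    if log_like(prop) <= log_l_star:
        return theta                                   # violates the likelihood constraint
    # Uniform prior on the circle and a symmetric wrapped proposal: Metropolis ratio is 1.
    return prop

# Evolve one live point under a fixed likelihood constraint for a few steps.
theta = rng.uniform(0.0, TWO_PI)
log_l_star = log_like(theta) - 1.0
for _ in range(20):
    theta = wrapped_constrained_step(theta, log_l_star)
print("final angle:", theta, "log-likelihood:", log_like(theta))
```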
Pushing the Limit: A Hybrid Parallel Implementation of the Multi-resolution Approximation for Massive Data
arXiv: Computation Pub Date: 2019-04-30 DOI: 10.5065/nnt6-q689
Huang Huang, Lewis R. Blake, D. Hammerling
{"title":"Pushing the Limit: A Hybrid Parallel Implementation of the Multi-resolution Approximation for Massive Data","authors":"Huang Huang, Lewis R. Blake, D. Hammerling","doi":"10.5065/nnt6-q689","DOIUrl":"https://doi.org/10.5065/nnt6-q689","url":null,"abstract":"The multi-resolution approximation (MRA) of Gaussian processes was recently proposed to conduct likelihood-based inference for massive spatial data sets. An advantage of the methodology is that it can be parallelized. We implemented the MRA in C++ for both serial and parallel versions. In the parallel implementation, we use a hybrid parallelism that employs both distributed and shared memory computing for communications between and within nodes by using the Message Passing Interface (MPI) and OpenMP, respectively. The performance of the serial code is compared between the C++ and MATLAB implementations over a small data set on a personal laptop. The C++ parallel program is further carefully studied under different configurations by applications to data sets from around a tenth of a million to 47 million observations. We show the practicality of this implementation by demonstrating that we can get quick inference for massive real-world data sets. The serial and parallel C++ code can be found at this https URL.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85795636","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 7
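The implementation described above uses MPI for communication between nodes and OpenMP within a node. The sketch below mirrors that two-level layout only loosely in Python: mpi4py plays the distributed role and a thread pool plays the shared-memory role on each rank. It is a structural analogue under stated assumptions (mpi4py installed, launched with something like mpirun -n 4 python hybrid_sketch.py), not the paper's C++ code, and process_block is a placeholder for the actual MRA computations.

```python
# Structural analogue of a hybrid distributed/shared-memory layout (assumes mpi4py is installed).
from concurrent.futures import ThreadPoolExecutor

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def process_block(block):
    """Placeholder for the per-block work (e.g. a local likelihood contribution)."""
    return float(np.sum(block ** 2))

# Rank 0 partitions the blocks into one chunk per rank (distributed level, like MPI ranks per node).
if rank == 0:
    rng = np.random.default_rng(0)
    blocks = [rng.normal(size=1000) for _ in range(8 * size)]
    chunks = [blocks[i::size] for i in range(size)]
else:
    chunks = None
my_blocks = comm.scatter(chunks, root=0)

# Within a rank, a thread pool works through the local blocks (shared-memory level, standing in
# for OpenMP; real speedup in Python depends on the underlying operations releasing the GIL).
with ThreadPoolExecutor(max_workers=4) as pool:
    local = sum(pool.map(process_block, my_blocks))

total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print("combined result:", total)
```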
On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction
arXiv: Computation Pub Date: 2019-02-01 DOI: 10.1093/biomet/asz078
M. Vihola, Jordan Franks
{"title":"On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction","authors":"M. Vihola, Jordan Franks","doi":"10.1093/biomet/asz078","DOIUrl":"https://doi.org/10.1093/biomet/asz078","url":null,"abstract":"Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: low tolerance leads to poor mixing and large tolerance entails excess bias. We consider an approach using a relatively large tolerance for the Markov chain Monte Carlo sampler to ensure its sufficient mixing, and post-processing the output leading to estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesian computation Markov chain Monte Carlo, which finds a `balanced' tolerance level automatically, based on acceptance rate optimisation. Our experiments show that post-processing based estimators can perform better than direct Markov chain targetting a fine tolerance, that our confidence intervals are reliable, and that our adaptive algorithm leads to reliable inference with little user specification.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85157685","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 10
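The idea above is to run ABC-MCMC once at a deliberately inflated tolerance, store the simulated distance alongside each state, and afterwards form estimators for a range of finer tolerances from the stored output. The sketch below illustrates one simple reading of that recipe on a toy Gaussian model, restricting the chain output to states whose recorded distance falls below each finer tolerance; it is a hedged sketch of the general idea, not the authors' post-corrected estimator, confidence interval, or adaptive tolerance selection.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy setting: data are n i.i.d. N(theta, 1) draws, summarised by their sample mean.
theta_true, n = 1.0, 100
y_obs = rng.normal(theta_true, 1.0, size=n)
s_obs = y_obs.mean()

def simulate_distance(theta):
    """Simulate pseudo-data and return the distance between summary statistics."""
    return abs(rng.normal(theta, 1.0, size=n).mean() - s_obs)

def abc_mcmc(eps, n_iter=20_000, step=0.5):
    """ABC-MCMC with an inflated tolerance eps; store the distance with each state."""
    theta, dist = 0.0, simulate_distance(0.0)
    thetas = np.empty(n_iter)
    dists = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        d_prop = simulate_distance(prop)
        # Flat prior and symmetric proposal (illustrative): accept iff the simulated
        # distance satisfies the inflated tolerance.
        if d_prop <= eps:
            theta, dist = prop, d_prop
        thetas[i], dists[i] = theta, dist
    return thetas, dists

thetas, dists = abc_mcmc(eps=0.5)

# Post-correction: reuse the stored distances to target a range of finer tolerances.
for delta in (0.5, 0.2, 0.1, 0.05):
    keep = dists <= delta
    print(f"tolerance {delta:>4}: kept {keep.mean():.0%} of states, "
          f"posterior mean estimate {thetas[keep].mean():.3f}")
```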
Precision Annealing Monte Carlo Methods for Statistical Data Assimilation: Metropolis-Hastings Procedures
arXiv: Computation Pub Date: 2019-01-14 DOI: 10.5194/NPG-2019-1
Adrian S. Wong, Kangbo Hao, Zheng Fang, H. Abarbanel
{"title":"Precision Annealing Monte Carlo Methods for Statistical Data Assimilation: Metropolis-Hastings Procedures","authors":"Adrian S. Wong, Kangbo Hao, Zheng Fang, H. Abarbanel","doi":"10.5194/NPG-2019-1","DOIUrl":"https://doi.org/10.5194/NPG-2019-1","url":null,"abstract":"Abstract. Statistical Data Assimilation (SDA) is the transfer of information from field or laboratory observations to a user selected model of the dynamical system producing those observations. The data is noisy and the model has errors; the information transfer addresses properties of the conditional probability distribution of the states of the model conditioned on the observations. The quantities of interest in SDA are the conditional expected values of functions of the model state, and these require the approximate evaluation of high dimensional integrals. We introduce a conditional probability distribution and use the Laplace method with annealing to identify the maxima of the conditional probability distribution. The annealing method slowly increases the precision term of the model as it enters the Laplace method. In this paper, we extend the idea of precision annealing (PA) to Monte Carlo calculations of conditional expected values using Metropolis-Hastings methods.\u0000","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-01-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82758601","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
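Precision annealing sweeps the model-error precision upward through a schedule while sampling the conditional distribution at each level with Metropolis-Hastings, warm-starting every level from the previous one. The sketch below applies that generic recipe to a toy one-dimensional data-assimilation target whose model-consistency term is multiplied by the annealed precision Rf; the dynamics, observation noise, schedule, and proposal scale are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy dynamics and observations (illustrative): x_{t+1} = 0.9 x_t, y_t = x_t + noise.
T = 30
x_true = 0.9 ** np.arange(T)
y = x_true + 0.1 * rng.normal(size=T)
Rm = 1.0 / 0.1 ** 2                        # measurement precision, assumed known

def f(x):
    return 0.9 * x

def neg_action(x, Rf):
    """Log of the unnormalised conditional density: measurement term plus model term."""
    meas = -0.5 * Rm * np.sum((y - x) ** 2)
    model = -0.5 * Rf * np.sum((x[1:] - f(x[:-1])) ** 2)
    return meas + model

def precision_annealing_mh(n_levels=20, Rf0=0.01, alpha=2.0,
                           iters_per_level=2000, step=0.05):
    """Metropolis-Hastings at each precision level Rf = Rf0 * alpha**beta,
    warm-starting every level from the previous one's final path."""
    x = y.copy()                           # initialise the state path at the observations
    for beta in range(n_levels):
        Rf = Rf0 * alpha ** beta
        log_p = neg_action(x, Rf)
        for _ in range(iters_per_level):
            prop = x + step * rng.normal(size=T)       # random-walk move on the whole path
            log_p_prop = neg_action(prop, Rf)
            if np.log(rng.uniform()) < log_p_prop - log_p:
                x, log_p = prop, log_p_prop
        print(f"beta={beta:2d}  Rf={Rf:10.2f}  "
              f"path RMSE={np.sqrt(np.mean((x - x_true) ** 2)):.3f}")
    return x

x_final = precision_annealing_mh()
```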
Sparse regression with Multi-type Regularized Feature modeling
arXiv: Computation Pub Date: 2018-10-07 DOI: 10.1016/j.insmatheco.2020.11.010
Sander Devriendt, Katrien Antonio, Tom Reynkens, Roel Verbelen
{"title":"Sparse regression with Multi-type Regularized Feature modeling","authors":"Sander Devriendt, Katrien Antonio, Tom Reynkens, Roel Verbelen","doi":"10.1016/j.insmatheco.2020.11.010","DOIUrl":"https://doi.org/10.1016/j.insmatheco.2020.11.010","url":null,"abstract":"","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2018-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90512921","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 20