{"title":"Optimal Experimental Design Using A Consistent Bayesian Approach","authors":"Scott N. Walsh, T. Wildey, J. Jakeman","doi":"10.1115/1.4037457","DOIUrl":"https://doi.org/10.1115/1.4037457","url":null,"abstract":"We consider the utilization of a computational model to guide the optimal acquisition of experimental data to inform the stochastic description of model input parameters. Our formulation is based on the recently developed consistent Bayesian approach for solving stochastic inverse problems which seeks a posterior probability density that is consistent with the model and the data in the sense that the push-forward of the posterior (through the computational model) matches the observed density on the observations almost everywhere. Given a set a potential observations, our optimal experimental design (OED) seeks the observation, or set of observations, that maximizes the expected information gain from the prior probability density on the model parameters. We discuss the characterization of the space of observed densities and a computationally efficient approach for rescaling observed densities to satisfy the fundamental assumptions of the consistent Bayesian approach. Numerical results are presented to compare our approach with existing OED methodologies using the classical/statistical Bayesian approach and to demonstrate our OED on a set of representative PDE-based models.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2017-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82551928","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Parsimonious Adaptive Rejection Sampling","authors":"Luca Martino","doi":"10.1049/EL.2017.1711","DOIUrl":"https://doi.org/10.1049/EL.2017.1711","url":null,"abstract":"Monte Carlo (MC) methods have become very popular in signal processing during the past decades. The adaptive rejection sampling (ARS) algorithms are well-known MC technique which draw efficiently independent samples from univariate target densities. The ARS schemes yield a sequence of proposal functions that converge toward the target, so that the probability of accepting a sample approaches one. However, sampling from the proposal pdf becomes more computationally demanding each time it is updated. We propose the Parsimonious Adaptive Rejection Sampling (PARS) method, where an efficient trade-off between acceptance rate and proposal complexity is obtained. Thus, the resulting algorithm is faster than the standard ARS approach.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2017-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81044148","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Estimating Spatial Econometrics Models with Integrated Nested Laplace Approximation","authors":"V. Gómez‐Rubio, R. Bivand, H. Rue","doi":"10.3390/MATH9172044","DOIUrl":"https://doi.org/10.3390/MATH9172044","url":null,"abstract":"Integrated Nested Laplace Approximation provides a fast and effective method for marginal inference on Bayesian hierarchical models. This methodology has been implemented in the R-INLA package which permits INLA to be used from within R statistical software. Although INLA is implemented as a general methodology, its use in practice is limited to the models implemented in the R-INLA package. \u0000Spatial autoregressive models are widely used in spatial econometrics but have until now been missing from the R-INLA package. In this paper, we describe the implementation and application of a new class of latent models in INLA made available through R-INLA. This new latent class implements a standard spatial lag model, which is widely used and that can be used to build more complex models in spatial econometrics. \u0000The implementation of this latent model in R-INLA also means that all the other features of INLA can be used for model fitting, model selection and inference in spatial econometrics, as will be shown in this paper. Finally, we will illustrate the use of this new latent model and its applications with two datasets based on Gaussian and binary outcomes.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2017-03-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83391046","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tutorial in Joint Modeling and Prediction: a Statistical Software for Correlated Longitudinal Outcomes, Recurrent Events and a Terminal Event","authors":"Agnieszka Kr'ol, A. Mauguen, Yassin Mazroui, Alexandre Laurent, S. Michiels, V. Rondeau","doi":"10.18637/jss.v081.i03","DOIUrl":"https://doi.org/10.18637/jss.v081.i03","url":null,"abstract":"Extensions in the field of joint modeling of correlated data and dynamic predictions improve the development of prognosis research. The R package frailtypack provides estimations of various joint models for longitudinal data and survival events. In particular, it fits models for recurrent events and a terminal event (frailtyPenal), models for two survival outcomes for clustered data (frailtyPenal), models for two types of recurrent events and a terminal event (multivPenal), models for a longitudinal biomarker and a terminal event (longiPenal) and models for a longitudinal biomarker, recurrent events and a terminal event (trivPenal). The estimators are obtained using a standard and penalized maximum likelihood approach, each model function allows to evaluate goodness-of-fit analyses and plots of baseline hazard functions. Finally, the package provides individual dynamic predictions of the terminal event and evaluation of predictive accuracy. 
This paper presents theoretical models with estimation techniques, applies the methods for predictions and illustrates frailtypack functions details with examples.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2017-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85044450","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Calculating probabilistic excursion sets and related quantities using excursions","authors":"D. Bolin, F. Lindgren","doi":"10.18637/JSS.V086.I05","DOIUrl":"https://doi.org/10.18637/JSS.V086.I05","url":null,"abstract":"The R software package excursions contains methods for calculating probabilistic excursion sets, contour credible regions, and simultaneous confidence bands for latent Gaussian stochastic processes and fields. It also contains methods for uncertainty quantification of contour maps and computation of Gaussian integrals. This article describes the theoretical and computational methods used in the package. The main functions of the package are introduced and two examples illustrate how the package can be used.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2016-12-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81840384","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ADDT : An R Package for Analysis of Accelerated Destructive Degradation Test Data","authors":"Zhongnan Jin, Yimeng Xie, Yili Hong, J. V. Mullekom","doi":"10.1007/978-981-10-5194-4_14","DOIUrl":"https://doi.org/10.1007/978-981-10-5194-4_14","url":null,"abstract":"","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2016-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81557823","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Local Kernel Dimension Reduction in Approximate Bayesian Computation","authors":"Jin Zhou, K. Fukumizu","doi":"10.4236/OJS.2018.83031","DOIUrl":"https://doi.org/10.4236/OJS.2018.83031","url":null,"abstract":"Approximate Bayesian Computation (ABC) is a popular sampling method in applications involving intractable likelihood functions. Without evaluating the likelihood function, ABC approximates the posterior distribution by the set of accepted samples which are simulated with parameters drawn from the prior distribution, where acceptance is determined by the distance between the summary statistics of the sample and the observation. The sufficiency and dimensionality of the summary statistics play a central role in the application of ABC. This paper proposes Local Gradient Kernel Dimension Reduction (LGKDR) to construct low dimensional summary statistics for ABC. The proposed method identifies a sufficient subspace of the original summary statistics by implicitly considers all nonlinear transforms therein, and a weighting kernel is used for the concentration of the projections. No strong assumptions are made on the marginal distributions nor the regression model, permitting usage in a wide range of applications. Experiments are done with both simple rejection ABC and sequential Monte Carlo ABC methods. 
Results are reported as competitive in the former and substantially better in the latter cases in which Monte Carlo errors are compressed as much as possible.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2016-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77129459","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
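The simple rejection-ABC baseline this abstract builds on can be sketched on a toy Gaussian-mean problem. The prior, summary statistic, and tolerance below are illustrative assumptions, not the paper's LGKDR construction:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy problem: infer the mean of a Gaussian with known sd from a sample mean.
observed = rng.normal(3.0, 1.0, 100)
s_obs = observed.mean()                       # one-dimensional summary statistic

def simulate(theta):
    # Forward-simulate data under theta and reduce to the same summary.
    return rng.normal(theta, 1.0, 100).mean()

# Rejection ABC: draw theta from the prior, keep it when the simulated
# summary falls within epsilon of the observed one (no likelihood evaluated).
prior_draws = rng.uniform(-10, 10, 50000)
eps = 0.1
accepted = np.array([t for t in prior_draws if abs(simulate(t) - s_obs) < eps])
```

The accepted draws approximate the posterior; in higher dimensions the acceptance rate collapses, which is why constructing low-dimensional yet sufficient summaries, as the paper proposes, matters.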
{"title":"Fast Simulation of Hyperplane-Truncated Multivariate Normal Distributions","authors":"Yulai Cong, Bo Chen, Mingyuan Zhou","doi":"10.1214/17-BA1052","DOIUrl":"https://doi.org/10.1214/17-BA1052","url":null,"abstract":"We introduce a fast and easy-to-implement simulation algorithm for a multivariate normal distribution truncated on the intersection of a set of hyperplanes, and further generalize it to efficiently simulate random variables from a multivariate normal distribution whose covariance (precision) matrix can be decomposed as a positive-definite matrix minus (plus) a low-rank symmetric matrix. Example results illustrate the correctness and efficiency of the proposed simulation algorithms.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2016-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80329910","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robust and scalable Bayesian analysis of spatial neural tuning function data","authors":"Kamiar Rahnama Rad, Timothy A. Machado, L. Paninski","doi":"10.1214/16-AOAS996","DOIUrl":"https://doi.org/10.1214/16-AOAS996","url":null,"abstract":"A common analytical problem in neuroscience is the interpretation of neural activity with respect to sensory input or behavioral output. This is typically achieved by regressing measured neural activity against known stimuli or behavioral variables to produce a \"tuning function\" for each neuron. Unfortunately, because this approach handles neurons individually, it cannot take advantage of simultaneous measurements from spatially adjacent neurons that often have similar tuning properties. On the other hand, sharing information between adjacent neurons can errantly degrade estimates of tuning functions across space if there are sharp discontinuities in tuning between nearby neurons. In this paper, we develop a computationally efficient block Gibbs sampler that effectively pools information between neurons to de-noise tuning function estimates while simultaneously preserving sharp discontinuities that might exist in the organization of tuning across space. This method is fully Bayesian and its computational cost per iteration scales sub-quadratically with total parameter dimensionality. We demonstrate the robustness and scalability of this approach by applying it to both real and synthetic datasets. 
In particular, an application to data from the spinal cord illustrates that the proposed methods can dramatically decrease the experimental time required to accurately estimate tuning functions.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2016-06-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84636232","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}