Gaussian Graphical Models
M. Maathuis, M. Drton, S. Lauritzen, M. Wainwright
DOI: 10.1201/9780429463976-9 (published 2018-11-12)
Abstract: This chapter describes graphical models for multivariate continuous data based on the Gaussian (normal) distribution. We gently introduce the undirected models by examining the partial correlation structure of two sets of data, one relating to meat composition of pig carcasses and the other to body fat measurements. We then give a concise exposition of the model theory, covering topics such as maximum likelihood estimation using the IPS algorithm, hypothesis testing, and decomposability. We also explain the close relation between the models and linear regression models. We describe various approaches to model selection, including stepwise selection, the glasso algorithm, and the SIN algorithm, and apply these to the example datasets. We then turn to directed Gaussian graphical models that can be represented as DAGs. We explain a key concept, Markov equivalence, and describe how certain mixed graphs called pDAGs and essential graphs are used to represent equivalence classes of models. We describe various model selection algorithms for directed Gaussian models, including the PC algorithm, the hill-climbing algorithm, and the max-min hill-climbing algorithm, and apply them to the example datasets. Finally, we briefly describe Gaussian chain graph models and illustrate the use of a model selection algorithm for these models.
{"title":"Algorithms and Data Structures for Exact Computation of Marginals","authors":"J. Bilmes","doi":"10.1201/9780429463976-4","DOIUrl":"https://doi.org/10.1201/9780429463976-4","url":null,"abstract":"","PeriodicalId":336063,"journal":{"name":"Handbook of Graphical Models","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122249452","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mediation Analysis","authors":"Johan Steen, S. Vansteelandt","doi":"10.1201/9780429463976-17","DOIUrl":"https://doi.org/10.1201/9780429463976-17","url":null,"abstract":"Mediation analysis seeks to infer how much of the effect of an exposure on an outcome can be attributed to specific pathways via intermediate variables or mediators. This requires identification of so-called path-specific effects. These express how a change in exposure affects those intermediate variables (along certain pathways), and how the resulting changes in those variables in turn affect the outcome (along subsequent pathways). However, unlike identification of total effects, adjustment for confounding is insufficient for identification of path-specific effects because their magnitude is also determined by the extent to which individuals who experience large exposure effects on the mediator, tend to experience relatively small or large mediator effects on the outcome. This chapter therefore provides an accessible review of identification strategies under general nonparametric structural equation models (with possibly unmeasured variables), which rule out certain such dependencies. In particular, it is shown which path-specific effects can be identified under such models, and how this can be done.","PeriodicalId":336063,"journal":{"name":"Handbook of Graphical Models","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-01-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133408386","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Latent Tree Models","authors":"Piotr Zwiernik","doi":"10.1201/9780429463976-11","DOIUrl":"https://doi.org/10.1201/9780429463976-11","url":null,"abstract":"Latent tree models are graphical models defined on trees, in which only a subset of variables is observed. They were first discussed by Judea Pearl as tree-decomposable distributions to generalise star-decomposable distributions such as the latent class model. Latent tree models, or their submodels, are widely used in: phylogenetic analysis, network tomography, computer vision, causal modeling, and data clustering. They also contain other well-known classes of models like hidden Markov models, Brownian motion tree model, the Ising model on a tree, and many popular models used in phylogenetics. This article offers a concise introduction to the theory of latent tree models. We emphasise the role of tree metrics in the structural description of this model class, in designing learning algorithms, and in understanding fundamental limits of what and when can be learned.","PeriodicalId":336063,"journal":{"name":"Handbook of Graphical Models","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114069386","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sequential Monte Carlo Methods","authors":"A. Doucet","doi":"10.1002/0471667196.ESS5089","DOIUrl":"https://doi.org/10.1002/0471667196.ESS5089","url":null,"abstract":"Algorithm 1 Bootstrap particle filter (for i = 1, . . . , N) 1. Initialization (t = 0): (a) Sample x i 0 ∼ p(x0). (b) Set initial weights: w i 0 = 1/N. 2. for t = 1 to T do (a) Resample: sample ancestor indices ai t ∼ C({w j t−1}j=1). (b) Propagate: sample x i t ∼ p(xt | x ai t t−1). x i 0:t = {x ai t 0:t−1, x i t}. (c) Weight: compute w̃ i t = p(yt | x i t) and normalize w i t = w̃ i t/ ∑N j=1 w̃ j t . The ancestor indices {ai t}i=1 allow us to keep track of exactly what happens in each resampling step. Note the bookkeeping added to the propagation step 2b. 2/22 Bookkeeping – ancestral path","PeriodicalId":336063,"journal":{"name":"Handbook of Graphical Models","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124101858","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}