{"title":"Approximate Matching","authors":"Ben Langmead","doi":"10.12987/9780300255881-017","DOIUrl":"https://doi.org/10.12987/9780300255881-017","url":null,"abstract":"You are free to use these slides. If you do, please sign the guestbook (www.langmead-lab.org/teaching-materials), or email me (ben.langmead@gmail.com) and tell me briey how you're using them. For original Keynote les, email me.","PeriodicalId":48576,"journal":{"name":"Journal of Causal Inference","volume":"11 1","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81736588","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Radical empiricism and machine learning research","authors":"J. Pearl","doi":"10.1515/jci-2021-0006","DOIUrl":"https://doi.org/10.1515/jci-2021-0006","url":null,"abstract":"Abstract I contrast the “data fitting” vs “data interpreting” approaches to data science along three dimensions: Expediency, Transparency, and Explainability. “Data fitting” is driven by the faith that the secret to rational decisions lies in the data itself. In contrast, the data-interpreting school views data, not as a sole source of knowledge but as an auxiliary means for interpreting reality, and “reality” stands for the processes that generate the data. I argue for restoring balance to data science through a task-dependent symbiosis of fitting and interpreting, guided by the Logic of Causation.","PeriodicalId":48576,"journal":{"name":"Journal of Causal Inference","volume":"47 1","pages":"78 - 82"},"PeriodicalIF":1.4,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76849388","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Identification of causal intervention effects under contagion.","authors":"Xiaoxuan Cai, Wen Wei Loh, Forrest W Crawford","doi":"10.1515/jci-2019-0033","DOIUrl":"10.1515/jci-2019-0033","url":null,"abstract":"<p><p>Defining and identifying causal intervention effects for transmissible infectious disease outcomes is challenging because a treatment - such as a vaccine - given to one individual may affect the infection outcomes of others. Epidemiologists have proposed causal estimands to quantify effects of interventions under contagion using a two-person partnership model. These simple conceptual models have helped researchers develop causal estimands relevant to clinical evaluation of vaccine effects. However, many of these partnership models are formulated under structural assumptions that preclude realistic infectious disease transmission dynamics, limiting their conceptual usefulness in defining and identifying causal treatment effects in empirical intervention trials. In this paper, we propose causal intervention effects in two-person partnerships under arbitrary infectious disease transmission dynamics, and give nonparametric identification results showing how effects can be estimated in empirical trials using time-to-infection or binary outcome data. The key insight is that contagion is a causal phenomenon that induces conditional independencies on infection outcomes that can be exploited for the identification of clinically meaningful causal estimands. 
These new estimands are compared to existing quantities, and results are illustrated using a realistic simulation of an HIV vaccine trial.</p>","PeriodicalId":48576,"journal":{"name":"Journal of Causal Inference","volume":"9 1","pages":"9-38"},"PeriodicalIF":1.4,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8528235/pdf/nihms-1684027.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39541807","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the bias of adjusting for a non-differentially mismeasured discrete confounder","authors":"J. Peña, Sourabh Vivek Balgi, A. Sjölander, E. Gabriel","doi":"10.1515/jci-2021-0033","DOIUrl":"https://doi.org/10.1515/jci-2021-0033","url":null,"abstract":"Abstract Biological and epidemiological phenomena are often measured with error or imperfectly captured in data. When the true state of this imperfect measure is a confounder of an outcome exposure relationship of interest, it was previously widely believed that adjustment for the mismeasured observed variables provides a less biased estimate of the true average causal effect than not adjusting. However, this is not always the case and depends on both the nature of the measurement and confounding. We describe two sets of conditions under which adjusting for a non-deferentially mismeasured proxy comes closer to the unidentifiable true average causal effect than the unadjusted or crude estimate. The first set of conditions apply when the exposure is discrete or continuous and the confounder is ordinal, and the expectation of the outcome is monotonic in the confounder for both treatment levels contrasted. The second set of conditions apply when the exposure and the confounder are categorical (nominal). 
In all settings, the mismeasurement must be non-differential, as differential mismeasurement, particularly an unknown pattern, can cause unpredictable results.","PeriodicalId":48576,"journal":{"name":"Journal of Causal Inference","volume":"265 1","pages":"229 - 249"},"PeriodicalIF":1.4,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72830204","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Causal versions of maximum entropy and principle of insufficient reason","authors":"D. Janzing","doi":"10.1515/jci-2021-0022","DOIUrl":"https://doi.org/10.1515/jci-2021-0022","url":null,"abstract":"Abstract The principle of insufficient reason (PIR) assigns equal probabilities to each alternative of a random experiment whenever there is no reason to prefer one over the other. The maximum entropy principle (MaxEnt) generalizes PIR to the case where statistical information like expectations are given. It is known that both principles result in paradoxical probability updates for joint distributions of cause and effect. This is because constraints on the conditional P ( effect ∣ cause ) Pleft({rm{effect}}| {rm{cause}}) result in changes of P ( cause ) Pleft({rm{cause}}) that assign higher probability to those values of the cause that offer more options for the effect, suggesting “intentional behavior.” Earlier work therefore suggested sequentially maximizing (conditional) entropy according to the causal order, but without further justification apart from plausibility on toy examples. We justify causal modifications of PIR and MaxEnt by separating constraints into restrictions for the cause and restrictions for the mechanism that generates the effect from the cause. 
We further sketch why causal PIR also entails “Information Geometric Causal Inference.” We briefly discuss problems of generalizing the causal version of MaxEnt to arbitrary causal DAGs.","PeriodicalId":48576,"journal":{"name":"Journal of Causal Inference","volume":"73 1","pages":"285 - 301"},"PeriodicalIF":1.4,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76776803","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Designing experiments informed by observational studies","authors":"Evan T. R. Rosenman, A. Owen","doi":"10.1515/jci-2021-0010","DOIUrl":"https://doi.org/10.1515/jci-2021-0010","url":null,"abstract":"Abstract The increasing availability of passively observed data has yielded a growing interest in “data fusion” methods, which involve merging data from observational and experimental sources to draw causal conclusions. Such methods often require a precarious tradeoff between the unknown bias in the observational dataset and the often-large variance in the experimental dataset. We propose an alternative approach, which avoids this tradeoff: rather than using observational data for inference, we use it to design a more efficient experiment. We consider the case of a stratified experiment with a binary outcome and suppose pilot estimates for the stratum potential outcome variances can be obtained from the observational study. We extend existing results to generate confidence sets for these variances, while accounting for the possibility of unmeasured confounding. Then, we pose the experimental design problem as a regret minimization problem subject to the constraints imposed by our confidence sets. We show that this problem can be converted into a concave maximization and solved using conventional methods. 
Finally, we demonstrate the practical utility of our methods using data from the Women’s Health Initiative.","PeriodicalId":48576,"journal":{"name":"Journal of Causal Inference","volume":"51 1","pages":"147 - 171"},"PeriodicalIF":1.4,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83776795","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}