{"title":"Testing the validity of instrumental variables in just-identified linear non-Gaussian models.","authors":"Wolfgang Wiedermann, Dexin Shi","doi":"10.1111/bmsp.70000","DOIUrl":null,"url":null,"abstract":"<p><p>Instrumental variable (IV) estimation constitutes a powerful quasi-experimental tool to estimate causal effects in observational data. The IV approach, however, rests on two crucial assumptions-the instrument relevance assumption and the exclusion restriction assumption. The latter requirement (stating that the IV is not allowed to be related to the outcome via any path other than the one going through the predictor), cannot be empirically tested in just-identified models (i.e. models with as many IVs as predictors). The present study introduces properties of non-Gaussian IV models which enable one to test whether hidden confounding between an IV and the outcome is present. Detecting exclusion restriction violations due to a direct path between the IV and the outcome, however, is restricted to the over-identified case. Based on these insights, a two-step approach is presented to test IV validity against hidden confounding in just-identified models. The performance of the approach was evaluated using Monte-Carlo simulation experiments. An empirical example from psychological research is given to illustrate the approach in practice. Recommendations for best-practice applications and future research directions are discussed. Although the current study presents important insights for developing diagnostic procedures for IV models, sound universal IV validation in the just-identified case remains a challenging task.</p>","PeriodicalId":55322,"journal":{"name":"British Journal of Mathematical & Statistical Psychology","volume":" ","pages":""},"PeriodicalIF":1.8000,"publicationDate":"2025-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"British Journal of Mathematical & Statistical Psychology","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1111/bmsp.70000","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0
Abstract
Instrumental variable (IV) estimation constitutes a powerful quasi-experimental tool to estimate causal effects in observational data. The IV approach, however, rests on two crucial assumptions: the instrument relevance assumption and the exclusion restriction assumption. The latter requirement (which states that the IV may not be related to the outcome via any path other than the one going through the predictor) cannot be empirically tested in just-identified models (i.e., models with as many IVs as predictors). The present study introduces properties of non-Gaussian IV models which enable one to test whether hidden confounding between an IV and the outcome is present. Detecting exclusion restriction violations due to a direct path between the IV and the outcome, however, is restricted to the over-identified case. Based on these insights, a two-step approach is presented to test IV validity against hidden confounding in just-identified models. The performance of the approach was evaluated using Monte Carlo simulation experiments. An empirical example from psychological research is given to illustrate the approach in practice. Recommendations for best-practice applications and future research directions are discussed. Although the current study presents important insights for developing diagnostic procedures for IV models, sound universal IV validation in the just-identified case remains a challenging task.
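The diagnostic idea rests on a property of just-identified two-stage least squares (2SLS): the moment condition forces the instrument to be exactly uncorrelated with the IV residuals, so any remaining trace of hidden confounding can only appear in higher-order dependence, which is informative precisely when the errors are non-Gaussian. The sketch below is a minimal illustration of that general idea, not the authors' two-step procedure. The data-generating process, variable names, and the choice of higher-moment permutation statistic are all assumptions made for illustration.

```python
# Illustrative sketch (not the paper's exact two-step test): in a just-
# identified linear IV model, the 2SLS moment condition makes the instrument
# z exactly uncorrelated with the IV residuals, so hidden confounding between
# z and the outcome y can only surface in higher-order dependence, which is
# detectable when the errors are non-Gaussian. The DGP and the permutation
# statistic below are hypothetical choices for demonstration.
import numpy as np

rng = np.random.default_rng(1)
n = 2000

def simulate(confound_iv=0.0):
    """Just-identified design z -> x -> y with skewed (exponential) errors.
    confound_iv > 0 routes a hidden confounder c into z, i.e., hidden
    confounding between the instrument and the outcome."""
    c = rng.exponential(1.0, n) - 1.0            # mean-zero, skewed confounder
    z = rng.exponential(1.0, n) - 1.0 + confound_iv * c
    u = rng.exponential(1.0, n) - 1.0 + 0.5 * c  # outcome error shares c
    e = rng.exponential(1.0, n) - 1.0
    x = 1.0 * z + 0.5 * c + e                    # predictor is confounded with y
    y = 0.8 * x + u
    return z, x, y

def iv_2sls(z, x, y):
    """Just-identified IV slope: beta = cov(z, y) / cov(z, x)."""
    beta = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]
    return beta, y - beta * x

def higher_moment_pvalue(z, resid, n_perm=999):
    """Permutation test of higher-order dependence between z and the IV
    residuals, using |corr(z, resid^2)| and |corr(z^2, resid)|."""
    def stat(r):
        return max(abs(np.corrcoef(z, r ** 2)[0, 1]),
                   abs(np.corrcoef(z ** 2, r)[0, 1]))
    observed = stat(resid)
    perm = np.array([stat(rng.permutation(resid)) for _ in range(n_perm)])
    return (1 + np.sum(perm >= observed)) / (1 + n_perm)

for label, gamma in [("valid IV", 0.0), ("confounded IV", 0.8)]:
    z, x, y = simulate(confound_iv=gamma)
    beta, resid = iv_2sls(z, x, y)
    # corr(z, resid) is zero by construction (up to floating point) in the
    # just-identified case; only the higher-moment p-value is informative.
    print(label,
          "beta_IV =", round(beta, 3),
          "corr(z, resid) =", round(np.corrcoef(z, resid)[0, 1], 3),
          "p(higher moments) =", round(higher_moment_pvalue(z, resid), 3))
```

With Gaussian errors the same higher-moment statistics carry no information, which is why a diagnostic of this kind hinges on non-Gaussianity of the model errors.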
Journal description:
The British Journal of Mathematical and Statistical Psychology publishes articles relating to areas of psychology which have a greater mathematical or statistical aspect to their argument than is usually acceptable to other journals, including:
• mathematical psychology
• statistics
• psychometrics
• decision making
• psychophysics
• classification
• relevant areas of mathematics, computing and computer software
These include articles that address substantive psychological issues or that develop and extend techniques useful to psychologists. New models for psychological processes, new approaches to existing data, critiques of existing models and improved algorithms for estimating the parameters of a model are examples of articles which may be favoured.