An information-theoretic approach for heterogeneous differentiable causal discovery

Wanqi Zhou, Shuanghao Bai, Yuqing Xie, Yicong He, Qibin Zhao, Badong Chen

Neural Networks, Volume 188, Article 107417 (published 2025-03-24)
DOI: 10.1016/j.neunet.2025.107417
URL: https://www.sciencedirect.com/science/article/pii/S0893608025002965
Citation count: 0
Abstract
With the advancement of deep learning, a variety of differentiable causal discovery methods have emerged, attracting attention for their scalability and interpretability. However, these methods often struggle with complex heterogeneous datasets that exhibit environmental diversity and shifts in noise distribution. To address this, we introduce a novel information-theoretic approach designed to enhance the robustness of differentiable causal discovery methods. Specifically, we integrate Minimum Error Entropy (MEE) into the structure learning framework as an adaptive error regulator. MEE reduces error variability across diverse samples, enabling the model to adapt dynamically to varying levels of complexity and noise, which significantly improves its precision and stability. Extensive experiments on both synthetic and real-world datasets demonstrate significant performance gains over existing methods, affirming the effectiveness of our approach. The code is available at https://github.com/ElleZWQ/MHCD.
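For readers unfamiliar with the criterion, the MEE loss mentioned above is commonly defined as Rényi's quadratic entropy of the model residuals, estimated with a Parzen (Gaussian-kernel) window over pairwise error differences. The sketch below is a minimal NumPy illustration of that standard estimator, not the paper's exact implementation; the function name `mee_loss` and the bandwidth parameter `sigma` are hypothetical, and how the term enters the structure-learning objective may differ in the authors' code.

```python
import numpy as np

def mee_loss(errors: np.ndarray, sigma: float = 1.0) -> float:
    """Renyi's quadratic entropy of a 1-D residual vector (MEE criterion).

    Minimizing this quantity concentrates the error distribution,
    which is what makes MEE robust to non-Gaussian, shifting noise.
    """
    # Pairwise differences e_i - e_j between residuals (N x N matrix).
    diff = errors[:, None] - errors[None, :]
    # Gaussian Parzen window evaluated on each pairwise difference.
    kernel = np.exp(-diff**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))
    # Information potential V(e): mean of the pairwise kernel values.
    v = kernel.mean()
    # H2(e) = -log V(e); smaller entropy means more concentrated errors.
    return -np.log(v)
```

As a sanity check, identical residuals (fully concentrated errors) yield a lower entropy than widely spread ones, so gradient-based minimization of this term pushes residuals toward a tight cluster rather than merely toward small mean-squared magnitude.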
Journal description:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.