Tree smoothing: Post-hoc regularization of tree ensembles for interpretable machine learning
Bastian Pfeifer, Arne Gevaert, Markus Loecher, Andreas Holzinger
Information Sciences, Volume 690, Article 121564 (published 2024-10-16). DOI: 10.1016/j.ins.2024.121564
Citation count: 0
Abstract
Random Forests (RFs) are powerful ensemble learning algorithms that are widely used in various machine learning tasks. However, they tend to overfit noisy or irrelevant features, which can result in decreased generalization performance. Post-hoc regularization techniques aim to solve this problem by modifying the structure of the learned ensemble after training. We propose a novel post-hoc regularization via tree smoothing for classification tasks that leverages the reliable class distributions closer to the root node whilst reducing the impact of more specific and potentially noisy splits deeper in the tree. Our approach allows for a form of pruning that does not alter the general structure of the trees, adjusting the influence of nodes based on their proximity to the root node. We evaluated the performance of our method on various machine learning benchmark data sets and on cancer data from The Cancer Genome Atlas (TCGA). Our approach demonstrates competitive performance compared to the state of the art and, in the majority of cases, outperforms it in terms of prediction accuracy, generalization, and interpretability.
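To make the idea concrete, below is a minimal sketch of one way such post-hoc smoothing could be applied to a fitted scikit-learn random forest. The blending rule (a hierarchical-shrinkage-style damped update controlled by a parameter `lam`), the function names `smoothed_node_probas` and `smoothed_predict_proba`, and the choice of data set are illustrative assumptions, not the exact formulation evaluated in the paper.

```python
# Sketch: post-hoc smoothing of a fitted scikit-learn random forest.
# Each node's class distribution is blended toward its ancestors', so deep,
# data-poor splits contribute less to the final prediction.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split


def smoothed_node_probas(decision_tree, lam=10.0):
    """Return per-node class distributions blended from the root downward.

    A node's smoothed distribution is its parent's smoothed distribution plus
    a damped update; the damping grows as node sample counts shrink, so splits
    deep in the tree are pulled toward the root. `lam` sets the damping strength.
    """
    t = decision_tree.tree_
    # Normalise stored values to class proportions (works whether sklearn
    # stores counts or fractions in tree_.value).
    raw = t.value[:, 0, :]
    probs = raw / raw.sum(axis=1, keepdims=True)
    n = t.n_node_samples.astype(float)

    smoothed = np.empty_like(probs)
    stack = [(0, None)]                      # (node_id, parent_id); node 0 is the root
    while stack:
        node, parent = stack.pop()
        if parent is None:
            smoothed[node] = probs[node]     # the root keeps its own distribution
        else:
            damp = 1.0 + lam / n[parent]
            smoothed[node] = smoothed[parent] + (probs[node] - probs[parent]) / damp
        left, right = t.children_left[node], t.children_right[node]
        if left != -1:                       # internal node: visit both children
            stack.append((left, node))
            stack.append((right, node))
    return smoothed


def smoothed_predict_proba(forest, X, lam=10.0):
    """Average the smoothed leaf distributions over all trees in the forest."""
    probas = np.zeros((X.shape[0], forest.n_classes_))
    for est in forest.estimators_:
        node_probs = smoothed_node_probas(est, lam)
        probas += node_probs[est.apply(X)]   # apply() maps samples to leaf ids
    return probas / len(forest.estimators_)


X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

acc_raw = (rf.predict(X_te) == y_te).mean()
acc_smooth = (smoothed_predict_proba(rf, X_te, lam=10.0).argmax(axis=1) == y_te).mean()
print(f"plain RF accuracy: {acc_raw:.3f}  smoothed RF accuracy: {acc_smooth:.3f}")
```

Note that the sketch leaves the trees themselves untouched; only the class distributions used at prediction time are re-weighted, which mirrors the paper's point that smoothing acts like pruning without altering the tree structure.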
About the journal:
Information Sciences (Informatics and Computer Science, Intelligent Systems, Applications) is an international journal that publishes original and creative research findings in the field of information sciences, together with a limited number of timely tutorial and survey contributions.
Our journal aims to cater to a diverse audience, including researchers, developers, managers, strategic planners, graduate students, and anyone interested in staying up-to-date with cutting-edge research in information science, knowledge engineering, and intelligent systems. While readers are expected to share a common interest in information science, they come from varying backgrounds such as engineering, mathematics, statistics, physics, computer science, cell biology, molecular biology, management science, cognitive science, neurobiology, behavioral sciences, and biochemistry.