The influence of dimensions on the complexity of computing decision trees
Stephen Kobourov, Maarten Löffler, Fabrizio Montecchiani, Marcin Pilipczuk, Ignaz Rutter, Raimund Seidel, Manuel Sorge, Jules Wulms
Artificial Intelligence, Volume 343 (2025), Article 104322. DOI: 10.1016/j.artint.2025.104322
Abstract
A decision tree recursively splits a feature space $\mathbb{R}^d$ and then assigns class labels based on the resulting partition. Decision trees have been part of the basic machine-learning toolkit for decades. A large body of work considers heuristic algorithms that compute a decision tree from training data, usually aiming in particular to minimize the size of the resulting tree. In contrast, little is known about the complexity of the underlying computational problem of computing a minimum-size tree for the given training data. We study this problem with respect to the number $d$ of dimensions of the feature space $\mathbb{R}^d$, which contains $n$ training examples. We show that it can be solved in $O(n^{2d+1})$ time, but under reasonable complexity-theoretic assumptions it is not possible to achieve $f(d) \cdot n^{o(d/\log d)}$ running time. The problem is solvable in $(dR)^{O(dR)} \cdot n^{1+o(1)}$ time if there are exactly two classes and $R$ is an upper bound on the number of tree leaves labeled with the first class.
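To make the model concrete, here is a minimal Python sketch of a decision tree over $\mathbb{R}^d$ as recursive axis-aligned threshold splits with class labels at the leaves. This is an illustration of the model only, not the paper's algorithm; all names (`Leaf`, `Split`, `classify`, `size`) are hypothetical.

```python
# A decision tree over R^d: inner nodes test one coordinate against a
# threshold, leaves assign class labels. Illustrative sketch only.
from dataclasses import dataclass
from typing import Sequence, Union

@dataclass
class Leaf:
    label: int  # class label assigned to this cell of the partition

@dataclass
class Split:
    dim: int          # which of the d coordinates to test
    threshold: float  # cut value along that coordinate
    left: "Node"      # subtree for points with x[dim] <= threshold
    right: "Node"     # subtree for points with x[dim] > threshold

Node = Union[Leaf, Split]

def classify(tree: Node, x: Sequence[float]) -> int:
    """Follow threshold splits from the root until a leaf labels x."""
    while isinstance(tree, Split):
        tree = tree.left if x[tree.dim] <= tree.threshold else tree.right
    return tree.label

def size(tree: Node) -> int:
    """Number of inner split nodes -- the size measure minimized above."""
    if isinstance(tree, Leaf):
        return 0
    return 1 + size(tree.left) + size(tree.right)

# Example: a size-2 tree over R^2 (two splits, three leaves).
tree = Split(dim=0, threshold=0.5,
             left=Leaf(label=0),
             right=Split(dim=1, threshold=0.3,
                         left=Leaf(label=1), right=Leaf(label=0)))
print(classify(tree, (0.7, 0.2)))  # -> 1
print(size(tree))                  # -> 2
```

In this representation, the parameter $R$ from the abstract bounds, in the two-class setting, the number of `Leaf` nodes carrying the first class label.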
About the Journal
The journal Artificial Intelligence (AIJ) welcomes papers covering a broad spectrum of AI topics, including cognition, automated reasoning, computer vision, machine learning, and more. Papers should demonstrate advancements in AI and propose innovative approaches to AI problems. The journal also accepts papers describing AI applications, provided they focus on how new methods enhance performance rather than reiterating conventional approaches. In addition to regular papers, AIJ accepts Research Notes, Research Field Reviews, Position Papers, Book Reviews, and summary papers on AI challenges and competitions.