{"title":"HOT: heuristics for oblique trees","authors":"V. Iyengar","doi":"10.1109/TAI.1999.809771","DOIUrl":null,"url":null,"abstract":"This paper presents a new method (HOT) of generating oblique decision trees. Oblique trees have been shown to be useful tools for classification in some problem domains, producing accurate and intuitive solutions. The method can be incorporated into a variety of existing decision tree tools and the paper illustrates this with two very distinct tree generators. The key idea is a method of learning oblique vectors and using the corresponding families of hyperplanes orthogonal to these vectors to separate regions with distinct dominant classes. Experimental results indicate that the learnt oblique hyperplanes lead to compact and accurate oblique trees. HOT is simple and easy to incorporate into most decision tree packages, yet its results compare well with much more complex schemes for generating oblique trees.","PeriodicalId":194023,"journal":{"name":"Proceedings 11th International Conference on Tools with Artificial Intelligence","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1999-11-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings 11th International Conference on Tools with Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TAI.1999.809771","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 11
Abstract
This paper presents a new method (HOT) of generating oblique decision trees. Oblique trees have been shown to be useful tools for classification in some problem domains, producing accurate and intuitive solutions. The method can be incorporated into a variety of existing decision tree tools and the paper illustrates this with two very distinct tree generators. The key idea is a method of learning oblique vectors and using the corresponding families of hyperplanes orthogonal to these vectors to separate regions with distinct dominant classes. Experimental results indicate that the learnt oblique hyperplanes lead to compact and accurate oblique trees. HOT is simple and easy to incorporate into most decision tree packages, yet its results compare well with much more complex schemes for generating oblique trees.
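The abstract only sketches the key idea, so the snippet below is an illustrative aid rather than the HOT heuristics themselves: it shows what an oblique split is, i.e. partitioning samples by a hyperplane orthogonal to a learned weight vector instead of thresholding a single feature. The function name `oblique_split` and the toy data are assumptions introduced here for illustration.

```python
import numpy as np

def oblique_split(X, w, t):
    """Partition samples by the hyperplane w.x = t.

    An axis-parallel split tests one feature (x[i] > t); an oblique split
    tests a linear combination of features, which is the kind of split a
    learned oblique vector induces.
    """
    projections = X @ w           # project each sample onto the oblique vector
    left = X[projections <= t]    # one side of the hyperplane
    right = X[projections > t]    # the other side
    return left, right

# Toy usage: a class boundary along x0 + x1 = 1 needs many axis-parallel
# splits, but a single oblique split with w = (1, 1) and t = 1 captures it.
X = np.array([[0.2, 0.3], [0.9, 0.4], [0.1, 0.8], [0.7, 0.7]])
left, right = oblique_split(X, w=np.array([1.0, 1.0]), t=1.0)
print(len(left), len(right))  # 2 2
```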