{"title":"Decision Trees and Random Forests","authors":"Tom Rainforth","doi":"10.1002/9781119544678.ch10","DOIUrl":null,"url":null,"abstract":"Y = { 1 if X1 > 0.4 and X2 > 0.6; 0 otherwise }. We construct the dataset: n <- 5000; x <- cbind(runif(n), runif(n)); y <- factor(ifelse(x[,1] > .4 & x[,2] > .6, 1, 0)); r <- data.frame(x, y). We construct a decision tree for this using rpart: tree <- rpart(y ~ X1 + X2, data = r, method = \"class\"); printcp(tree). We can generate a simple diagram of this tree: plot(tree, compress = TRUE, mar = c(.2, .2, .2, .2)); text(tree, use.n = TRUE). We can generate predictions from this tree using the predict function. Here we generate a testing set in the same way as the training set above and find the accuracy of our classifier.","PeriodicalId":344200,"journal":{"name":"Condition Monitoring with Vibration Signals","volume":"55 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Condition Monitoring with Vibration Signals","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1002/9781119544678.ch10","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
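The abstract ends by describing the evaluation step (generate a test set the same way as the training set, then measure accuracy) without showing the code. A minimal R sketch of that step, assuming the dataset and tree are built exactly as in the abstract; the variable names `x_test`, `y_test`, `test`, `pred`, and `accuracy` are illustrative additions, not from the original:

```r
library(rpart)

# Training data: Y = 1 iff X1 > 0.4 and X2 > 0.6, as in the abstract
n <- 5000
x <- cbind(runif(n), runif(n))
y <- factor(ifelse(x[,1] > .4 & x[,2] > .6, 1, 0))
r <- data.frame(x, y)  # matrix columns become X1, X2

# Fit the classification tree
tree <- rpart(y ~ X1 + X2, data = r, method = "class")

# Test set generated the same way as the training set
x_test <- cbind(runif(n), runif(n))
y_test <- factor(ifelse(x_test[,1] > .4 & x_test[,2] > .6, 1, 0))
test <- data.frame(x_test)  # columns again named X1, X2

# Predict class labels on the test set and compute accuracy
pred <- predict(tree, newdata = test, type = "class")
accuracy <- mean(pred == y_test)
print(accuracy)
```

With 5000 training points and an axis-aligned class boundary, the tree can recover the two thresholds almost exactly, so the test accuracy is typically very close to 1.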