Longjun Huang, Minghe Huang, Bin Guo, Zhiming Zhuang
{"title":"基于粗糙集理论的决策树构造新方法","authors":"Longjun Huang, Minghe Huang, Bin Guo, Zhiming Zhuang","doi":"10.1109/GrC.2007.13","DOIUrl":null,"url":null,"abstract":"One of the keys to constructing decision tree model is to choose standard for testing attribute, for the criteria of selecting test attributes influences the classification accuracy of the tree. There exists diversity choosing standards for testing attribute based on entropy, Bayesian, and so on. In this paper, the degree of dependency of decision attribute on condition attribute, based on rough set theory, is used as a heuristic for selecting the attribute that will best separate the samples into individual classes. The results of example and experiments show that compared with the entropy-based approach, our approach is a better way to select nodes for constructing decision tree.","PeriodicalId":259430,"journal":{"name":"2007 IEEE International Conference on Granular Computing (GRC 2007)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-11-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"A New Method for Constructing Decision Tree Based on Rough Set Theory\",\"authors\":\"Longjun Huang, Minghe Huang, Bin Guo, Zhiming Zhuang\",\"doi\":\"10.1109/GrC.2007.13\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"One of the keys to constructing decision tree model is to choose standard for testing attribute, for the criteria of selecting test attributes influences the classification accuracy of the tree. There exists diversity choosing standards for testing attribute based on entropy, Bayesian, and so on. In this paper, the degree of dependency of decision attribute on condition attribute, based on rough set theory, is used as a heuristic for selecting the attribute that will best separate the samples into individual classes. The results of example and experiments show that compared with the entropy-based approach, our approach is a better way to select nodes for constructing decision tree.\",\"PeriodicalId\":259430,\"journal\":{\"name\":\"2007 IEEE International Conference on Granular Computing (GRC 2007)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2007-11-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2007 IEEE International Conference on Granular Computing (GRC 2007)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/GrC.2007.13\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2007 IEEE International Conference on Granular Computing (GRC 2007)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/GrC.2007.13","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A New Method for Constructing Decision Tree Based on Rough Set Theory
One of the keys to constructing a decision tree model is choosing the criterion for selecting test attributes, since this criterion influences the classification accuracy of the tree. A variety of selection criteria exist, based on entropy, Bayesian methods, and so on. In this paper, the degree of dependency of the decision attribute on a condition attribute, taken from rough set theory, is used as a heuristic for selecting the attribute that best separates the samples into individual classes. The results of a worked example and of experiments show that, compared with the entropy-based approach, our approach is a better way to select nodes when constructing a decision tree.
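To make the selection heuristic concrete, the sketch below computes the rough-set degree of dependency of the decision attribute on a set of condition attributes (the fraction of samples lying in the positive region) and uses it to pick a splitting attribute. This is a minimal illustration assuming categorical attributes stored as dictionaries; the attribute names (`outlook`, `windy`, `play`) and function names are hypothetical and not taken from the paper.

```python
from collections import defaultdict

def dependency_degree(rows, cond_attrs, decision_attr):
    """Rough-set degree of dependency gamma_C(D): the fraction of rows
    that fall in the positive region, i.e. in an equivalence class
    (induced by cond_attrs) whose rows all share one decision value."""
    blocks = defaultdict(list)
    for row in rows:
        key = tuple(row[a] for a in cond_attrs)
        blocks[key].append(row[decision_attr])
    positive = sum(len(vals) for vals in blocks.values()
                   if len(set(vals)) == 1)
    return positive / len(rows)

def best_split_attribute(rows, candidate_attrs, decision_attr):
    """Choose the condition attribute on which the decision attribute
    depends most strongly (highest dependency degree)."""
    return max(candidate_attrs,
               key=lambda a: dependency_degree(rows, [a], decision_attr))

# Toy usage on a small symbolic data set (hypothetical attribute names).
data = [
    {"outlook": "sunny",    "windy": "no",  "play": "no"},
    {"outlook": "sunny",    "windy": "yes", "play": "no"},
    {"outlook": "rainy",    "windy": "no",  "play": "yes"},
    {"outlook": "rainy",    "windy": "yes", "play": "no"},
    {"outlook": "overcast", "windy": "no",  "play": "yes"},
]
print(best_split_attribute(data, ["outlook", "windy"], "play"))  # -> "outlook"
```

In a full tree-construction procedure this selection would be applied recursively to the subset of samples at each node, with entropy-based information gain replaced by the dependency degree as the splitting criterion.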