Comparing Different Stopping Criteria for Fuzzy Decision Tree Induction Through IDFID3

M. Zeinalkhani, M. Eftekhari
{"title":"通过IDFID3比较模糊决策树归纳的不同停止准则","authors":"M. Zeinalkhani, M. Eftekhari","doi":"10.22111/IJFS.2014.1394","DOIUrl":null,"url":null,"abstract":"Fuzzy Decision Tree (FDT) classifiers combine decision trees with approximate reasoning offered by fuzzy representation to deal with language and measurement uncertainties. When a FDT induction algorithm utilizes stopping criteria for early stopping of the tree's growth, threshold values of stopping criteria will control the number of nodes. Finding a proper threshold value for a stopping criterion is one of the greatest challenges to be faced in FDT induction. In this paper, we propose a new method named Iterative Deepening Fuzzy ID3 (IDFID3) for FDT induction that has the ability of controlling the tree’s growth via dynamically setting the threshold value of stopping criterion in an iterative procedure. The final FDT induced by IDFID3 and the one obtained by common FID3 are the same when the numbers of nodes of induced FDTs are equal, but our main intention for introducing IDFID3 is the comparison of different stopping criteria through this algorithm. Therefore, a new stopping criterion named Normalized Maximum fuzzy information Gain multiplied by Number of Instances (NMGNI) is proposed and IDFID3 is used for comparing it against the other stopping criteria. Generally speaking, this paper presents a method to compare different stopping criteria independent of their threshold values utilizing IDFID3. The comparison results show that FDTs induced by the proposed stopping criterion in most situations are superior to the others and number of instances stopping criterion performs better than fuzzy information gain stopping criterion in terms of complexity (i.e. number of nodes) and classification accuracy. Also, both tree depth and fuzzy information gain stopping criteria, outperform fuzzy entropy, accuracy and number of instances in terms of mean depth of generated FDTs.","PeriodicalId":212493,"journal":{"name":"How Fuzzy Concepts Contribute to Machine Learning","volume":"31 ","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-02-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Comparing Different Stopping Criteria for Fuzzy Decision Tree Induction Through IDFID3\",\"authors\":\"M. Zeinalkhani, M. Eftekhari\",\"doi\":\"10.22111/IJFS.2014.1394\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Fuzzy Decision Tree (FDT) classifiers combine decision trees with approximate reasoning offered by fuzzy representation to deal with language and measurement uncertainties. When a FDT induction algorithm utilizes stopping criteria for early stopping of the tree's growth, threshold values of stopping criteria will control the number of nodes. Finding a proper threshold value for a stopping criterion is one of the greatest challenges to be faced in FDT induction. In this paper, we propose a new method named Iterative Deepening Fuzzy ID3 (IDFID3) for FDT induction that has the ability of controlling the tree’s growth via dynamically setting the threshold value of stopping criterion in an iterative procedure. The final FDT induced by IDFID3 and the one obtained by common FID3 are the same when the numbers of nodes of induced FDTs are equal, but our main intention for introducing IDFID3 is the comparison of different stopping criteria through this algorithm. 
Therefore, a new stopping criterion named Normalized Maximum fuzzy information Gain multiplied by Number of Instances (NMGNI) is proposed and IDFID3 is used for comparing it against the other stopping criteria. Generally speaking, this paper presents a method to compare different stopping criteria independent of their threshold values utilizing IDFID3. The comparison results show that FDTs induced by the proposed stopping criterion in most situations are superior to the others and number of instances stopping criterion performs better than fuzzy information gain stopping criterion in terms of complexity (i.e. number of nodes) and classification accuracy. Also, both tree depth and fuzzy information gain stopping criteria, outperform fuzzy entropy, accuracy and number of instances in terms of mean depth of generated FDTs.\",\"PeriodicalId\":212493,\"journal\":{\"name\":\"How Fuzzy Concepts Contribute to Machine Learning\",\"volume\":\"31 \",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-02-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"How Fuzzy Concepts Contribute to Machine Learning\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.22111/IJFS.2014.1394\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"How Fuzzy Concepts Contribute to Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.22111/IJFS.2014.1394","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

Fuzzy Decision Tree (FDT) classifiers combine decision trees with the approximate reasoning offered by fuzzy representation to deal with language and measurement uncertainties. When an FDT induction algorithm uses stopping criteria to halt the tree's growth early, the threshold values of those criteria control the number of nodes. Finding a proper threshold value for a stopping criterion is one of the greatest challenges in FDT induction. In this paper, we propose a new FDT induction method named Iterative Deepening Fuzzy ID3 (IDFID3), which controls the tree's growth by dynamically setting the threshold value of the stopping criterion in an iterative procedure. The final FDT induced by IDFID3 and the one obtained by common FID3 are the same when the numbers of nodes of the induced FDTs are equal, but our main intention in introducing IDFID3 is the comparison of different stopping criteria through this algorithm. Therefore, a new stopping criterion named Normalized Maximum fuzzy information Gain multiplied by Number of Instances (NMGNI) is proposed, and IDFID3 is used to compare it against the other stopping criteria. Generally speaking, this paper presents a method, based on IDFID3, for comparing different stopping criteria independently of their threshold values. The comparison results show that the FDTs induced by the proposed stopping criterion are superior to the others in most situations, and that the number-of-instances stopping criterion performs better than the fuzzy information gain stopping criterion in terms of complexity (i.e., number of nodes) and classification accuracy. Also, both the tree depth and fuzzy information gain stopping criteria outperform fuzzy entropy, accuracy, and number of instances in terms of the mean depth of the generated FDTs.
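
The abstract names the quantities involved (fuzzy entropy, fuzzy information gain, and the NMGNI criterion) without giving their exact formulas. The Python sketch below is illustrative only, not the paper's implementation: the function names, the product t-norm used to combine membership degrees, and the particular normalizations are assumptions made for the example.

```python
import numpy as np

def fuzzy_entropy(node_memberships, labels, classes):
    """Entropy of a node computed from membership-weighted class frequencies."""
    total = node_memberships.sum()
    if total == 0.0:
        return 0.0
    p = np.array([node_memberships[labels == c].sum() for c in classes]) / total
    p = p[p > 0.0]
    return float(-(p * np.log2(p)).sum())

def fuzzy_information_gain(node_memberships, labels, term_memberships, classes):
    """Fuzzy information gain of splitting a node on one fuzzy attribute.

    term_memberships has one row per linguistic term of the attribute,
    giving every example's membership degree in that term.
    """
    parent_entropy = fuzzy_entropy(node_memberships, labels, classes)
    total = node_memberships.sum()
    child_entropy = 0.0
    for term in term_memberships:
        child = node_memberships * term  # product t-norm (an assumption)
        child_entropy += (child.sum() / total) * fuzzy_entropy(child, labels, classes)
    return parent_entropy - child_entropy

def nmgni_like_score(best_gain, node_memberships, n_classes, n_total):
    """NMGNI-style score: normalized maximum fuzzy information gain multiplied
    by the node's (fuzzy) number of instances.  Normalizing the gain by
    log2(n_classes) and the instance count by the dataset size is a guess at a
    plausible normalization, not the paper's formula.
    """
    normalized_gain = best_gain / np.log2(n_classes) if n_classes > 1 else 0.0
    return normalized_gain * (node_memberships.sum() / n_total)

# Tiny usage example: two classes, one fuzzy attribute with "low"/"high" terms
# (all numbers are made up for illustration).
labels = np.array([0, 0, 1, 1, 1])
node_mu = np.array([1.0, 0.8, 0.9, 1.0, 0.6])   # memberships in the current node
terms = np.array([[0.9, 0.7, 0.2, 0.1, 0.3],    # membership in "low"
                  [0.1, 0.3, 0.8, 0.9, 0.7]])   # membership in "high"
gain = fuzzy_information_gain(node_mu, labels, terms, classes=[0, 1])
score = nmgni_like_score(gain, node_mu, n_classes=2, n_total=len(labels))
print(f"fuzzy information gain = {gain:.3f}, NMGNI-style score = {score:.3f}")
```

In an IDFID3-style procedure as described above, a score of this kind would be compared against a threshold that the algorithm adjusts iteratively, so that the growth of the tree (and hence the number of nodes) can be controlled without hand-tuning a fixed threshold in advance.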