Induction of decision trees by looking to data sequentially and using error correction rule

N. Bathaeian, Muharram Mansoorizadeh
{"title":"通过顺序查找数据并使用纠错规则归纳决策树","authors":"N. Bathaeian, Muharram Mansoorizadeh","doi":"10.1109/IKT.2016.7777780","DOIUrl":null,"url":null,"abstract":"Decision trees are common algorithms in machine learning. Traditionally, these algorithms make trees recursively and at each step, they inspect data to induce the part of the tree. However decision trees are famous for their instability and high variance in error. In this paper a solution which adds error correction rule to a traditional decision tree algorithm is examined. In fact an algorithm which we call it, ECD3 is introduced. Algorithm of ECD3 inspects data sequentially in an iterative manner and updates tree only when it finds an erroneous observation. This method was first proposed by Dr. Utgoff but not implemented. In this paper, the method is developed and several experiments are performed to evaluate the method. We found that in most cases, performance of ECD3 is comparable to its predecessors. However ECD3 has some benefits over them. First, sizes of its trees are significantly smaller. Second, on average, variance of error in ECD3 is lower. Furthermore, ECD3 automatically chooses part of data for induction of the tree and sets aside others. This capability can be exploited for prototype selection in various learning algorithms. To explain these observations, we use inductive bias and margin definitions in our theories. We introduce a new definition of margin in ordinary decision trees based on shape, size and splitting criteria in trees. We show that how ECD3 expands the margins and enhances precision over test data.","PeriodicalId":205496,"journal":{"name":"2016 Eighth International Conference on Information and Knowledge Technology (IKT)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Induction of decision trees by looking to data sequentially and using error correction rule\",\"authors\":\"N. Bathaeian, Muharram Mansoorizadeh\",\"doi\":\"10.1109/IKT.2016.7777780\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Decision trees are common algorithms in machine learning. Traditionally, these algorithms make trees recursively and at each step, they inspect data to induce the part of the tree. However decision trees are famous for their instability and high variance in error. In this paper a solution which adds error correction rule to a traditional decision tree algorithm is examined. In fact an algorithm which we call it, ECD3 is introduced. Algorithm of ECD3 inspects data sequentially in an iterative manner and updates tree only when it finds an erroneous observation. This method was first proposed by Dr. Utgoff but not implemented. In this paper, the method is developed and several experiments are performed to evaluate the method. We found that in most cases, performance of ECD3 is comparable to its predecessors. However ECD3 has some benefits over them. First, sizes of its trees are significantly smaller. Second, on average, variance of error in ECD3 is lower. Furthermore, ECD3 automatically chooses part of data for induction of the tree and sets aside others. This capability can be exploited for prototype selection in various learning algorithms. To explain these observations, we use inductive bias and margin definitions in our theories. 
We introduce a new definition of margin in ordinary decision trees based on shape, size and splitting criteria in trees. We show that how ECD3 expands the margins and enhances precision over test data.\",\"PeriodicalId\":205496,\"journal\":{\"name\":\"2016 Eighth International Conference on Information and Knowledge Technology (IKT)\",\"volume\":\"18 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 Eighth International Conference on Information and Knowledge Technology (IKT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IKT.2016.7777780\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 Eighth International Conference on Information and Knowledge Technology (IKT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IKT.2016.7777780","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Decision trees are common algorithms in machine learning. Traditionally, these algorithms build trees recursively, inspecting the data at each step to induce the next part of the tree. However, decision trees are known for their instability and high variance in error. This paper examines a solution that adds an error correction rule to a traditional decision tree algorithm, introducing an algorithm we call ECD3. ECD3 inspects the data sequentially in an iterative manner and updates the tree only when it finds a misclassified observation. This method was first proposed by Dr. Utgoff but never implemented. In this paper, the method is developed, and several experiments are performed to evaluate it. We found that in most cases the performance of ECD3 is comparable to that of its predecessors, but ECD3 offers some benefits over them. First, its trees are significantly smaller. Second, on average, the variance of error in ECD3 is lower. Furthermore, ECD3 automatically chooses part of the data for induction of the tree and sets the rest aside; this capability can be exploited for prototype selection in various learning algorithms. To explain these observations, we use inductive bias and margin definitions in our theory. We introduce a new definition of margin in ordinary decision trees based on the shape, size, and splitting criteria of the tree, and we show how ECD3 expands the margins and enhances precision on test data.
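The abstract only sketches ECD3's update loop, so the following is a minimal illustrative sketch of error-driven tree induction under stated assumptions, not the authors' implementation. The function name error_driven_induction, the use of scikit-learn's DecisionTreeClassifier as the base learner, the refit-from-scratch update on each error, the randomized scan order, and the fixed number of passes are all assumptions introduced here for illustration.

```python
# Illustrative sketch only: ECD3's actual update rule and base learner are
# not specified in the abstract; this code merely demonstrates the idea of
# scanning data sequentially and updating the tree on misclassifications.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier


def error_driven_induction(X, y, passes=3, random_state=0):
    """Scan (X, y) sequentially; grow the tree only from misclassified points.

    Returns the fitted tree and the indices of the retained observations
    (the subset of data actually used for induction).
    """
    rng = np.random.default_rng(random_state)
    pool = [0]  # seed the induction pool with one observation (assumption)
    tree = DecisionTreeClassifier(random_state=random_state)
    tree.fit(X[pool], y[pool])

    for _ in range(passes):
        updated = False
        for i in rng.permutation(len(X)):  # sequential pass in random order
            if tree.predict(X[i : i + 1])[0] != y[i]:
                # Error correction rule: add the offending observation to
                # the pool and re-induce the tree from the enlarged pool.
                if i not in pool:
                    pool.append(i)
                tree.fit(X[pool], y[pool])
                updated = True
        if not updated:  # a full error-free pass: the tree is stable
            break
    return tree, sorted(pool)


if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    tree, prototypes = error_driven_induction(X, y)
    print(len(prototypes), "of", len(X), "observations used for induction")
```

Note how the retained pool corresponds to the behavior claimed in the abstract: observations the current tree already classifies correctly are set aside, so the pool doubles as a selected set of prototypes.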