Support Vector Machine Pre-pruning Approaches on Decision Trees for Better Classification

D. Y. Y. Sim
DOI: 10.1145/3362752.3362763
Published in: Proceedings of the 2019 2nd International Conference on Electronics and Electrical Engineering Technology, 2019-09-25
Citations: 2

Abstract

The structural risk minimization of the Support Vector Machine (SVM) is incorporated to pre-prune decision trees, which are based on empirical risk minimization, yielding a combined algorithm named the Support Vector Machine Pruned Decision Trees (SVMPDT) algorithm. Pre-pruning of the decision trees (DT) is applied to the datasets through a synergistically adjusted SVM regularization parameter. This is achieved by a newly proposed approach derived from a study of the synergy effects between the pre-pruning weighting fraction of the DT and the regularization parameter of the SVM. The SVM regularization parameter is customized and adjusted according to the features and characteristics of the DT built from each applied dataset. When applied to the assigned datasets, the proposed algorithm classifies more accurately than a typical SVM without the corresponding parameter adjustment, a typical DT without the pre-pruning weighting fraction, and the default SVMDT algorithm without DT pre-pruning. This is because the SVM regularization parameter can be optimally adjusted, via the newly proposed formulations, in synergy with the pre-pruning weighting fraction of the DT, so that classification accuracy is significantly improved.
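The abstract does not give the paper's actual coupling formula between the SVM regularization parameter and the DT pre-pruning weighting fraction. The following is only a minimal sketch of the general idea, assuming a hypothetical coupling rule (`weighting_fraction = 0.01 / C`) and using scikit-learn's `min_impurity_decrease` as a stand-in for a pre-pruning threshold; it is not a reproduction of the SVMPDT algorithm.

```python
# Hypothetical SVMPDT-style pipeline: fit an SVM, then pre-prune a
# decision tree with a threshold derived from the SVM regularization
# parameter C. The coupling rule below is illustrative only.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def svmpdt_fit(X_tr, y_tr, C=1.0):
    """Fit an SVM, then pre-prune a DT with a threshold scaled by C."""
    svm = SVC(C=C, kernel="rbf").fit(X_tr, y_tr)
    # Assumed coupling: larger C (weaker SVM regularization) implies a
    # smaller pre-pruning fraction, hence a deeper tree, and vice versa.
    weighting_fraction = 0.01 / C
    dt = DecisionTreeClassifier(
        min_impurity_decrease=weighting_fraction, random_state=0
    ).fit(X_tr, y_tr)
    return svm, dt

svm, dt = svmpdt_fit(X_tr, y_tr, C=1.0)
print("SVM accuracy:", svm.score(X_te, y_te))
print("Pre-pruned DT accuracy:", dt.score(X_te, y_te))
```

In the paper the adjustment runs in the other direction as well: the regularization parameter is tuned per dataset from the tree's characteristics, which this one-way sketch does not attempt to capture.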