The Model Selection for Semi-Supervised Support Vector Machines

Ying Zhao, Jianpei Zhang, Jing Yang
{"title":"The Model Selection for Semi-Supervised Support Vector Machines","authors":"Ying Zhao, Jianpei Zhang, Jing Yang","doi":"10.1109/ICICSE.2008.29","DOIUrl":null,"url":null,"abstract":"Model selection for semi-supervised support vector machine is an important step in a high-performance learning machine. It is usually done by minimizing an estimate of generalization error based on the bounds of the leave-one-out such as radius-margin bound and on the performance measures such as generalized approximate cross-validation empirical error, etc. In order to get the parameter of SVM with RBF kernel, this paper presents a linear grid search method, which combines grid search and linear search. This method can reduce the resources required both in terms of processing time and of storage space. Experiments both on artificial and real word datasets show that the proposed linear grid search has the advantage of good performance compared to using linear search alone.","PeriodicalId":333889,"journal":{"name":"2008 International Conference on Internet Computing in Science and Engineering","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2008-01-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2008 International Conference on Internet Computing in Science and Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICICSE.2008.29","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Model selection for the semi-supervised support vector machine is an important step in building a high-performance learning machine. It is usually carried out by minimizing an estimate of the generalization error, based either on leave-one-out bounds such as the radius-margin bound or on performance measures such as the generalized approximate cross-validation (GACV) error. To determine the parameters of an SVM with an RBF kernel, this paper presents a linear grid search method that combines grid search with linear search. The method reduces the resources required in both processing time and storage space. Experiments on artificial and real-world datasets show that the proposed linear grid search performs well compared with using linear search alone.
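
As an illustration of the kind of search the abstract describes, below is a minimal sketch (not the authors' implementation) of a two-stage hyperparameter search for an RBF-kernel SVM: a coarse grid over (C, gamma) followed by a one-dimensional linear refinement of gamma around the best grid point. It uses scikit-learn's SVC with cross-validation accuracy standing in for the leave-one-out bounds and GACV measures discussed in the paper; the helper names coarse_grid_search and linear_refine are illustrative, not from the paper.

# Sketch of a combined grid-plus-linear hyperparameter search for an RBF SVM.
# Stage 1: coarse log-scale grid over (C, gamma).
# Stage 2: 1-D linear refinement of gamma around the coarse optimum.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def cv_score(X, y, C, gamma, folds=5):
    """Mean cross-validated accuracy for one (C, gamma) pair."""
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    return cross_val_score(clf, X, y, cv=folds).mean()


def coarse_grid_search(X, y, C_grid, gamma_grid):
    """Stage 1: exhaustive search over a small log-scale grid."""
    best = (-np.inf, None, None)
    for C in C_grid:
        for gamma in gamma_grid:
            score = cv_score(X, y, C, gamma)
            if score > best[0]:
                best = (score, C, gamma)
    return best


def linear_refine(X, y, C, gamma, n_points=9, span=4.0):
    """Stage 2: linear (1-D) search over gamma around the coarse optimum,
    keeping C fixed, so only O(n_points) extra models are trained."""
    best = (cv_score(X, y, C, gamma), gamma)
    for g in np.linspace(gamma / span, gamma * span, n_points):
        score = cv_score(X, y, C, g)
        if score > best[0]:
            best = (score, g)
    return best


if __name__ == "__main__":
    X, y = make_classification(n_samples=300, n_features=10, random_state=0)
    C_grid = [0.1, 1.0, 10.0, 100.0]
    gamma_grid = [0.001, 0.01, 0.1, 1.0]
    score, C, gamma = coarse_grid_search(X, y, C_grid, gamma_grid)
    score, gamma = linear_refine(X, y, C, gamma)
    print(f"selected C={C}, gamma={gamma:.4f}, CV accuracy={score:.3f}")

Refining only one parameter linearly after the coarse grid keeps the total number of trained models close to that of the coarse grid alone, which is the resource saving the abstract refers to.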