Optimizing SVR Hyperparameters via Fast Cross-Validation using AOSVR

Masayuki Karasuyama, R. Nakano
2007 International Joint Conference on Neural Networks · Published 2007-10-29 · DOI: 10.1109/IJCNN.2007.4371126
Citations: 10

Abstract

The performance of support vector regression (SVR) depends strongly on its hyperparameters, such as the thickness of the insensitive zone, the penalty factor, and the kernel parameters. A method called MCV-SVR was previously proposed, which optimizes the SVR hyperparameters so that the cross-validation (CV) error is minimized. However, the computational cost of CV is usually high. In this paper we apply accurate online support vector regression (AOSVR) to the MCV-SVR cross-validation procedure. AOSVR enables an efficient update of a trained SVR function when a sample is removed from the training data. We show that AOSVR dramatically accelerates MCV-SVR. Moreover, our experiments on real-world data show that the faster MCV-SVR generalizes better than existing methods such as Bayesian SVR or practical settings.
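To make concrete what AOSVR accelerates, the following is a minimal sketch of the leave-one-out CV error that MCV-SVR minimizes, written naively with scikit-learn (the data, kernel choice, and hyperparameter values are illustrative assumptions, not the paper's setup). The naive loop retrains the SVR from scratch for each held-out sample; AOSVR instead decrementally removes one sample from an already trained machine, avoiding each full retraining.

```python
import numpy as np
from sklearn.svm import SVR

# Toy regression data (hypothetical; the paper uses real-world datasets).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

def loo_cv_error(X, y, C=1.0, epsilon=0.1, gamma=0.5):
    """Leave-one-out CV squared error for an RBF-kernel SVR.

    Naive version: trains n separate machines, once per held-out
    sample. This is the O(n * training) cost that AOSVR's efficient
    sample-removal (decremental) update is designed to avoid.
    """
    n = len(y)
    sq_errs = []
    for i in range(n):
        mask = np.arange(n) != i          # drop the i-th sample
        model = SVR(C=C, epsilon=epsilon, gamma=gamma)
        model.fit(X[mask], y[mask])       # full retraining each time
        pred = model.predict(X[i:i + 1])[0]
        sq_errs.append((pred - y[i]) ** 2)
    return float(np.mean(sq_errs))

# MCV-SVR would search over (C, epsilon, gamma) to minimize this value.
err = loo_cv_error(X, y)
print(err)
```

A hyperparameter search then simply evaluates `loo_cv_error` over a grid or via gradient-based updates of (C, epsilon, gamma); the paper's contribution is making each such evaluation cheap.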