Accelerating Radius-Margin Parameter Selection for SVMs Using Geometric Bounds

Ben Goodrich, D. Albrecht, P. Tischer
{"title":"基于几何边界的支持向量机加速半径边界参数选择","authors":"Ben Goodrich, D. Albrecht, P. Tischer","doi":"10.1109/ICDM.2010.100","DOIUrl":null,"url":null,"abstract":"By considering the geometric properties of the Support Vector Machine (SVM) and Minimal Enclosing Ball (MEB) optimization problems, we show that upper and lower bounds on the radius-margin ratio of an SVM can be efficiently computed at any point during training. We use these bounds to accelerate radius-margin parameter selection by terminating training routines as early as possible, while still obtaining a guarantee that the parameters minimize the radius-margin ratio. Once an SVM has been partially trained on any set of parameters, we also show that these bounds can be used to evaluate and possibly reject neighboring parameter values with little or no additional training required. Empirical results show that, when selecting two parameter values, this process can reduce the number of training iterations required by a factor of 10 or more, while suffering no loss of precision in minimizing the radius-margin ratio.","PeriodicalId":294061,"journal":{"name":"2010 IEEE International Conference on Data Mining","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-12-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Accelerating Radius-Margin Parameter Selection for SVMs Using Geometric Bounds\",\"authors\":\"Ben Goodrich, D. Albrecht, P. Tischer\",\"doi\":\"10.1109/ICDM.2010.100\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"By considering the geometric properties of the Support Vector Machine (SVM) and Minimal Enclosing Ball (MEB) optimization problems, we show that upper and lower bounds on the radius-margin ratio of an SVM can be efficiently computed at any point during training. We use these bounds to accelerate radius-margin parameter selection by terminating training routines as early as possible, while still obtaining a guarantee that the parameters minimize the radius-margin ratio. Once an SVM has been partially trained on any set of parameters, we also show that these bounds can be used to evaluate and possibly reject neighboring parameter values with little or no additional training required. 
Empirical results show that, when selecting two parameter values, this process can reduce the number of training iterations required by a factor of 10 or more, while suffering no loss of precision in minimizing the radius-margin ratio.\",\"PeriodicalId\":294061,\"journal\":{\"name\":\"2010 IEEE International Conference on Data Mining\",\"volume\":\"18 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2010-12-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2010 IEEE International Conference on Data Mining\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICDM.2010.100\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 IEEE International Conference on Data Mining","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDM.2010.100","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

By considering the geometric properties of the Support Vector Machine (SVM) and Minimal Enclosing Ball (MEB) optimization problems, we show that upper and lower bounds on the radius-margin ratio of an SVM can be efficiently computed at any point during training. We use these bounds to accelerate radius-margin parameter selection by terminating training routines as early as possible, while still obtaining a guarantee that the parameters minimize the radius-margin ratio. Once an SVM has been partially trained on any set of parameters, we also show that these bounds can be used to evaluate and possibly reject neighboring parameter values with little or no additional training required. Empirical results show that, when selecting two parameter values, this process can reduce the number of training iterations required by a factor of 10 or more, while suffering no loss of precision in minimizing the radius-margin ratio.
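The abstract only outlines the pruning idea, so the following Python sketch illustrates it on a toy problem: a grid search over the RBF width parameter (called gamma in the code) that rejects a candidate as soon as a lower bound on its radius-margin ratio R^2 * ||w*||^2 exceeds the best upper bound confirmed so far. The bounds used here are simple stand-ins, not the bounds derived in the paper: the squared radius is bracketed by the half-diameter and a mean-centered ball in feature space, and ||w*||^2 is bracketed by the partial dual objective and a rescaled primal-feasible point of a bias-free hard-margin SVM. All function names, the candidate grid, and the toy data are illustrative.

```python
# Simplified sketch of bound-based pruning for radius-margin parameter selection.
# NOT the paper's bound derivation; see the lead-in for the stand-in bounds used.
import numpy as np

def rbf_kernel(X, gamma):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def radius_sq_bounds(K):
    """Lower/upper bounds on the squared MEB radius in feature space."""
    diag = np.diag(K)
    d2 = diag[:, None] + diag[None, :] - 2.0 * K
    r2_lb = d2.max() / 4.0                          # radius >= half the diameter
    dist2_to_mean = diag - 2.0 * K.mean(axis=1) + K.mean()
    r2_ub = dist2_to_mean.max()                     # the mean is a feasible center
    return r2_lb, r2_ub

def w_sq_bounds(alpha, Q, y, K):
    """Bounds on ||w*||^2 = 1/margin^2 for the bias-free hard-margin SVM."""
    dual = alpha.sum() - 0.5 * alpha @ Q @ alpha    # dual value <= ||w*||^2 / 2
    w_sq_lb = max(2.0 * dual, 0.0)
    f = K @ (alpha * y)                             # current decision values
    m = (y * f).min()                               # smallest functional margin
    w_sq = alpha @ Q @ alpha
    w_sq_ub = w_sq / m**2 if m > 1e-12 else np.inf  # rescaled feasible primal point
    return w_sq_lb, w_sq_ub

def select_gamma(X, y, gammas, chunk=200, max_iter=4000, tol=1e-3):
    """Grid search over the RBF width, pruning candidates whose lower bound
    on R^2 * ||w*||^2 already exceeds the best confirmed upper bound."""
    best_ub, best_gamma = np.inf, None
    for gamma in gammas:
        K = rbf_kernel(X, gamma)
        Q = (y[:, None] * y[None, :]) * K
        r2_lb, r2_ub = radius_sq_bounds(K)
        alpha = np.zeros(len(y))
        eta = 1.0 / np.linalg.eigvalsh(Q).max()     # safe step size
        it, rejected = 0, False
        while it < max_iter:
            for _ in range(chunk):                  # train in short bursts
                alpha = np.maximum(0.0, alpha + eta * (1.0 - Q @ alpha))
            it += chunk
            w_lb, w_ub = w_sq_bounds(alpha, Q, y, K)
            ratio_lb, ratio_ub = r2_lb * w_lb, r2_ub * w_ub
            if ratio_lb > best_ub:                  # cannot beat the incumbent
                rejected = True
                break
            if ratio_ub - ratio_lb < tol * max(ratio_lb, 1.0):
                break                               # bounds tight enough
        if not rejected and ratio_ub < best_ub:
            best_ub, best_gamma = ratio_ub, gamma
    return best_gamma, best_ub

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 0.4, (30, 2)), rng.normal(1, 0.4, (30, 2))])
    y = np.hstack([-np.ones(30), np.ones(30)])
    gamma, ub = select_gamma(X, y, gammas=[0.01, 0.1, 1.0, 10.0])
    print(f"selected gamma={gamma}, radius-margin upper bound={ub:.3f}")
```

Unlike the procedure described in the abstract, this sketch simply accepts the candidate with the smallest upper bound rather than continuing training until the bound intervals no longer overlap, so it demonstrates the early-rejection mechanism without the paper's guarantee of exactly minimizing the radius-margin ratio.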