On Fast SVM Algorithms Used For Pattern Recognition

Felipe A. C. de Bastos, M. Campos
*Learning and Nonlinear Models* · DOI: 10.21528/LNLM-vol4-no2-art1

Abstract

This tutorial on fast Support Vector Machines (SVM) presents mathematical formulations and pseudocode implementations of three algorithms used for fast SVM training. Traditional SVM training is a quadratic programming (QP) minimization problem that can be solved, e.g., using the Sequential Minimal Optimization (SMO) algorithm. This algorithm analytically solves a small QP subproblem in each iteration, drastically reducing the training time needed by conventional QP optimizers. It is important to note that traditional SVM can be of two types, L1-SVM and L2-SVM, depending on how the training error is characterized in the SVM mathematical formulation. The SMO implementation presented in this tutorial applies only to the L1-SVM, but it can be adapted to the L2-SVM case. The Proximal SVM (PSVM) algorithm was also introduced as a fast alternative to traditional SVM classifiers, which usually require a large amount of computation time for training. Unfortunately, the PSVM algorithm may present poor performance due to biased optimal hyperplanes. The Unbiased Proximal SVM (UPSVM) algorithm uses a slightly different approach to circumvent this problem, such that an unbiased optimal hyperplane is always obtained. The results obtained show that the UPSVM algorithm trains faster than SMO while achieving a similar or better probability of correct pattern classification. The UPSVM algorithm also outperforms the PSVM algorithm with respect to probability of correct pattern classification (especially for low values of the regularization parameter C), training time, and number of floating-point operations.
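To illustrate why proximal-style methods are fast, the sketch below implements the core idea of the linear PSVM as described by Fung and Mangasarian: instead of a QP, the separating plane is obtained from a single linear system. This is a minimal illustration, not the paper's pseudocode; the function and variable names are ours, and `nu` plays the role of the regularization parameter (the paper's C).

```python
import numpy as np

def psvm_train(A, d, nu=1.0):
    """Linear Proximal SVM sketch: one linear solve instead of a QP.

    A  : (m, n) matrix of training points, one per row.
    d  : (m,) labels in {-1, +1}.
    nu : regularization parameter (larger = weaker regularization).
    Returns (w, gamma) defining the separating plane w.x = gamma.
    """
    m, n = A.shape
    e = np.ones(m)
    # Augmented data matrix E = [A  -e]; the unknown is u = [w; gamma].
    E = np.hstack([A, -e[:, None]])
    # PSVM normal equations: (I/nu + E^T E) u = E^T D e, with D = diag(d),
    # so E^T D e reduces to E^T d because the labels are +/-1.
    u = np.linalg.solve(np.eye(n + 1) / nu + E.T @ E, E.T @ d)
    return u[:n], u[n]

def psvm_predict(A, w, gamma):
    """Classify points by the side of the plane they fall on."""
    return np.sign(A @ w - gamma)
```

The solve costs one (n+1)-by-(n+1) factorization, which is why PSVM (and UPSVM, which modifies the formulation to keep the plane unbiased) can train much faster than iterative QP solvers when the input dimension n is moderate.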