{"title":"Support Vector Machines","authors":"Po-Wei Wang, Chih-Jen Lin","doi":"10.1201/b17320-8","DOIUrl":null,"url":null,"abstract":"The original SVM algorithm was invented by Vladimir N. Vapnik and the current standard incarnation (soft margin) was proposed by Corinna Cortes and Vapnik in 1993 and published in 1995. A support vector machine(SVM) constructs a hyperplane or set of hyperplanes in a highor infinitedimensional space, which can be used for classification, regression, or other tasks. Intuitively, a good separation is achieved by the hyperplane that has the largest distance to the nearest training data point of any class (so-called functional margin), since in general the larger the margin the lower the generalization error of the classifier. In this notes, we will explain the intuition and then get the primal problem, and how to translate the primal problem to dual problem. We will apply kernel trick and SMO algorithms to solve the dual problem and get the hyperplane we want to separate the dataset. Give general idea about SVM and introduce the goal of this notes, what kind of problems and knowledge will be covered by this node. In this note, one single SVM model is for two labels classification, whose label is y ∈ {−1, 1}. And the hyperplane we want to find to separate the two classes dataset is h, for which classifier, we use parameters w, b and we write our classifier as hw,b(x) = g(w x+ b) Here, g(z) = 1 if z ≥ 0, and g(z) = −1 otherwise.","PeriodicalId":378937,"journal":{"name":"Data Classification: Algorithms and Applications","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"12","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Data Classification: Algorithms and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1201/b17320-8","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The original SVM algorithm was invented by Vladimir N. Vapnik, and the current standard incarnation (soft margin) was proposed by Corinna Cortes and Vapnik in 1993 and published in 1995. A support vector machine (SVM) constructs a hyperplane, or a set of hyperplanes, in a high- or infinite-dimensional space, which can be used for classification, regression, or other tasks. Intuitively, a good separation is achieved by the hyperplane with the largest distance to the nearest training data point of any class (the so-called margin), since in general the larger the margin, the lower the generalization error of the classifier.

These notes open with a general overview of SVMs and the goals of the notes: the kinds of problems they address and the background knowledge they cover. We explain the margin intuition, derive the primal optimization problem, and show how to translate the primal problem into its dual. We then apply the kernel trick and the SMO (sequential minimal optimization) algorithm to solve the dual problem and obtain the hyperplane that separates the dataset.

Throughout, a single SVM model performs binary classification with labels y ∈ {−1, 1}. The hyperplane separating the two classes defines a classifier h, parameterized by w and b, which we write as

    h_{w,b}(x) = g(wᵀx + b),

where g(z) = 1 if z ≥ 0, and g(z) = −1 otherwise.