Stochastic gradient descent for large-scale linear nonparallel SVM
Jingjing Tang, Ying-jie Tian, Guoqiang Wu, Dewei Li
Proceedings of the 7th International Conference on Web Intelligence, Mining and Semantics, 2017. DOI: 10.1145/3106426.3109427
In recent years, the nonparallel support vector machine (NPSVM) has been proposed as a nonparallel hyperplane classifier with performance superior to the standard SVM and to existing nonparallel classifiers such as the twin support vector machine (TWSVM). With solid theoretical underpinnings and great practical success, NPSVM has been used for classification tasks at different scales. Tackling large-scale classification problems is challenging yet significant work. Although the large-scale linear NPSVM model has already been solved efficiently by the dual coordinate descent (DCD) algorithm and the alternating direction method of multipliers (ADMM), in this paper we present a new strategy, different from existing work, that solves the primal form of linear NPSVM directly. Our algorithm is designed in the framework of stochastic gradient descent (SGD), which is well suited to large-scale problems. Experiments on five large-scale data sets confirm the effectiveness of our method.
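The abstract gives no optimization details, so as a rough illustration of the general approach, the following Python sketch applies SGD to the standard primal of one NPSVM subproblem: an ε-insensitive loss keeps the hyperplane close to its own class, and a hinge loss pushes it away from the other class. The objective, the 1/t step size, and all parameter names (C1, C2, eps) are assumptions based on the published NPSVM formulation, not the paper's actual implementation.

```python
import numpy as np

def sgd_npsvm_hyperplane(X_same, X_other, C1=1.0, C2=1.0, eps=0.1,
                         n_iters=100_000, seed=0):
    """SGD on the primal of one NPSVM subproblem (illustrative sketch).

    Finds (w, b) so that w.x + b is near 0 on its own class X_same
    (eps-insensitive loss) and at most -1 on the other class X_other
    (hinge loss). Hyperparameter names are assumptions, not the paper's.
    """
    rng = np.random.default_rng(seed)
    n_same, n_other = len(X_same), len(X_other)
    w, b = np.zeros(X_same.shape[1]), 0.0

    for t in range(1, n_iters + 1):
        lr = 1.0 / t  # classic 1/t schedule; the paper may use another
        # Sample one training point uniformly from the pooled set.
        k = rng.integers(n_same + n_other)
        if k < n_same:
            x = X_same[k]
            m = w @ x + b
            # Subgradient of C1 * max(0, |m| - eps).
            g = C1 * np.sign(m) if abs(m) > eps else 0.0
        else:
            x = X_other[k - n_same]
            m = w @ x + b
            # Subgradient of C2 * max(0, 1 + m).
            g = C2 if m > -1.0 else 0.0
        # The regularizer (1/2)||w||^2 contributes w to the gradient.
        w -= lr * (w + g * x)
        b -= lr * g
    return w, b
```

Running the sketch twice with the classes swapped yields the two nonparallel hyperplanes; a new point would then be assigned to the class whose hyperplane is closer, i.e. the one with the smaller |w·x + b| / ||w||.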