Stochastic gradient descent for large-scale linear nonparallel SVM

Jingjing Tang, Ying-jie Tian, Guoqiang Wu, Dewei Li
{"title":"Stochastic gradient descent for large-scale linear nonparallel SVM","authors":"Jingjing Tang, Ying-jie Tian, Guoqiang Wu, Dewei Li","doi":"10.1145/3106426.3109427","DOIUrl":null,"url":null,"abstract":"In recent years, nonparallel support vector machine (NPSVM) is proposed as a nonparallel hyperplane classifier with superior performance than standard SVM and existing nonparallel classifiers such as the twin support vector machine (TWSVM). With the perfect theoretical underpinnings and great practical success, NPSVM has been used to dealing with the classification tasks on different scales. Tackling large-scale classification problem is a challenge yet significant work. Although large-scale linear NPSVM model has already been efficiently solved by the dual coordinate descent (DCD) algorithm or alternating direction method of multipliers (ADMM), we present a new strategy to solve the primal form of linear NPSVM different from existing work in this paper. Our algorithm is designed in the framework of the stochastic gradient descent (SGD), which is well suited to large-scale problem. Experiments are conducted on five large-scale data sets to confirm the effectiveness of our method.","PeriodicalId":20685,"journal":{"name":"Proceedings of the 7th International Conference on Web Intelligence, Mining and Semantics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2017-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 7th International Conference on Web Intelligence, Mining and Semantics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3106426.3109427","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

In recent years, the nonparallel support vector machine (NPSVM) has been proposed as a nonparallel-hyperplane classifier with performance superior to the standard SVM and to existing nonparallel classifiers such as the twin support vector machine (TWSVM). With solid theoretical underpinnings and great practical success, NPSVM has been applied to classification tasks at different scales. Tackling large-scale classification problems is a challenging yet significant task. Although the large-scale linear NPSVM model has already been solved efficiently by the dual coordinate descent (DCD) algorithm or the alternating direction method of multipliers (ADMM), in this paper we present a new strategy, different from existing work, that solves the primal form of linear NPSVM. Our algorithm is designed in the framework of stochastic gradient descent (SGD), which is well suited to large-scale problems. Experiments conducted on five large-scale data sets confirm the effectiveness of our method.
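The abstract does not spell out the primal objective or the update rule. As a rough illustration only, the following is a minimal SGD sketch for one of NPSVM's two hyperplanes, assuming the standard NPSVM primal (an ε-insensitive loss keeping the plane close to its own class, plus a hinge loss pushing the other class at least unit distance away) and a 1/t decaying step size; the function name and parameters (npsvm_sgd_one_plane, C1, C2, eps) are illustrative and not taken from the paper.

```python
import numpy as np

def npsvm_sgd_one_plane(X_pos, X_neg, C1=1.0, C2=1.0, eps=0.1,
                        T=100_000, seed=0):
    """SGD sketch for one NPSVM hyperplane (w, b).

    Assumed objective (averages over each class):
        1/2 ||w||^2
        + C1 * mean over X_pos of max(0, |w.x + b| - eps)   # eps-insensitive
        + C2 * mean over X_neg of max(0, 1 + (w.x + b))     # hinge
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X_pos.shape[1])
    b = 0.0
    for t in range(1, T + 1):
        lr = 1.0 / t                    # decaying step size
        gw, gb = w.copy(), 0.0          # gradient of the 1/2 ||w||^2 term
        # sample one point from the plane's own (positive) class
        xp = X_pos[rng.integers(len(X_pos))]
        fp = w @ xp + b
        if abs(fp) > eps:               # outside the eps-tube
            gw += C1 * np.sign(fp) * xp
            gb += C1 * np.sign(fp)
        # sample one point from the other (negative) class
        xn = X_neg[rng.integers(len(X_neg))]
        fn = w @ xn + b
        if 1.0 + fn > 0.0:              # margin violated
            gw += C2 * xn
            gb += C2
        w -= lr * gw
        b -= lr * gb
    return w, b
```

In the standard NPSVM scheme, a symmetric second run with the class roles swapped yields the other hyperplane, and a test point is then assigned to the class whose hyperplane is closer in normalized distance, i.e. by comparing |w·x + b| / ||w|| for the two planes.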