Multi-task Support Vector Machine Classifier with Generalized Huber Loss

IF 1.8 · JCR Q2 (Mathematics, Interdisciplinary Applications) · CAS Quartile 4, Computer Science
Qi Liu, Wenxin Zhu, Zhengming Dai, Zhihong Ma
Journal of Classification · DOI: 10.1007/s00357-024-09488-w · Published: 2024-08-23 · Journal Article
Citations: 0

Abstract



Compared to single-task learning (STL), multi-task learning (MTL) achieves better generalization by exploiting domain-specific information implicit in the training signals of several related tasks. The adaptation of MTL to support vector machines (SVMs) is a notably successful example. Inspired by the recently published generalized Huber loss SVM (GHSVM) and regularized multi-task learning (RMTL), we propose a novel generalized Huber loss multi-task support vector machine for binary classification, covering both linear and non-linear cases, named MTL-GHSVM. The new method extends GHSVM from single-task to multi-task learning, and to the best of our knowledge the application of the Huber loss to MTL-SVM is new. The proposed method has two main advantages. First, compared with hinge-loss SVMs and with GHSVM, MTL-GHSVM with the differentiable generalized Huber loss achieves better generalization performance. Second, it finds the optimal solution by functional iteration rather than by solving a quadratic programming problem (QPP), which significantly reduces the computational cost. Numerical experiments on fifteen real datasets demonstrate the effectiveness of the proposed multi-task classification algorithm compared with state-of-the-art algorithms.
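The abstract's key point is that the generalized Huber loss is differentiable everywhere, unlike the hinge loss, which has a kink at margin 1; differentiability is what allows a functional-iteration (gradient-style) solver instead of a QPP. The abstract does not give the paper's exact loss formula, so the sketch below uses one common Huber-style smoothing of the hinge loss purely as an illustration (the function name `huberized_hinge` and the smoothing width `delta` are assumptions, not the paper's notation):

```python
import numpy as np

def hinge(m):
    # Standard hinge loss on the margin m = y * f(x); non-differentiable at m = 1.
    return np.maximum(0.0, 1.0 - m)

def huberized_hinge(m, delta=0.5):
    """Illustrative Huber-style smoothing of the hinge loss (the paper's
    generalized Huber loss may differ). Linear for m < 1 - delta, quadratic
    on [1 - delta, 1 + delta) so the join is continuously differentiable,
    and exactly zero for m >= 1 + delta."""
    m = np.asarray(m, dtype=float)
    return np.where(
        m < 1.0 - delta,
        1.0 - m,                                    # same slope (-1) as the hinge
        np.where(
            m < 1.0 + delta,
            (1.0 + delta - m) ** 2 / (4.0 * delta),  # smooth quadratic transition
            0.0,                                     # zero loss for large margins
        ),
    )
```

At m = 1 - delta both branches give the value delta with slope -1, and at m = 1 + delta the quadratic reaches zero with zero slope, so the loss is C^1 everywhere; this is the property that makes a derivative-based iterative solver applicable where a hinge-loss SVM would require a QPP.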

Source Journal
Journal of Classification — Mathematics; Mathematics, Interdisciplinary Applications
CiteScore: 3.60
Self-citation rate: 5.00%
Articles per year: 16
Review time: >12 weeks
Aims and scope: To publish original and valuable papers in the field of classification, numerical taxonomy, multidimensional scaling and other ordination techniques, clustering, tree structures and other network models (with somewhat less emphasis on principal components analysis, factor analysis, and discriminant analysis), as well as associated models and algorithms for fitting them. Articles will support advances in methodology while demonstrating compelling substantive applications. Comprehensive review articles are also acceptable. Contributions will represent disciplines such as statistics, psychology, biology, information retrieval, anthropology, archeology, astronomy, business, chemistry, computer science, economics, engineering, geography, geology, linguistics, marketing, mathematics, medicine, political science, psychiatry, sociology, and soil science.