An improved multi-task least squares twin support vector machine

IF 1.2 · CAS Tier 4 (Computer Science) · JCR Q4 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE)
Hossein Moosaei, Fatemeh Bazikar, Panos M. Pardalos
{"title":"一种改进的多任务最小二乘双支持向量机","authors":"Hossein Moosaei,&nbsp;Fatemeh Bazikar,&nbsp;Panos M. Pardalos","doi":"10.1007/s10472-023-09877-8","DOIUrl":null,"url":null,"abstract":"<div><p>In recent years, multi-task learning (MTL) has become a popular field in machine learning and has a key role in various domains. Sharing knowledge across tasks in MTL can improve the performance of learning algorithms and enhance their generalization capability. A new approach called the multi-task least squares twin support vector machine (MTLS-TSVM) was recently proposed as a least squares variant of the direct multi-task twin support vector machine (DMTSVM). Unlike DMTSVM, which solves two quadratic programming problems, MTLS-TSVM solves two linear systems of equations, resulting in a reduced computational time. In this paper, we propose an enhanced version of MTLS-TSVM called the improved multi-task least squares twin support vector machine (IMTLS-TSVM). IMTLS-TSVM offers a significant advantage over MTLS-TSVM by operating based on the empirical risk minimization principle, which allows for better generalization performance. The model achieves this by including regularization terms in its objective function, which helps control the model’s complexity and prevent overfitting. We demonstrate the effectiveness of IMTLS-TSVM by comparing it to several single-task and multi-task learning algorithms on various real-world data sets. Our results highlight the superior performance of IMTLS-TSVM in addressing multi-task learning problems.</p></div>","PeriodicalId":7971,"journal":{"name":"Annals of Mathematics and Artificial Intelligence","volume":"93 1","pages":"21 - 41"},"PeriodicalIF":1.2000,"publicationDate":"2023-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10472-023-09877-8.pdf","citationCount":"0","resultStr":"{\"title\":\"An improved multi-task least squares twin support vector machine\",\"authors\":\"Hossein Moosaei,&nbsp;Fatemeh Bazikar,&nbsp;Panos M. Pardalos\",\"doi\":\"10.1007/s10472-023-09877-8\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In recent years, multi-task learning (MTL) has become a popular field in machine learning and has a key role in various domains. Sharing knowledge across tasks in MTL can improve the performance of learning algorithms and enhance their generalization capability. A new approach called the multi-task least squares twin support vector machine (MTLS-TSVM) was recently proposed as a least squares variant of the direct multi-task twin support vector machine (DMTSVM). Unlike DMTSVM, which solves two quadratic programming problems, MTLS-TSVM solves two linear systems of equations, resulting in a reduced computational time. In this paper, we propose an enhanced version of MTLS-TSVM called the improved multi-task least squares twin support vector machine (IMTLS-TSVM). IMTLS-TSVM offers a significant advantage over MTLS-TSVM by operating based on the empirical risk minimization principle, which allows for better generalization performance. The model achieves this by including regularization terms in its objective function, which helps control the model’s complexity and prevent overfitting. We demonstrate the effectiveness of IMTLS-TSVM by comparing it to several single-task and multi-task learning algorithms on various real-world data sets. 
Our results highlight the superior performance of IMTLS-TSVM in addressing multi-task learning problems.</p></div>\",\"PeriodicalId\":7971,\"journal\":{\"name\":\"Annals of Mathematics and Artificial Intelligence\",\"volume\":\"93 1\",\"pages\":\"21 - 41\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2023-07-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://link.springer.com/content/pdf/10.1007/s10472-023-09877-8.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Annals of Mathematics and Artificial Intelligence\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s10472-023-09877-8\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Annals of Mathematics and Artificial Intelligence","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10472-023-09877-8","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

In recent years, multi-task learning (MTL) has become a popular field in machine learning and has a key role in various domains. Sharing knowledge across tasks in MTL can improve the performance of learning algorithms and enhance their generalization capability. A new approach called the multi-task least squares twin support vector machine (MTLS-TSVM) was recently proposed as a least squares variant of the direct multi-task twin support vector machine (DMTSVM). Unlike DMTSVM, which solves two quadratic programming problems, MTLS-TSVM solves two linear systems of equations, resulting in a reduced computational time. In this paper, we propose an enhanced version of MTLS-TSVM called the improved multi-task least squares twin support vector machine (IMTLS-TSVM). IMTLS-TSVM offers a significant advantage over MTLS-TSVM by operating based on the empirical risk minimization principle, which allows for better generalization performance. The model achieves this by including regularization terms in its objective function, which helps control the model's complexity and prevent overfitting. We demonstrate the effectiveness of IMTLS-TSVM by comparing it to several single-task and multi-task learning algorithms on various real-world data sets. Our results highlight the superior performance of IMTLS-TSVM in addressing multi-task learning problems.
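For readers unfamiliar with the machinery the abstract refers to, the sketch below illustrates the central computational point: in a least squares twin SVM, each of the two non-parallel hyperplanes is obtained by solving a small linear system rather than a quadratic program. This is a minimal single-task illustration, not the paper's method; the ridge term `reg` only loosely stands in for the regularization terms that IMTLS-TSVM adds to its objective, and the multi-task coupling across tasks is omitted entirely. All names (`fit_ls_twin_svm`, `predict`, the penalty parameters `c1`, `c2`) are illustrative.

```python
import numpy as np

def fit_ls_twin_svm(A, B, c1=1.0, c2=1.0, reg=1e-4):
    """Linear least squares twin SVM (single-task sketch).

    A: samples of class +1, shape (m1, n); B: samples of class -1, shape (m2, n).
    Each hyperplane comes from one linear system; `reg` is a small ridge term
    that loosely mimics the extra regularization used by IMTLS-TSVM.
    """
    m1, n = A.shape
    m2, _ = B.shape
    H = np.hstack([A, np.ones((m1, 1))])   # augmented matrix [A  e1]
    G = np.hstack([B, np.ones((m2, 1))])   # augmented matrix [B  e2]
    I = np.eye(n + 1)

    # Hyperplane 1: fits class +1 while pushing class -1 away (one linear system).
    u1 = np.linalg.solve(H.T @ H + c1 * (G.T @ G) + reg * I,
                         -c1 * (G.T @ np.ones((m2, 1))))
    # Hyperplane 2: fits class -1 while pushing class +1 away (a second linear system).
    u2 = np.linalg.solve(G.T @ G + c2 * (H.T @ H) + reg * I,
                         c2 * (H.T @ np.ones((m1, 1))))
    w1, b1 = u1[:-1, 0], float(u1[-1, 0])
    w2, b2 = u2[:-1, 0], float(u2[-1, 0])
    return (w1, b1), (w2, b2)

def predict(X, plane1, plane2):
    """Assign each row of X to the class whose hyperplane is closer."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(loc=+2.0, size=(50, 2))   # class +1 samples
    B = rng.normal(loc=-2.0, size=(50, 2))   # class -1 samples
    plane1, plane2 = fit_ls_twin_svm(A, B)
    X_test = np.vstack([A[:5], B[:5]])
    print(predict(X_test, plane1, plane2))   # expect five 1s followed by five -1s
```

Each call to np.linalg.solve here plays the role of the quadratic program that DMTSVM would otherwise solve for the corresponding hyperplane, which is the source of the reduced computational time the abstract attributes to the least squares variants.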

Source journal
Annals of Mathematics and Artificial Intelligence (Engineering & Technology - Computer Science: Artificial Intelligence)
CiteScore: 3.00
Self-citation rate: 8.30%
Articles per year: 37
Review time: >12 weeks
Journal description: Annals of Mathematics and Artificial Intelligence presents a range of topics of concern to scholars applying quantitative, combinatorial, logical, algebraic and algorithmic methods to diverse areas of Artificial Intelligence, from decision support, automated deduction, and reasoning, to knowledge-based systems, machine learning, computer vision, robotics and planning. The journal features collections of papers appearing either in volumes (400 pages) or in separate issues (100-300 pages), which focus on one topic and have one or more guest editors. Annals of Mathematics and Artificial Intelligence hopes to influence the spawning of new areas of applied mathematics and strengthen the scientific underpinnings of Artificial Intelligence.