{"title":"自适应图拉普拉斯 MTL L1、L2 和 LS-SVM","authors":"Carlos Ruiz, Carlos M Alaíz, José R Dorronsoro","doi":"10.1093/jigpal/jzae025","DOIUrl":null,"url":null,"abstract":"Multi-Task Learning tries to improve the learning process of different tasks by solving them simultaneously. A popular Multi-Task Learning formulation for SVM is to combine common and task-specific parts. Other approaches rely on using a Graph Laplacian regularizer. Here we propose a combination of these two approaches that can be applied to L1, L2 and LS-SVMs. We also propose an algorithm to iteratively learn the graph adjacency matrix used in the Laplacian regularization. We test our proposal with synthetic and real problems, both in regression and classification settings. When the task structure is present, we show that our model is able to detect it, which leads to better results, and we also show it to be competitive even when this structure is not present.","PeriodicalId":0,"journal":{"name":"","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Adaptive graph Laplacian MTL L1, L2 and LS-SVMs\",\"authors\":\"Carlos Ruiz, Carlos M Alaíz, José R Dorronsoro\",\"doi\":\"10.1093/jigpal/jzae025\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Multi-Task Learning tries to improve the learning process of different tasks by solving them simultaneously. A popular Multi-Task Learning formulation for SVM is to combine common and task-specific parts. Other approaches rely on using a Graph Laplacian regularizer. Here we propose a combination of these two approaches that can be applied to L1, L2 and LS-SVMs. We also propose an algorithm to iteratively learn the graph adjacency matrix used in the Laplacian regularization. We test our proposal with synthetic and real problems, both in regression and classification settings. When the task structure is present, we show that our model is able to detect it, which leads to better results, and we also show it to be competitive even when this structure is not present.\",\"PeriodicalId\":0,\"journal\":{\"name\":\"\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0,\"publicationDate\":\"2024-03-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1093/jigpal/jzae025\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1093/jigpal/jzae025","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Multi-Task Learning aims to improve the learning of several tasks by solving them simultaneously. A popular Multi-Task Learning formulation for SVMs combines a common part with task-specific parts. Other approaches rely on a Graph Laplacian regularizer. Here we propose a combination of these two approaches that can be applied to L1, L2 and LS-SVMs. We also propose an algorithm to iteratively learn the graph adjacency matrix used in the Laplacian regularization. We test our proposal on synthetic and real problems, in both regression and classification settings. We show that when task structure is present our model is able to detect it, which leads to better results, and that it remains competitive even when such structure is absent.
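As a rough illustration of how the two regularizers mentioned in the abstract can be combined (the exact objective, weights and notation below are assumptions for exposition, not taken from the paper), a Laplacian-regularized common-plus-specific MTL-SVM objective could take the form

\[
\min_{w,\{v_t\},\{b_t\}} \;
C \sum_{t=1}^{T} \sum_{i=1}^{n_t} \ell\big( y_i^t,\, (w + v_t)^\top x_i^t + b_t \big)
\;+\; \lambda_1 \|w\|^2
\;+\; \lambda_2 \sum_{t=1}^{T} \|v_t\|^2
\;+\; \mu \sum_{t,s=1}^{T} A_{ts} \,\|w_t - w_s\|^2 ,
\]

where w_t = w + v_t is the model of task t, \ell is the hinge (L1), squared hinge (L2) or squared (LS-SVM) loss, and A is the graph adjacency matrix over tasks. Assuming A is symmetric, the last term equals 2\,\mathrm{tr}(W L W^\top) with W = (w_1, \dots, w_T), L = D - A the graph Laplacian and D_{tt} = \sum_s A_{ts}; it penalizes differences between the models of tasks that the graph links strongly, while the first two regularizers enforce the shared-plus-specific decomposition. The adjacency matrix A itself would then be re-estimated iteratively from the learned task models, as proposed in the paper.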