Multi-Learning Generalised Low-Rank Models

Francois Buet-Golfouse, Parth Pahwa
2022 21st IEEE International Conference on Machine Learning and Applications (ICMLA)
DOI: 10.1109/ICMLA55696.2022.00142
Published: December 2022

Abstract

Multi-output supervised learning and multi-task learning are both instances of a broader learning paradigm in which features, parameters and objectives are shared to some extent. Examples of such approaches include reusing features from pre-existing models in a new algorithm, performing multi-label regression, or optimising for several tasks jointly. In this paper, we address this challenge by devising a generic framework based on generalised low-rank models ("GLRMs"), which include, broadly speaking, most techniques that can be expressed in terms of matrix factorisation. Importantly, while GLRMs first and foremost tackle unsupervised learning problems and supervised linear models, we show here that they can be extended, by introducing multivariate functionals and structured regularisation terms, to handle multivariate learning. This paper also proposes a coherent framework for designing multi-learning strategies that covers existing algorithms. Finally, we demonstrate the simplicity and effectiveness of our approach on empirical data.
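The paper's own multi-learning extension is not reproduced here, but the matrix-factorisation viewpoint the abstract refers to can be illustrated with the simplest GLRM instance: quadratic loss with ridge regularisation, fitted by alternating least squares. The sketch below is a hypothetical illustration (function name and parameters are our own, not from the paper), where each alternating update has a closed-form ridge-regression solution.

```python
import numpy as np

def glrm_quadratic(A, k, gamma=0.1, n_iters=50, seed=0):
    """Fit a rank-k model A ~ X @ Y by alternating ridge regressions,
    i.e. minimise ||A - X Y||_F^2 + gamma * (||X||_F^2 + ||Y||_F^2)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.standard_normal((m, k))
    Y = rng.standard_normal((k, n))
    reg = gamma * np.eye(k)
    for _ in range(n_iters):
        # With Y fixed, the optimal X solves (Y Y^T + gamma I) X^T = Y A^T.
        X = np.linalg.solve(Y @ Y.T + reg, Y @ A.T).T
        # With X fixed, the optimal Y solves (X^T X + gamma I) Y = X^T A.
        Y = np.linalg.solve(X.T @ X + reg, X.T @ A)
    return X, Y

# Recover a noiseless rank-2 matrix from its factorisation.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 15))
X, Y = glrm_quadratic(A, k=2, gamma=1e-6, n_iters=100)
err = np.linalg.norm(A - X @ Y) / np.linalg.norm(A)
print(f"relative reconstruction error: {err:.4f}")
```

Swapping the quadratic loss for another divergence, or the ridge penalty for a structured regulariser, recovers other members of the GLRM family; the alternating-minimisation structure stays the same.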