When Multitask Learning Meets Partial Supervision: A Computer Vision Review

Impact Factor: 23.2 · CAS Tier 1 (Computer Science) · JCR Q1 (Engineering, Electrical & Electronic)
Maxime Fontana;Michael Spratling;Miaojing Shi
DOI: 10.1109/JPROC.2024.3435012
Journal: Proceedings of the IEEE, vol. 112, no. 6, pp. 516-543
Publication date: 2024-08-07 (Journal Article)
Full text: https://ieeexplore.ieee.org/document/10628096/
Citations: 0

Abstract

Multitask learning (MTL) aims to learn multiple tasks simultaneously while exploiting their mutual relationships. By using shared resources to simultaneously calculate multiple outputs, this learning paradigm has the potential to have lower memory requirements and inference times compared to the traditional approach of using separate methods for each task. Previous work in MTL has mainly focused on fully supervised methods, as task relationships (TRs) can not only be leveraged to lower the level of data dependency of those methods but also improve the performance. However, MTL introduces a set of challenges due to a complex optimization scheme and a higher labeling requirement. This article focuses on how MTL could be utilized under different partial supervision settings to address these challenges. First, this article analyses how MTL traditionally uses different parameter sharing techniques to transfer knowledge in between tasks. Second, it presents different challenges arising from such a multiobjective optimization (MOO) scheme. Third, it introduces how task groupings (TGs) can be achieved by analyzing TRs. Fourth, it focuses on how partially supervised methods applied to MTL can tackle the aforementioned challenges. Lastly, this article presents the available datasets, tools, and benchmarking results of such methods. The reviewed articles, categorized following this work, are available at https://github.com/Klodivio355/MTL-CV-Review .
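The abstract mentions two core MTL ingredients: parameter sharing between tasks and a multiobjective loss. A minimal NumPy sketch of "hard" parameter sharing is shown below — one shared encoder feeding several task-specific heads, trained against a weighted sum of per-task losses. This is an illustrative toy, not code from the paper; the names (`W_shared`, `W_seg`, `W_dep`) and the two hypothetical tasks are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared encoder: a single set of parameters reused by every task
# (the "hard parameter sharing" scheme).
W_shared = rng.normal(size=(16, 8))

# Task-specific heads: small per-task output layers.
W_seg = rng.normal(size=(8, 4))   # e.g. a segmentation-like task head
W_dep = rng.normal(size=(8, 1))   # e.g. a depth-like regression head

def forward(x):
    """One forward pass producing both task outputs from one shared feature."""
    h = np.tanh(x @ W_shared)      # shared representation, computed once
    return h @ W_seg, h @ W_dep    # both heads read the same features

x = rng.normal(size=(32, 16))      # a batch of 32 inputs
out_seg, out_dep = forward(x)
print(out_seg.shape, out_dep.shape)

# A common multiobjective formulation: a weighted sum of per-task losses.
# Choosing these weights well is one of the optimization challenges the
# review discusses.
loss = 0.5 * np.mean(out_seg**2) + 0.5 * np.mean(out_dep**2)
```

Because the encoder is evaluated once per input regardless of the number of heads, this layout yields the lower memory and inference cost the abstract attributes to MTL relative to per-task models.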
Source journal

Proceedings of the IEEE (Engineering Technology — Electronic & Electrical Engineering)

CiteScore: 46.40
Self-citation rate: 1.00%
Articles per year: 160
Review time: 3-8 weeks
Journal description: Proceedings of the IEEE is the leading journal to provide in-depth review, survey, and tutorial coverage of the technical developments in electronics, electrical and computer engineering, and computer science. Consistently ranked as one of the top journals by Impact Factor, Article Influence Score and more, the journal serves as a trusted resource for engineers around the world.