MT-CooL: Multi-Task Cooperative Learning via Flat Minima Searching

Fuping Wu; Le Zhang; Yang Sun; Yuanhan Mo; Thomas E. Nichols; Bartłomiej W. Papież

IEEE Transactions on Medical Imaging, vol. 44, no. 4, pp. 1648-1658. Published 2024-12-05. DOI: 10.1109/TMI.2024.3512173
URL: https://ieeexplore.ieee.org/document/10778634/
Citations: 0
Abstract
While multi-task learning (MTL) has been widely developed for natural image analysis, its potential for enhancing performance in medical imaging remains relatively unexplored. Most methods formulate MTL as a multi-objective problem, inherently forcing all tasks to compete with each other during optimization. In this work, we propose a novel approach by formulating MTL as a multi-level optimization problem, in which the features learned from one task are optimized by benefiting from the other tasks. Specifically, we advocate for a cooperative approach where each task considers the features of others, enabling individual performance enhancement without detriment to others. To achieve this objective, we introduce a novel optimization strategy aimed at seeking flat minima for each sub-problem, fostering the learning of robust sub-models resilient to changes in other sub-models. We demonstrate the advantages of our proposed method through comprehensive parameter and comparison studies on the OrganCMNIST dataset. Additionally, we evaluate its efficacy on three eye-related medical image datasets, comparing its performance against other state-of-the-art MTL approaches. The results highlight the superiority of our method over existing approaches, showcasing its potential for training multi-purpose models in medical image analysis.
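The abstract does not give the authors' exact optimization procedure, but the core ingredient it names, seeking flat minima for each sub-problem, can be illustrated with a generic sharpness-aware (SAM-style) update on a toy loss. Everything below is a hypothetical sketch, not the paper's method: the function names, the quadratic toy loss, and the hyperparameters `lr` and `rho` are all assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch only: a generic SAM-style flat-minima update on a toy
# quadratic loss. The paper's multi-level cooperative scheme is not specified
# in the abstract; all names and values here are hypothetical.

TARGET = np.array([1.0, -2.0])  # known minimizer of the toy loss

def loss(w):
    # Toy per-task loss: simple quadratic centered at TARGET.
    return np.sum((w - TARGET) ** 2)

def grad(w):
    # Analytic gradient of the quadratic loss.
    return 2.0 * (w - TARGET)

def flat_minima_step(w, lr=0.1, rho=0.05):
    """One SAM-style update: first perturb the weights toward the locally
    sharpest point inside a small rho-ball, then descend using the gradient
    evaluated at that perturbed point, which biases training toward flat
    minima that are robust to small weight changes."""
    g = grad(w)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)  # ascent direction, scaled to the rho-ball
    g_sharp = grad(w + eps)                       # gradient at the perturbed weights
    return w - lr * g_sharp

w = np.zeros(2)
for _ in range(200):
    w = flat_minima_step(w)
print(np.round(w, 3))  # settles close to TARGET = [1, -2]
```

The two-step structure (worst-case perturbation, then descent) is what makes the resulting sub-model tolerant to small changes in shared parameters, which is the property the abstract attributes to its cooperative formulation.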