Hypernetwork-driven centralized contrastive learning for federated graph classification

Jianian Zhu, Yichen Li, Haozhao Wang, Yining Qi, Ruixuan Li

World Wide Web, published 2024-08-16. DOI: 10.1007/s11280-024-01292-1
In Graph Federated Learning (GFL), prevalent methods often focus on local client data, which limits the model's view of broader global patterns and struggles with Non-IID (non-independent and identically distributed) data across cross-domain datasets. Direct aggregation reduces the differences among clients, which is detrimental when client datasets require personalization. Contrastive Learning (CL) has emerged as an effective tool for enhancing a model's ability to distinguish variations across diverse views, but it has not been fully leveraged in GFL. This study introduces a novel hypernetwork-based method, termed CCL (Centralized Contrastive Learning), a server-centric design that addresses the challenges traditional client-centric approaches face on heterogeneous datasets. CCL integrates global patterns from multiple clients, capturing a wider range of patterns and significantly improving GFL performance. Our extensive experiments, covering both supervised and unsupervised scenarios, demonstrate CCL's superiority over existing models, its compatibility with standard backbones, and its ability to enhance GFL performance across various settings.
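The abstract gives no implementation details, but the core idea it describes, contrastive learning performed on the server over graph representations gathered from all clients, with a hypernetwork producing client-specific parameters, can be illustrated with a minimal sketch. Everything below (class and function names, the NT-Xent-style loss, the projection-head hypernetwork, tensor shapes) is an illustrative assumption, not the authors' actual CCL implementation.

```python
# Hypothetical sketch of server-side ("centralized") contrastive learning over
# client graph embeddings, driven by a hypernetwork. Names and structure are
# assumptions made for illustration; the paper's CCL method may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HyperNet(nn.Module):
    """Maps a learnable client descriptor to a client-specific projection matrix."""

    def __init__(self, client_dim: int, feat_dim: int, proj_dim: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(client_dim, 128),
            nn.ReLU(),
            nn.Linear(128, feat_dim * proj_dim),
        )
        self.feat_dim, self.proj_dim = feat_dim, proj_dim

    def forward(self, client_desc: torch.Tensor) -> torch.Tensor:
        # One (feat_dim x proj_dim) projection head generated per client.
        return self.mlp(client_desc).view(self.feat_dim, self.proj_dim)


def centralized_contrastive_loss(client_views, hypernet, client_descs, tau=0.5):
    """NT-Xent-style loss computed on the server over all clients' graph embeddings.

    client_views: list over clients of (view1, view2) tensors, each of shape
                  (num_graphs_i, feat_dim) -- two augmented views per graph.
    client_descs: (num_clients, client_dim) learnable client descriptors.
    """
    z1, z2 = [], []
    for i, (v1, v2) in enumerate(client_views):
        w = hypernet(client_descs[i])            # client-specific projection
        z1.append(F.normalize(v1 @ w, dim=1))
        z2.append(F.normalize(v2 @ w, dim=1))
    z1, z2 = torch.cat(z1), torch.cat(z2)        # pool projections across clients
    z = torch.cat([z1, z2], dim=0)               # (2N, proj_dim)
    sim = z @ z.t() / tau                        # pairwise cosine similarities
    n = z1.size(0)
    # The positive pair for row i is the other augmented view of the same graph;
    # every other row (same client or not) acts as a negative.
    labels = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    mask = torch.eye(2 * n, dtype=torch.bool, device=sim.device)
    sim = sim.masked_fill(mask, float("-inf"))   # exclude self-similarity
    return F.cross_entropy(sim, labels)


# Example server round (shapes are placeholders):
# hypernet = HyperNet(client_dim=16, feat_dim=64, proj_dim=32)
# client_descs = nn.Parameter(torch.randn(num_clients, 16))
# loss = centralized_contrastive_loss(uploaded_views, hypernet, client_descs)
```

Under these assumptions, the server would optimize the hypernetwork and the client descriptors with this loss each communication round, while the GNN backbones that produce the uploaded graph embeddings remain on the clients; pooling the projections before computing the loss is what lets the contrastive objective see patterns across all clients rather than within a single local dataset.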