Title: PerHeFed: A general framework of personalized federated learning for heterogeneous convolutional neural networks
Authors: Le Ma, YuYing Liao, Bin Zhou, Wen Xi
DOI: 10.1007/s11280-022-01119-x
Journal: World Wide Web: Internet and Web Information Systems, pp. 1-23
Published: 2022-12-12 (Journal Article)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9743105/pdf/
PerHeFed: A general framework of personalized federated learning for heterogeneous convolutional neural networks.
In conventional federated learning, every device must train a model with the same network structure. This greatly hinders the application of federated learning in settings where data and devices are highly heterogeneous owing to differing hardware and communication networks. At the same time, existing studies have shown that transmitting all of the model parameters not only incurs heavy communication costs but also increases the risk of privacy leakage. We propose PerHeFed, a general framework for personalized federated learning that enables devices to design their local model structures autonomously and to share sub-models without structural restrictions. In PerHeFed, a simple but effective mapping relation and a novel personalized sub-model aggregation method are proposed so that heterogeneous sub-models can be aggregated. By dividing aggregation into two primitive types (inter-layer and intra-layer), PerHeFed applies to any combination of heterogeneous convolutional neural networks, which we believe satisfies the personalization requirements of heterogeneous models. Experiments show that, compared with a state-of-the-art method (FLOP) on non-IID data, our method compresses the shared sub-model parameters by ≈50% with only a 4.38% drop in accuracy on the SVHN dataset, while on CIFAR-10 PerHeFed even achieves a 0.3% improvement in accuracy. To the best of our knowledge, this is the first general personalized federated learning framework for heterogeneous convolutional networks, even across different network architectures, addressing the uniform-model-structure constraint of conventional federated learning.
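To make the two aggregation primitives concrete, the following is a minimal, hypothetical sketch of how heterogeneous convolutional weights might be merged: intra-layer aggregation averages only the overlapping channel slice of kernels with different channel counts, and inter-layer aggregation applies this layer by layer under a client-to-global layer mapping. The function names, the slice-based overlap rule, and the `mapping` structure are illustrative assumptions, not the paper's actual mapping relation or aggregation algorithm.

```python
import numpy as np

def aggregate_intra_layer(weights):
    """Intra-layer aggregation sketch: average heterogeneous conv kernels
    over their shared leading slice.

    Each weight has shape (out_ch, in_ch, k, k); clients may use different
    channel counts, so only the common slice is averaged. Each client keeps
    its own shape: the shared slice is overwritten, the personalized
    remainder is left untouched. (Illustrative only -- not the paper's
    actual mapping rule.)
    """
    min_out = min(w.shape[0] for w in weights)
    min_in = min(w.shape[1] for w in weights)
    shared = np.mean([w[:min_out, :min_in] for w in weights], axis=0)
    updated = []
    for w in weights:
        new_w = w.copy()
        new_w[:min_out, :min_in] = shared
        updated.append(new_w)
    return updated

def aggregate_inter_layer(models, mapping):
    """Inter-layer aggregation sketch: merge layer by layer under a
    client-to-global layer mapping.

    `models` is a list of per-client layer lists; `mapping[c][g]` gives the
    index of client c's layer assigned to global layer g (a hypothetical
    mapping structure standing in for the paper's mapping relation).
    """
    n_global = len(mapping[0])
    for g in range(n_global):
        group = [models[c][mapping[c][g]] for c in range(len(models))]
        merged = aggregate_intra_layer(group)
        for c in range(len(models)):
            models[c][mapping[c][g]] = merged[c]
    return models
```

Under this sketch, a client with a wider layer shares only the slice that overlaps with the narrowest participant, which mirrors the abstract's claim that sub-models can be shared and aggregated without structural restrictions while roughly halving the transmitted parameters.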
About the journal:
World Wide Web: Internet and Web Information Systems (WWW) is an international, archival, peer-reviewed journal which covers all aspects of the World Wide Web, including issues related to architectures, applications, Internet and Web information systems, and communities. The purpose of this journal is to provide an international forum for researchers, professionals, and industrial practitioners to share their rapidly developing knowledge and report on new advances in Internet and web-based systems. The journal also focuses on all database- and information-system topics that relate to the Internet and the Web, particularly on ways to model, design, develop, integrate, and manage these systems.
Appearing quarterly, the journal publishes (1) papers describing original ideas and new results, (2) vision papers, (3) reviews of important techniques in related areas, (4) innovative application papers, and (5) progress reports on major international research projects. Papers published in the WWW journal deal with subjects directly or indirectly related to the World Wide Web. The WWW journal provides timely, in-depth coverage of the most recent developments in the World Wide Web discipline to enable anyone involved to keep up-to-date with this dynamically changing technology.