WAFL-GAN: Wireless Ad Hoc Federated Learning for Distributed Generative Adversarial Networks

Eisuke Tomiyama, H. Esaki, H. Ochiai
2023 15th International Conference on Knowledge and Smart Technology (KST), published 2023-02-21. DOI: 10.1109/KST57286.2023.10086811

Abstract

Diverse training images are needed for a Generative Adversarial Network (GAN) to produce diverse outputs, but privacy is a major concern. Federated learning has been proposed to protect privacy, but conventional federated learning relies on a parameter server that is a third party to the clients. We propose WAFL-GAN, which requires no third party and assumes that each node participating in training is mobile and can communicate wirelessly with nearby nodes. Each node trains only on the data it holds locally, and when nodes opportunistically come into contact, they exchange and aggregate model parameters without exchanging raw data. As a result, all nodes eventually converge to a general model and produce general outputs, even when each node's dataset follows a Non-IID distribution. We evaluated WAFL-GAN on a Non-IID partition of MNIST and quantitatively showed that its output diversity can match that of conventional federated learning.
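The contact-time aggregation described above can be illustrated with a minimal sketch. This is not the paper's implementation: the `Node` class, `aggregate` function, and mixing `coefficient` are assumed names, and real GAN weights would be tensors rather than the scalar placeholders used here. It shows only the core idea that peers blend parameters on opportunistic contact while raw data never leaves a node.

```python
def aggregate(params_a, params_b, coefficient=0.5):
    """Blend two parameter dicts; coefficient is the peer's mixing weight (assumed name)."""
    return {k: (1 - coefficient) * params_a[k] + coefficient * params_b[k]
            for k in params_a}

class Node:
    """Hypothetical mobile node holding a local GAN model's parameters."""

    def __init__(self, node_id, params):
        self.node_id = node_id
        self.params = dict(params)  # stand-in for generator/discriminator weights

    def local_train_step(self, grads, lr=0.01):
        # Placeholder for local GAN training on this node's (possibly Non-IID) data.
        for k, g in grads.items():
            self.params[k] -= lr * g

    def on_contact(self, peer, coefficient=0.5):
        # Opportunistic contact: exchange and aggregate parameters only,
        # never the underlying training images.
        merged_self = aggregate(self.params, peer.params, coefficient)
        merged_peer = aggregate(peer.params, self.params, coefficient)
        self.params, peer.params = merged_self, merged_peer

a = Node("A", {"w": 1.0})
b = Node("B", {"w": 3.0})
a.on_contact(b)  # both nodes now hold the averaged parameter w = 2.0
```

With a symmetric coefficient of 0.5 both peers end up with the same averaged parameters, so repeated encounters gradually spread a general model across all nodes without any central server.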