DecFL: An Ubiquitous Decentralized Model Training Protocol and Framework Empowered by Blockchain

Felix Morsbach, S. Toor
{"title":"DecFL:由区块链授权的无处不在的去中心化模型训练协议和框架","authors":"Felix Morsbach, S. Toor","doi":"10.1145/3457337.3457842","DOIUrl":null,"url":null,"abstract":"Machine learning has become ubiquitous across many fields in the last decade and modern real world applications often require a decentralized solution for training such models. This demand sprouted the research in federated learning, which solves some of the challenges with centralized machine learning, but at the same times raises further questions in regard to security, privacy and scalability. We have designed and implemented DecFL, an ubiquitous protocol for decentralized model training. The protocol is machine-learning-model-, vendor-, and technology-agnostic and provides a basis for practitioner's own implementations. The implemented DecFL framework presented in this article is an exemplary realization of the carefully designed protocol stack based on Ethereum and IPFS and offers a scalable baseline solution for decentralized machine learning. In this article, we present a study based on the proposed protocol, its theoretical bounds and experiments based on the implemented framework. Using open-source datasets (MNIST and CIFAR10), we demonstrate key features, the actual cost of training a model (in euro) and the communication overhead. We further show that through a proper choice of technologies DecFL achieves a linear scaling, which is a non-trivial task in a decentralized setting. Along with discussing some of the security challenges in the field, we highlight aggregation poisoning as a relevant attack vector, its associated risks and a possible prevention strategy for decentralized model training through DecFL.","PeriodicalId":270073,"journal":{"name":"Proceedings of the 3rd ACM International Symposium on Blockchain and Secure Critical Infrastructure","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2021-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"DecFL: An Ubiquitous Decentralized Model Training Protocol and Framework Empowered by Blockchain\",\"authors\":\"Felix Morsbach, S. Toor\",\"doi\":\"10.1145/3457337.3457842\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Machine learning has become ubiquitous across many fields in the last decade and modern real world applications often require a decentralized solution for training such models. This demand sprouted the research in federated learning, which solves some of the challenges with centralized machine learning, but at the same times raises further questions in regard to security, privacy and scalability. We have designed and implemented DecFL, an ubiquitous protocol for decentralized model training. The protocol is machine-learning-model-, vendor-, and technology-agnostic and provides a basis for practitioner's own implementations. The implemented DecFL framework presented in this article is an exemplary realization of the carefully designed protocol stack based on Ethereum and IPFS and offers a scalable baseline solution for decentralized machine learning. In this article, we present a study based on the proposed protocol, its theoretical bounds and experiments based on the implemented framework. Using open-source datasets (MNIST and CIFAR10), we demonstrate key features, the actual cost of training a model (in euro) and the communication overhead. 
We further show that through a proper choice of technologies DecFL achieves a linear scaling, which is a non-trivial task in a decentralized setting. Along with discussing some of the security challenges in the field, we highlight aggregation poisoning as a relevant attack vector, its associated risks and a possible prevention strategy for decentralized model training through DecFL.\",\"PeriodicalId\":270073,\"journal\":{\"name\":\"Proceedings of the 3rd ACM International Symposium on Blockchain and Secure Critical Infrastructure\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-05-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 3rd ACM International Symposium on Blockchain and Secure Critical Infrastructure\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3457337.3457842\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 3rd ACM International Symposium on Blockchain and Secure Critical Infrastructure","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3457337.3457842","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Machine learning has become ubiquitous across many fields in the last decade, and modern real-world applications often require a decentralized solution for training such models. This demand sparked research in federated learning, which solves some of the challenges of centralized machine learning but at the same time raises further questions regarding security, privacy, and scalability. We have designed and implemented DecFL, a ubiquitous protocol for decentralized model training. The protocol is machine-learning-model-, vendor-, and technology-agnostic and provides a basis for practitioners' own implementations. The DecFL framework presented in this article is an exemplary realization of the carefully designed protocol stack based on Ethereum and IPFS and offers a scalable baseline solution for decentralized machine learning. In this article, we present a study of the proposed protocol, its theoretical bounds, and experiments based on the implemented framework. Using open-source datasets (MNIST and CIFAR10), we demonstrate key features, the actual cost of training a model (in euros), and the communication overhead. We further show that, through a proper choice of technologies, DecFL achieves linear scaling, which is a non-trivial task in a decentralized setting. Along with discussing some of the security challenges in the field, we highlight aggregation poisoning as a relevant attack vector, its associated risks, and a possible prevention strategy for decentralized model training through DecFL.
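
To make the aggregation-poisoning threat mentioned in the abstract concrete, the sketch below is an illustrative toy example, not code from the DecFL framework: it assumes client updates are plain NumPy arrays and contrasts naive federated averaging, which a single extreme update can drag arbitrarily far, with a coordinate-wise median, one possible robust-aggregation defense.

```python
import numpy as np

def mean_aggregate(updates):
    """Naive federated averaging: element-wise mean of client updates."""
    return np.mean(updates, axis=0)

def median_aggregate(updates):
    """Coordinate-wise median: one possible robust aggregation rule
    that tolerates a minority of poisoned updates."""
    return np.median(updates, axis=0)

# Hypothetical round: four honest clients send similar small updates,
# one attacker submits an extreme update to poison the aggregate.
rng = np.random.default_rng(0)
honest = [np.array([0.10, -0.20, 0.05]) + rng.normal(0, 0.01, 3)
          for _ in range(4)]
poisoned = [np.array([100.0, 100.0, 100.0])]  # aggregation-poisoning attempt
updates = np.stack(honest + poisoned)

print("mean  :", mean_aggregate(updates))    # pulled far off by the attacker
print("median:", median_aggregate(updates))  # stays near the honest consensus
```

In a DecFL-style deployment, the updates themselves would live in content-addressed storage such as IPFS, with only their references coordinated through the Ethereum smart contract; the choice of aggregation rule shown here is orthogonal to that transport layer.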