{"title":"可信执行环境下的分布式学习:SGX中联邦学习的案例研究","authors":"Tianxing Xu, Konglin Zhu, A. Andrzejak, Lin Zhang","doi":"10.1109/IC-NIDC54101.2021.9660433","DOIUrl":null,"url":null,"abstract":"Federated Learning (FL) is a distributed machine learning paradigm to solve isolated data island problems under privacy constraints. Recent works reveal that FL still exists security problems in which attackers can infer private data from gradients. In this paper, we propose a distributed FL framework in Trusted Execution Environment (TEE) to protect gradients in the perspective of hardware. We use trusted Software Guard eXtensions (SGX) as an instance to implement the FL, and proposed an SGX-FL framework. Firstly, to break through the limitation of physical memory space in SGX and meanwhile preserve the privacy, we leverage a gradient filtering mechanism to obtain the “important” gradients which preserve the utmost data privacy and put them into SGX. Secondly, to enhance the global adhesion of gradients so that the important gradients can be aggregated at maximum, a grouping method is carried out to put the most appropriate number of members into one group. Finally, to keep the accuracy of the FL model, the secondary gradients of group members and aggregated important gradients are simultaneously uploaded to the server and the computation procedure is validated by the integrity method of SGX. 
The evaluation results show that the proposed SGX-FL reduces the computation cost by 19 times compared with the existing approaches.","PeriodicalId":264468,"journal":{"name":"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)","volume":"232 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Distributed Learning in Trusted Execution Environment: A Case Study of Federated Learning in SGX\",\"authors\":\"Tianxing Xu, Konglin Zhu, A. Andrzejak, Lin Zhang\",\"doi\":\"10.1109/IC-NIDC54101.2021.9660433\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated Learning (FL) is a distributed machine learning paradigm to solve isolated data island problems under privacy constraints. Recent works reveal that FL still exists security problems in which attackers can infer private data from gradients. In this paper, we propose a distributed FL framework in Trusted Execution Environment (TEE) to protect gradients in the perspective of hardware. We use trusted Software Guard eXtensions (SGX) as an instance to implement the FL, and proposed an SGX-FL framework. Firstly, to break through the limitation of physical memory space in SGX and meanwhile preserve the privacy, we leverage a gradient filtering mechanism to obtain the “important” gradients which preserve the utmost data privacy and put them into SGX. Secondly, to enhance the global adhesion of gradients so that the important gradients can be aggregated at maximum, a grouping method is carried out to put the most appropriate number of members into one group. Finally, to keep the accuracy of the FL model, the secondary gradients of group members and aggregated important gradients are simultaneously uploaded to the server and the computation procedure is validated by the integrity method of SGX. 
The evaluation results show that the proposed SGX-FL reduces the computation cost by 19 times compared with the existing approaches.\",\"PeriodicalId\":264468,\"journal\":{\"name\":\"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)\",\"volume\":\"232 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-11-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IC-NIDC54101.2021.9660433\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IC-NIDC54101.2021.9660433","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Distributed Learning in Trusted Execution Environment: A Case Study of Federated Learning in SGX
Federated Learning (FL) is a distributed machine learning paradigm for solving isolated-data-island problems under privacy constraints. Recent work reveals that FL still suffers from security problems in which attackers can infer private data from shared gradients. In this paper, we propose a distributed FL framework built on a Trusted Execution Environment (TEE) to protect gradients at the hardware level. We use Intel Software Guard eXtensions (SGX) as a concrete instance and propose the SGX-FL framework. First, to overcome the limited physical memory of SGX while still preserving privacy, we apply a gradient filtering mechanism that selects the "important" gradients, those carrying the most private information, and places only these inside SGX. Second, to enhance the global cohesion of gradients so that the important gradients can be aggregated as fully as possible, a grouping method assigns an appropriate number of members to each group. Finally, to maintain the accuracy of the FL model, the secondary gradients of group members and the aggregated important gradients are uploaded to the server together, and the computation procedure is verified through SGX's integrity mechanism. Evaluation results show that the proposed SGX-FL reduces computation cost by a factor of 19 compared with existing approaches.
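The abstract does not specify how "important" gradients are chosen; a common heuristic in gradient-compression work, and a plausible stand-in here, is magnitude-based top-k selection. The sketch below is a minimal illustration under that assumption (the function name and the choice of magnitude as the importance score are ours, not from the paper): the top-k indices would go into the SGX enclave, while the remaining secondary gradients are sent to the server directly.

```python
def split_gradients(grads, k):
    """Split a flat gradient vector into 'important' and 'secondary' parts.

    Hypothetical sketch: uses absolute magnitude as the importance score,
    which the paper's abstract does not confirm. Returns two index lists:
    the k largest-magnitude entries (for enclave processing) and the rest.
    """
    # Sort indices by gradient magnitude, largest first.
    idx = sorted(range(len(grads)), key=lambda i: abs(grads[i]), reverse=True)
    return idx[:k], idx[k:]


grads = [0.02, -1.5, 0.3, 0.9, -0.1]
important, secondary = split_gradients(grads, k=2)
# important -> [1, 3]  (entries -1.5 and 0.9 have the largest magnitudes)
# secondary -> [2, 4, 0]
```

Such a split also addresses the enclave's memory constraint mentioned in the abstract, since only k entries (rather than the full gradient vector) need to reside in SGX's limited protected memory.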