{"title":"用于保护隐私的神经网络的高效同态 Argmax 近似算法","authors":"Peng Zhang, Ao Duan, Hengrui Lu","doi":"10.3390/cryptography8020018","DOIUrl":null,"url":null,"abstract":"Privacy-preserving neural networks offer a promising solution to train and predict without user privacy leakage, and fully homomorphic encryption (FHE) stands out as one of the key technologies, as it enables homomorphic operations over encrypted data. However, only addition and multiplication homomorphisms are supported by FHE, and thus, it faces huge challenges when implementing non-linear functions with ciphertext inputs. Among the non-linear functions in neural networks, one may refer to the activation function, the argmax function, and maximum pooling. Inspired by using a composition of low-degree minimax polynomials to approximate sign and argmax functions, this study focused on optimizing the homomorphic argmax approximation, where argmax is a mathematical operation that identifies the index of the maximum value within a given set of values. For the method that uses compositions of low-degree minimax polynomials to approximate argmax, in order to further reduce approximation errors and improve computational efficiency, we propose an improved homomorphic argmax approximation algorithm that includes rotation accumulation, tree-structured comparison, normalization, and finalization phases. And then, the proposed homomorphic argmax algorithm was integrated into a neural network structure. Comparative experiments indicate that the network with our proposed argmax algorithm achieved a slight increase in accuracy while significantly reducing the inference latency by 58%, as the homomorphic sign and rotation operations were rapidly reduced.","PeriodicalId":36072,"journal":{"name":"Cryptography","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2024-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An Efficient Homomorphic Argmax Approximation for Privacy-Preserving Neural Networks\",\"authors\":\"Peng Zhang, Ao Duan, Hengrui Lu\",\"doi\":\"10.3390/cryptography8020018\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Privacy-preserving neural networks offer a promising solution to train and predict without user privacy leakage, and fully homomorphic encryption (FHE) stands out as one of the key technologies, as it enables homomorphic operations over encrypted data. However, only addition and multiplication homomorphisms are supported by FHE, and thus, it faces huge challenges when implementing non-linear functions with ciphertext inputs. Among the non-linear functions in neural networks, one may refer to the activation function, the argmax function, and maximum pooling. Inspired by using a composition of low-degree minimax polynomials to approximate sign and argmax functions, this study focused on optimizing the homomorphic argmax approximation, where argmax is a mathematical operation that identifies the index of the maximum value within a given set of values. For the method that uses compositions of low-degree minimax polynomials to approximate argmax, in order to further reduce approximation errors and improve computational efficiency, we propose an improved homomorphic argmax approximation algorithm that includes rotation accumulation, tree-structured comparison, normalization, and finalization phases. And then, the proposed homomorphic argmax algorithm was integrated into a neural network structure. 
Comparative experiments indicate that the network with our proposed argmax algorithm achieved a slight increase in accuracy while significantly reducing the inference latency by 58%, as the homomorphic sign and rotation operations were rapidly reduced.\",\"PeriodicalId\":36072,\"journal\":{\"name\":\"Cryptography\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2024-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Cryptography\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3390/cryptography8020018\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cryptography","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/cryptography8020018","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Privacy-preserving neural networks offer a promising way to train and predict without leaking user privacy, and fully homomorphic encryption (FHE) is one of the key enabling technologies, as it supports computation directly on encrypted data. However, FHE natively supports only addition and multiplication homomorphisms, so implementing non-linear functions on ciphertext inputs is a major challenge. The non-linear functions in neural networks include the activation function, the argmax function, and maximum pooling. Inspired by the use of compositions of low-degree minimax polynomials to approximate the sign and argmax functions, this study focuses on optimizing the homomorphic argmax approximation, where argmax identifies the index of the maximum value within a given set of values. To further reduce the approximation error and improve the computational efficiency of this minimax-composition method, we propose an improved homomorphic argmax approximation algorithm consisting of four phases: rotation accumulation, tree-structured comparison, normalization, and finalization. The proposed homomorphic argmax algorithm is then integrated into a neural network. Comparative experiments show that the network with our proposed argmax algorithm achieves a slight increase in accuracy while reducing inference latency by 58%, as the number of homomorphic sign evaluations and rotation operations is sharply reduced.
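
To make the underlying idea concrete, below is a minimal plaintext sketch (NumPy) of approximating argmax through repeated composition of a low-degree odd polynomial that approximates the sign function, which is the general approach the abstract builds on. The specific polynomial f(x) = 1.5x - 0.5x^3, the number of composition rounds, the all-pairs comparison layout, and the scaling constants are illustrative assumptions only; they are not the paper's minimax polynomials, and the sketch does not reproduce the proposed rotation-accumulation, tree-structured comparison, normalization, and finalization phases over ciphertexts.

```python
import numpy as np


def sign_approx(x, rounds=5):
    """Approximate sign(x) for x roughly in [-1, 1] by composing a low-degree odd polynomial.

    Each round applies f(x) = 1.5x - 0.5x^3, which pushes values toward +/-1;
    repeated composition sharpens the approximation near 0. (Illustrative
    polynomial only; the paper's minimax polynomials and degrees differ.)
    """
    for _ in range(rounds):
        x = 1.5 * x - 0.5 * x ** 3
    return x


def argmax_approx(values, rounds=5):
    """Return an approximate one-hot vector marking the maximum of `values`.

    Assumes the inputs are scaled so that pairwise differences lie in [-1, 1].
    """
    v = np.asarray(values, dtype=float)
    n = len(v)
    # Pairwise comparisons: comp[i, j] ~ +1 if v[i] > v[j], ~ -1 if v[i] < v[j].
    comp = sign_approx(v[:, None] - v[None, :], rounds)
    # Only the maximum "wins" every comparison, so only its row sums to ~ n - 1;
    # every other row sums to at most ~ n - 3.
    scores = comp.sum(axis=1)
    # Normalization/finalization: map the winning score to ~ +1 and the rest to ~ -1,
    # then rescale to {0, 1}. Clipping keeps the toy polynomial in its convergent
    # range; an FHE implementation would handle the scaling differently.
    centered = np.clip((scores - (n - 2)) / 2.0, -1.0, 1.0)
    return (sign_approx(centered, rounds) + 1.0) / 2.0


if __name__ == "__main__":
    print(np.round(argmax_approx([0.1, -0.3, 0.7, 0.2]), 3))  # ~ [0, 0, 1, 0]
```

In an FHE setting, the pairwise differences would be produced with ciphertext rotations and the polynomial would be evaluated homomorphically, which is why reducing the number of sign compositions and rotations, as the abstract reports, directly lowers inference latency.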