Confidential outsourced support vector machine learning based on well-separated structure

IF 6.2 · CAS Zone 2 (Computer Science) · JCR Q1, Computer Science, Theory & Methods
Guoqiang Deng, Min Tang, Zengyi Huang, Yuhao Zhang, Yuxing Xi
{"title":"Confidential outsourced support vector machine learning based on well-separated structure","authors":"Guoqiang Deng ,&nbsp;Min Tang ,&nbsp;Zengyi Huang ,&nbsp;Yuhao Zhang ,&nbsp;Yuxing Xi","doi":"10.1016/j.future.2024.107564","DOIUrl":null,"url":null,"abstract":"<div><div>Support Vector Machine (SVM) has revolutionized various domains and achieved remarkable successes. This progress relies on subtle algorithms and more on large training samples. However, the massive data collection introduces security concerns. To facilitate secure integration of data efficiently for building an accurate SVM classifier, we present a non-interactive protocol for privacy-preserving SVM, named <em>NPSVMT</em>. Specifically, we define a new well-separated structure for computing gradients that can decouple the fusion matter between user data and model parameters, allowing data providers to outsource the collaborative learning task to the cloud. As a result, <em>NPSVMT</em> is capable of removing the multiple communications and eliminating the straggler’s effect (waiting for the last), thereby going beyond those developed with interactive methods, e.g., federated learning. To further decrease the data traffic, we introduce a high-efficient coding method to compress and parse training data. In addition, unlike outsourced schemes based on homomorphic encryption or secret sharing, <em>NPSVMT</em> exploits functional encryption to maintain the data confidentiality, achieving dropout-tolerant secure aggregation. The implementations verify that <em>NPSVMT</em> is faster by orders of magnitude than the existing privacy-preserving SVM schemes on benchmark datasets.</div></div>","PeriodicalId":55132,"journal":{"name":"Future Generation Computer Systems-The International Journal of Escience","volume":null,"pages":null},"PeriodicalIF":6.2000,"publicationDate":"2024-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Future Generation Computer Systems-The International Journal of Escience","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167739X24005284","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Citations: 0

Abstract

The Support Vector Machine (SVM) has achieved remarkable success across many domains. This progress relies on sophisticated algorithms and, even more, on large training samples. However, collecting data at this scale raises security concerns. To enable efficient and secure integration of data for building an accurate SVM classifier, we present a non-interactive protocol for privacy-preserving SVM, named NPSVMT. Specifically, we define a new well-separated structure for computing gradients that decouples the fusion of user data and model parameters, allowing data providers to outsource the collaborative learning task to the cloud. As a result, NPSVMT removes the need for multiple rounds of communication and eliminates the straggler effect (waiting for the slowest participant), going beyond schemes built on interactive methods such as federated learning. To further reduce data traffic, we introduce a highly efficient coding method to compress and parse the training data. In addition, unlike outsourced schemes based on homomorphic encryption or secret sharing, NPSVMT uses functional encryption to maintain data confidentiality, achieving dropout-tolerant secure aggregation. Our implementations show that NPSVMT is orders of magnitude faster than existing privacy-preserving SVM schemes on benchmark datasets.
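The abstract leaves the exact form of the well-separated gradient structure to the paper, but the underlying idea of decoupling data-dependent aggregates from the model parameters can be illustrated with a minimal sketch. The example below assumes a least-squares-style SVM loss (not necessarily the loss NPSVMT uses) and plain, unencrypted aggregates; the helpers `local_statistics`, `aggregate`, and `train` are hypothetical names introduced here for illustration. In the actual protocol the uploaded statistics would be protected (e.g., with functional encryption) before the cloud combines them.

```python
# Hypothetical sketch, not the paper's protocol: it only illustrates how a
# gradient can be "well separated" into data-only aggregates and a
# parameter-dependent update, so providers upload once and the cloud iterates
# without further interaction.
import numpy as np

def local_statistics(X, y):
    """Each data provider computes data-only aggregates; no model parameters
    are needed. In the real protocol these would be encrypted before upload."""
    return X.T @ X, X.T @ y          # Gram matrix and correlation vector

def aggregate(stats):
    """Cloud-side (or secure-aggregation) sum of the providers' aggregates."""
    gram = sum(s[0] for s in stats)
    corr = sum(s[1] for s in stats)
    return gram, corr

def train(gram, corr, n, lam=1e-2, lr=0.1, epochs=200):
    """Gradient descent on a least-squares-style objective using only the
    aggregated statistics; per-sample data never appears here."""
    w = np.zeros(gram.shape[0])
    for _ in range(epochs):
        # parameter part and data part meet only through the fixed aggregates
        grad = lam * w + (gram @ w - corr) / n
        w -= lr * grad
    return w

# Toy usage with two data providers.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(50, 5)), rng.normal(size=(60, 5))
w_true = rng.normal(size=5)
y1, y2 = np.sign(X1 @ w_true), np.sign(X2 @ w_true)
stats = [local_statistics(X1, y1), local_statistics(X2, y2)]
gram, corr = aggregate(stats)
w = train(gram, corr, n=110)
print("direction cosine to true separator:",
      float(w @ w_true / (np.linalg.norm(w) * np.linalg.norm(w_true) + 1e-12)))
```

Because the Gram matrix and the correlation vector fully summarize each provider's contribution under this assumed loss, the cloud can run arbitrarily many gradient steps without contacting the providers again, which is what makes such a protocol non-interactive and tolerant of providers dropping out after their single upload.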
Source journal: Future Generation Computer Systems
CiteScore: 19.90
Self-citation rate: 2.70%
Articles per year: 376
Review time: 10.6 months
Journal description: Computing infrastructures and systems are constantly evolving, resulting in increasingly complex and collaborative scientific applications. To cope with these advancements, there is a growing need for collaborative tools that can effectively map, control, and execute these applications. Furthermore, with the explosion of Big Data, there is a requirement for innovative methods and infrastructures to collect, analyze, and derive meaningful insights from the vast amount of data generated. This necessitates the integration of computational and storage capabilities, databases, sensors, and human collaboration. Future Generation Computer Systems aims to pioneer advancements in distributed systems, collaborative environments, high-performance computing, and Big Data analytics. It strives to stay at the forefront of developments in grids, clouds, and the Internet of Things (IoT) to effectively address the challenges posed by these wide-area, fully distributed sensing and computing systems.