Gradient-based federated Bayesian optimization

IF 7.6 | CAS Region 1 (Computer Science) | JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Lin Yang, Junhua Gu, Qiqi Liu, Zhigang Zhao, Yunhe Wang, Yaochu Jin
DOI: 10.1016/j.knosys.2025.114588
Journal: Knowledge-Based Systems, Volume 330, Article 114588
Published: 2025-10-09 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0950705125016272
Citations: 0

Abstract

Bayesian optimization (BO) has evolved from traditional single-agent optimization to multi-agent collaborative optimization, known as federated BO, aiming to solve global optimization tasks such as federated hyperparameter tuning. Existing research on federated BO shares weight vectors sampled from Gaussian processes, approximated using random Fourier features, with a server for information aggregation. This line of approach helps protect the privacy of agents but may limit the performance of the algorithm. Unlike existing federated BO approaches, we propose to cluster each agent according to its characteristics, and to transmit the gradients of acquisition functions between the server and agents for information aggregation. This allows for a more accurate representation of the overall landscape of the global acquisition function without explicitly constructing it. Moreover, we design a two-stage mechanism to infill the next query input based on the aggregated gradients. Specifically, multiple promising solutions are first suggested based on the aggregated gradients. Then, each agent further selects the one with the best local acquisition function value as the newly infilled solution for real function evaluation. The resulting gradient-based federated BO, termed FGBO, has been shown to be highly competitive in tackling a set of benchmark functions and real-world problems in a privacy-preserving way.
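The abstract notes that prior federated BO work shares weight vectors sampled from Gaussian processes approximated with random Fourier features (RFF). The sketch below is not from the paper; it is a minimal illustration of the standard RFF approximation for an RBF kernel, where a GP posterior sample reduces to a finite weight vector `w_sample` that could be communicated instead of raw data. All variable names and hyperparameter values are illustrative assumptions.

```python
import numpy as np

def rff_features(X, W, b):
    # phi(x) = sqrt(2/M) * cos(W x + b) approximates an RBF kernel feature map
    M = W.shape[0]
    return np.sqrt(2.0 / M) * np.cos(X @ W.T + b)

rng = np.random.default_rng(0)
d, M = 2, 100                     # input dimension, number of Fourier features
lengthscale, noise = 1.0, 0.1     # illustrative kernel/noise settings

W = rng.normal(0.0, 1.0 / lengthscale, size=(M, d))  # spectral samples (RBF)
b = rng.uniform(0.0, 2 * np.pi, size=M)

# Local data held by one agent (toy objective).
X = rng.uniform(-1, 1, size=(20, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=20)

# Bayesian linear regression in feature space with prior w ~ N(0, I):
Phi = rff_features(X, W, b)                        # (20, M)
A = Phi.T @ Phi + noise**2 * np.eye(M)
mean_w = np.linalg.solve(A, Phi.T @ y)             # posterior mean of weights
cov_w = noise**2 * np.linalg.inv(A)                # posterior covariance
w_sample = rng.multivariate_normal(mean_w, cov_w)  # one sampled weight vector

# The sampled GP function is f(x) ~= phi(x) @ w_sample; only the finite
# vector w_sample (not the local data) would need to be shared.
f = lambda Xq: rff_features(Xq, W, b) @ w_sample
```

The key point for privacy is that `w_sample` is a fixed-length summary of the agent's posterior sample, so the server never sees `X` or `y`.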
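The two-stage infill mechanism described above can be pictured with a toy example. The following is not the authors' algorithm, only a hedged sketch of the general idea: each agent holds a hypothetical local acquisition function (here simple quadratics), the server aggregates their gradients to propose several candidates by gradient ascent, and each agent then keeps the candidate with its best local acquisition value. The acquisition functions, step size, and iteration counts are all illustrative assumptions.

```python
import numpy as np

# Hypothetical local acquisition functions a_i(x) = -||x - c_i||^2,
# one per agent, standing in for real acquisition functions.
centers = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([0.5, 0.5])]

def local_acq(x, c):
    return -np.sum((x - c) ** 2)

def local_grad(x, c):
    return -2.0 * (x - c)  # gradient of the local acquisition function

# Stage 1: the server aggregates agents' gradients and runs gradient ascent
# from several random starts, yielding multiple candidate solutions without
# ever constructing the global acquisition function explicitly.
def aggregate_grad(x):
    return np.mean([local_grad(x, c) for c in centers], axis=0)

rng = np.random.default_rng(1)
candidates = []
for _ in range(5):
    x = rng.uniform(-1.0, 2.0, size=2)
    for _ in range(200):              # plain gradient ascent, fixed step size
        x = x + 0.05 * aggregate_grad(x)
    candidates.append(x)

# Stage 2: each agent selects the candidate with its best local acquisition
# value as its newly infilled solution for real function evaluation.
agent_choices = [max(candidates, key=lambda x, c=c: local_acq(x, c))
                 for c in centers]
```

With quadratic locals the aggregated gradient field has a single maximizer (the mean of the centers), so all candidates converge there; with realistic multimodal acquisition functions the multiple starts would produce distinct candidates for Stage 2 to discriminate between.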
Source journal: Knowledge-Based Systems
Category: Engineering & Technology – Computer Science: Artificial Intelligence
CiteScore: 14.80
Self-citation rate: 12.50%
Articles per year: 1245
Review time: 7.8 months
Journal description: Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on knowledge-based and other artificial intelligence techniques-based systems. The journal aims to support human prediction and decision-making through data science and computation techniques, provide a balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.