Gradient-based federated Bayesian optimization
Lin Yang, Junhua Gu, Qiqi Liu, Zhigang Zhao, Yunhe Wang, Yaochu Jin
Knowledge-Based Systems, Volume 330, Article 114588, published 2025-10-09. DOI: 10.1016/j.knosys.2025.114588
Bayesian optimization (BO) has evolved from traditional single-agent optimization to multi-agent collaborative optimization, known as federated BO, which aims to solve global optimization tasks such as federated hyperparameter tuning. In existing research on federated BO, agents share with a server weight vectors sampled from Gaussian processes approximated by random Fourier features for information aggregation. This line of approaches helps protect the privacy of agents but may limit the performance of the algorithm. Unlike existing federated BO approaches, we propose to cluster the agents according to their characteristics and to transmit the gradients of acquisition functions between the server and the agents for information aggregation. This allows the overall landscape of the global acquisition function to be represented more accurately without explicitly constructing it. Moreover, we design a two-stage mechanism to infill the next query input based on the aggregated gradients. Specifically, multiple promising solutions are first suggested based on the aggregated gradients; then, each agent selects the one with the best local acquisition function value as the newly infilled solution for real function evaluation. The resulting gradient-based federated BO, termed FGBO, is shown to be highly competitive in tackling a set of benchmark functions and real-world problems in a privacy-preserving way.
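A minimal sketch of the gradient-aggregation and two-stage infill idea described in the abstract, assuming toy quadratic local acquisition functions in place of real GP-based ones; the names (local_acq, local_grad, aggregate_gradients, propose_candidates) are illustrative assumptions, not the paper's actual API or algorithmic details.

import numpy as np

rng = np.random.default_rng(0)

# Toy local acquisition functions, one per agent (stand-ins for GP-based acquisitions).
centers = [np.array([0.2, -0.5]), np.array([0.3, -0.4]), np.array([0.1, -0.6])]

def local_acq(agent, x):
    # Higher is better for this toy acquisition.
    return -np.sum((x - centers[agent]) ** 2)

def local_grad(agent, x):
    # Gradient of the toy local acquisition with respect to x.
    return -2.0 * (x - centers[agent])

def aggregate_gradients(x, agents):
    # Server-side step: average the acquisition gradients reported by the agents,
    # approximating the gradient of a global acquisition without constructing it.
    return np.mean([local_grad(a, x) for a in agents], axis=0)

def propose_candidates(agents, n_starts=5, steps=50, lr=0.1):
    # Stage 1: gradient ascent from several random starts using the aggregated
    # gradients, yielding multiple promising candidate solutions.
    candidates = []
    for _ in range(n_starts):
        x = rng.uniform(-1.0, 1.0, size=2)
        for _ in range(steps):
            x = x + lr * aggregate_gradients(x, agents)
        candidates.append(x)
    return candidates

agents = [0, 1, 2]
candidates = propose_candidates(agents)
# Stage 2: each agent selects the candidate with the best local acquisition value
# as its next query point for real function evaluation.
for a in agents:
    best = max(candidates, key=lambda x: local_acq(a, x))
    print(f"agent {a} queries {best}")

In this sketch only gradients (and candidate points) move between server and agents, which mirrors how the approach avoids sharing local surrogate models directly; clustering of agents by their characteristics is omitted for brevity.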
Journal description:
Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on systems based on knowledge-based and other artificial intelligence techniques. The journal aims to support human prediction and decision-making through data science and computation techniques, to provide balanced coverage of theory and practical study, and to encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.