Nesterov Accelerated Gradient Tracking With Adam for Distributed Online Optimization
Yanxu Su, Qingyang Sheng, Xiasheng Shi, Chaoxu Mu, Changyin Sun
IEEE Transactions on Neural Networks and Learning Systems, published 2025-09-03. DOI: https://doi.org/10.1109/tnnls.2025.3604059
Abstract
This article presents an accelerated distributed optimization algorithm for online optimization problems over large-scale networks. Each iteration of the proposed algorithm relies only on local computation and communication. To adapt to dynamic changes and achieve a fast convergence rate while maintaining good convergence performance, we design a new algorithm called NGTAdam, which combines the Nesterov acceleration technique with an adaptive moment estimation method. The convergence of NGTAdam is analyzed by bounding its dynamic regret via a linear system of inequalities. For online convex optimization problems, we provide an upper bound on the dynamic regret of NGTAdam that depends on the initial conditions and the time-varying nature of the optimization problem. Moreover, we show that if the time-varying part of this upper bound grows sublinearly in time, the dynamic regret is also sublinear. Through a variety of numerical experiments, we demonstrate that NGTAdam outperforms state-of-the-art distributed online optimization algorithms.
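The abstract does not give the exact NGTAdam recursion, but the ingredients it names (gradient tracking over a network, Nesterov-style acceleration, and Adam-style adaptive moments) can be illustrated with a minimal single-agent sketch. Everything below is an assumption for illustration only: the function name, variable names, update order, and hyperparameters are not taken from the paper.

```python
# Illustrative sketch (NOT the authors' exact NGTAdam update): one agent's
# iteration combining gradient tracking, Adam-style adaptive moments, and a
# Nesterov-style extrapolation, as such methods are commonly written.
import numpy as np

def ngt_adam_step(x, y, m, v, grad_new, grad_old, neighbor_x, neighbor_y,
                  weights, t, alpha=0.01, gamma=0.9, beta1=0.9, beta2=0.999,
                  eps=1e-8):
    """One local iteration for a single agent.

    x, y       : local estimate and gradient-tracking variable
    m, v       : Adam first/second moment estimates
    grad_new   : gradient of the current local cost at the current x
    grad_old   : gradient of the previous local cost at the previous x
    neighbor_x : neighbors' x values (own included), ordered like `weights`
    neighbor_y : neighbors' y values (own included), ordered like `weights`
    weights    : one row of a doubly stochastic mixing matrix
    t          : iteration counter (starting at 1), used for bias correction
    """
    # Gradient tracking: consensus on y plus the local gradient increment,
    # so y tracks the network-average gradient over time.
    y_new = sum(w * yj for w, yj in zip(weights, neighbor_y)) + grad_new - grad_old

    # Adam-style adaptive moments built on the tracked gradient.
    m_new = beta1 * m + (1 - beta1) * y_new
    v_new = beta2 * v + (1 - beta2) * y_new ** 2
    m_hat = m_new / (1 - beta1 ** t)
    v_hat = v_new / (1 - beta2 ** t)

    # Consensus on x followed by an adaptive descent step.
    x_cons = sum(w * xj for w, xj in zip(weights, neighbor_x))
    x_half = x_cons - alpha * m_hat / (np.sqrt(v_hat) + eps)

    # Nesterov-style extrapolation between consecutive iterates.
    x_new = x_half + gamma * (x_half - x)

    return x_new, y_new, m_new, v_new
```

In a full distributed implementation, each agent would exchange its x and y with its neighbors every round, with the consensus weights drawn from a doubly stochastic matrix matched to the communication graph; the sketch above only shows the local arithmetic of such a step.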
Journal Introduction:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.