{"title":"Bayesian Stochastic Gradient Descent for Stochastic Optimization with Streaming Input Data","authors":"Tianyi Liu, Yifan Lin, Enlu Zhou","doi":"10.1137/22m1478951","DOIUrl":null,"url":null,"abstract":"SIAM Journal on Optimization, Volume 34, Issue 1, Page 389-418, March 2024. <br/> Abstract. We consider stochastic optimization under distributional uncertainty, where the unknown distributional parameter is estimated from streaming data that arrive sequentially over time. Moreover, data may depend on the decision at the time when they are generated. For both decision-independent and decision-dependent uncertainties, we propose an approach to jointly estimate the distributional parameter via Bayesian posterior distribution and update the decision by applying stochastic gradient descent (SGD) on the Bayesian average of the objective function. Our approach converges asymptotically over time and achieves the convergence rates of classical SGD in the decision-independent case. We demonstrate the empirical performance of our approach on both synthetic test problems and a classical newsvendor problem.","PeriodicalId":49529,"journal":{"name":"SIAM Journal on Optimization","volume":"214 1","pages":""},"PeriodicalIF":2.6000,"publicationDate":"2024-01-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM Journal on Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1137/22m1478951","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
We consider stochastic optimization under distributional uncertainty, where the unknown distributional parameter is estimated from streaming data that arrive sequentially over time. Moreover, the data may depend on the decision at the time they are generated. For both decision-independent and decision-dependent uncertainties, we propose an approach that jointly estimates the distributional parameter via the Bayesian posterior distribution and updates the decision by applying stochastic gradient descent (SGD) to the Bayesian average of the objective function. Our approach converges asymptotically over time and achieves the convergence rates of classical SGD in the decision-independent case. We demonstrate the empirical performance of our approach on both synthetic test problems and a classical newsvendor problem.
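To make the estimate-then-optimize loop concrete, the following is a minimal sketch of the idea in the decision-independent case, not the paper's exact algorithm: it assumes a newsvendor instance with exponential demand whose rate is unknown, a conjugate Gamma prior, and hypothetical cost values. Each iteration updates the posterior with the newly arrived observation, then takes an SGD step on a one-sample Monte Carlo estimate of the posterior-averaged cost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (illustrative, not from the paper): newsvendor with
# demand D ~ Exp(rate=theta), theta unknown, conjugate Gamma(alpha, beta) prior.
h, b = 1.0, 4.0          # assumed holding and backorder costs
theta_true = 0.1         # unknown rate generating the data stream
alpha, beta = 1.0, 1.0   # Gamma prior hyperparameters

x = 5.0                  # initial order quantity (the decision variable)
for t in range(1, 5001):
    # A new demand observation arrives; in the decision-independent case
    # it does not depend on the current decision x.
    d_obs = rng.exponential(1.0 / theta_true)

    # Bayesian update: the Gamma prior is conjugate to the exponential likelihood.
    alpha += 1.0
    beta += d_obs

    # Stochastic gradient of the Bayesian-averaged objective: sample theta
    # from the posterior, sample demand from the model at that theta, and
    # take a subgradient of the newsvendor cost h*(x-D)+ + b*(D-x)+ at x.
    theta_s = rng.gamma(alpha, 1.0 / beta)
    d_s = rng.exponential(1.0 / theta_s)
    grad = h if x > d_s else -b

    x = max(x - (1.0 / t) * grad, 0.0)  # projected SGD step with 1/t step size

print(f"final order quantity: {x:.3f}")
```

Under these assumed values the classical newsvendor solution is the b/(b+h) = 0.8 quantile of the demand distribution, roughly ln(5)/0.1 ≈ 16.1, which the iterates approach as the posterior concentrates around the true rate.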
Journal description:
The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.