Cyclic stochastic optimization via arbitrary selection procedures for updating parameters

Karla Hernandez
DOI: 10.1109/CISS.2016.7460527
Published in: 2016 Annual Conference on Information Science and Systems (CISS), 2016-03-16
Citations: 4

Abstract

Many of the algorithms that exist to tackle optimization problems (either stochastic or deterministic) are iterative in nature. Methods where only a subset of the parameter vector is updated each time have been frequently used in practice. While some of these methods update different sets of parameters according to some predetermined pattern, others select the parameters to update according to a random variable. Much work exists on the convergence of such procedures in the case of deterministic optimization. However, very little is known about their convergence when applied to general stochastic optimization problems; this is the setting this work focuses on. We describe the generalized cyclic seesaw algorithm (a general method for selecting which parameters to update during each iteration) and give sufficient conditions for its convergence.
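The class of methods the abstract describes can be sketched as a block-coordinate stochastic gradient scheme in which a caller-supplied selection procedure decides which coordinates are updated at each iteration. The sketch below is illustrative only, on a toy quadratic loss with artificially noisy gradients; the function names (`cyclic_sgd`, `noisy_grad`), the step-size schedule, and the specific selection rules are assumptions of this example, not the paper's generalized cyclic seesaw algorithm or its convergence conditions.

```python
import numpy as np

TARGET = np.array([1.0, -2.0, 0.5, 3.0])  # minimizer of the toy loss

def noisy_grad(theta, rng):
    # Noisy gradient of f(theta) = 0.5 * ||theta - TARGET||^2
    return (theta - TARGET) + rng.normal(scale=0.1, size=theta.size)

def cyclic_sgd(theta0, select, n_iter=20000, seed=0):
    """Block-update SGD: select(k) returns the coordinate indices
    to update at iteration k (a deterministic pattern, a random
    choice, or anything in between)."""
    rng = np.random.default_rng(seed)
    theta = theta0.astype(float).copy()
    for k in range(n_iter):
        a_k = 2.0 / (k + 10)        # decaying step size
        idx = select(k)             # subset of coordinates to update
        g = noisy_grad(theta, rng)
        theta[idx] -= a_k * g[idx]  # update only the selected block
    return theta

theta0 = np.zeros(4)

# Deterministic cyclic pattern: alternate between two coordinate blocks.
cyclic = cyclic_sgd(theta0, lambda k: [0, 1] if k % 2 == 0 else [2, 3])

# Random selection: each iteration updates one uniformly chosen coordinate.
rng_sel = np.random.default_rng(1)
random_pick = cyclic_sgd(theta0, lambda k: [rng_sel.integers(4)])
```

Both selection rules drive the iterate toward the same minimizer here; the point of the paper's general setting is characterizing when such convergence holds for arbitrary (not just cyclic or uniform) selection procedures under stochastic gradients.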