Cyclic stochastic optimization via arbitrary selection procedures for updating parameters

Karla Hernandez
2016 Annual Conference on Information Science and Systems (CISS), March 16, 2016. DOI: 10.1109/CISS.2016.7460527
Many algorithms for solving optimization problems (stochastic or deterministic) are iterative in nature. Methods that update only a subset of the parameter vector at each iteration are frequently used in practice. While some of these methods update different sets of parameters according to a predetermined pattern, others select the parameters to update according to a random variable. Much work exists on the convergence of such procedures in the case of deterministic optimization. However, very little is known about their convergence when applied to general stochastic optimization problems; this is the setting on which this work focuses. We describe the generalized cyclic seesaw algorithm, a general method for selecting which parameters to update during each iteration, and give sufficient conditions for its convergence.
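To make the idea concrete, here is a minimal, hypothetical sketch of blockwise stochastic gradient descent in the spirit described by the abstract: at each iteration only one block of the parameter vector is updated, and the block may be chosen either by a fixed cyclic pattern or at random. The function names, the toy problem, and all tuning constants are illustrative assumptions, not the paper's actual algorithm or conditions.

```python
import numpy as np

def blockwise_sgd(grad, theta0, blocks, n_iter=2000, step=0.05,
                  selection="cyclic", seed=None):
    """Stochastic gradient descent that updates one parameter block per step.

    grad(theta, rng) must return a noisy estimate of the full gradient;
    `blocks` is a list of index arrays partitioning the parameter vector.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    for k in range(n_iter):
        if selection == "cyclic":
            # Predetermined pattern: sweep through the blocks in order.
            block = blocks[k % len(blocks)]
        else:
            # Random selection: pick the block via a random variable.
            block = blocks[rng.integers(len(blocks))]
        g = grad(theta, rng)
        theta[block] -= step * g[block]  # update only the chosen subset
    return theta

# Toy stochastic problem: minimize E[||theta - X||^2] / 2 with
# X ~ N(target, I); a noisy gradient sample is theta - X.
target = np.array([1.0, -2.0, 3.0, 0.5])

def noisy_grad(theta, rng):
    x = target + rng.normal(size=theta.shape)
    return theta - x

blocks = [np.array([0, 1]), np.array([2, 3])]  # two parameter blocks
est = blockwise_sgd(noisy_grad, np.zeros(4), blocks, seed=0)
```

With a small constant step size the iterates settle in a neighborhood of the minimizer; switching `selection` between `"cyclic"` and `"random"` toggles between the two update-selection schemes the abstract contrasts.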