Adaptive noisy importance sampling for stochastic optimization
Ö. D. Akyildiz, I. P. Mariño, J. Míguez
2017 IEEE 7th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), December 2017
DOI: 10.1109/CAMSAP.2017.8313215
Citations: 5
Abstract
In this work, we introduce an adaptive noisy importance sampler (ANIS) for optimization in an online setting. ANIS is an extension of the family of adaptive importance samplers where the weights are only approximate as they are computed via subsampling of the available data. Allowing errors in the weights enables us to use the algorithm in the so-called large-scale optimization setting, where the cost function consists of the sum of many component functions. ANIS can be used to optimize general cost functions as it does not need any gradient information to update the parameters. We show how the weights of ANIS are related to those of adaptive importance samplers and present some computer simulation results.
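The abstract describes the core mechanism: candidates are drawn from an adaptive proposal, their importance weights are computed from a *subsampled* (hence noisy) evaluation of a cost that is a sum of many component terms, and the proposal is adapted without any gradient information. The sketch below illustrates that scheme on a toy problem; it is an assumption-laden illustration of the general idea, not the authors' exact ANIS algorithm (the problem, the exponential weighting rule, and the moment-matching proposal update are all choices made here for concreteness).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy large-scale problem (illustrative only, not from the paper):
# cost(theta) = mean_i (theta - x_i)^2, minimized at theta = mean(x).
x = rng.normal(loc=3.0, scale=1.0, size=10_000)

def noisy_cost(theta, batch_size=100):
    """Subsampled cost estimate: only a mini-batch of the component
    functions is evaluated, so the resulting weight is noisy."""
    batch = rng.choice(x, size=batch_size, replace=False)
    return np.mean((theta - batch) ** 2)

def anis_like_sketch(n_iters=50, n_samples=200, lam=1.0):
    """Hedged sketch of an adaptive noisy importance sampler:
    draw candidates from a Gaussian proposal, weight each by
    exp(-lam * noisy cost), and refit the proposal to the weighted
    samples.  Gradient-free throughout."""
    mu, sigma = 0.0, 5.0  # initial proposal parameters (assumed)
    for _ in range(n_iters):
        thetas = rng.normal(mu, sigma, size=n_samples)
        log_w = np.array([-lam * noisy_cost(t) for t in thetas])
        w = np.exp(log_w - log_w.max())  # numerically stabilized weights
        w /= w.sum()
        # Adapt the proposal via weighted moments (one common AIS rule).
        mu = float(np.sum(w * thetas))
        sigma = max(float(np.sqrt(np.sum(w * (thetas - mu) ** 2))), 1e-3)
    return mu

print(anis_like_sketch())  # concentrates near the minimizer, theta = 3
```

Because the weights come from mini-batches, each iteration tolerates weight errors, which is exactly what lets such a sampler scale to cost functions with many component terms.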