Optimization in the presence of uncertainty

J. Buhmann, Matúš Mihalák, Rastislav Šrámek, P. Widmayer
{"title":"Optimization in the presence of uncertainty","authors":"J. Buhmann, Matúš Mihalák, Rastislav Å rámek, P. Widmayer","doi":"10.3929/ETHZ-A-007315774","DOIUrl":null,"url":null,"abstract":"We study optimization in the presence of uncertainty such as noise in measurements, and advocate a novel approach to deal with it. The main di erence to any existing approach is that we do not assume any knowledge about the nature of the uncertainty (such as for instance a probability distribution). Instead, we are given several instances of the same optimization problem as input, and, assuming they are typical w.r.t. the uncertainty, we make use of it in order to compute a solution that is good for the sample instances as well as for future (unknown) expected instances. We demonstrate our approach for the case of two typical input instances. We rst propose a measure of similarity of instances with respect to an objective. This concept allows us to assess whether instances are indeed typical . Based on this concept, we then choose a solution randomly among all solutions that are near-optimum for both instances. We show that the exact notion of near-optimum is intertwined with the proposed measure of similarity. Furthermore, we will show that our measure of similarity also allows us to make formal statements about the expected quality of the computed solution: If the given instances are not similar, or are too noisy, our approach will detect this. We demonstrate for a few optimization problems and real world data that our approach not only works well in theory, but also in practice. ? 
This work was supported by the Swiss National Science Foundation (SNF) under the grant 200021_138117/1.","PeriodicalId":10841,"journal":{"name":"CTIT technical reports series","volume":"87 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"CTIT technical reports series","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3929/ETHZ-A-007315774","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

We study optimization in the presence of uncertainty, such as noise in measurements, and advocate a novel approach to deal with it. The main difference to any existing approach is that we do not assume any knowledge about the nature of the uncertainty (such as, for instance, a probability distribution). Instead, we are given several instances of the same optimization problem as input, and, assuming they are typical w.r.t. the uncertainty, we make use of this to compute a solution that is good for the sample instances as well as for future (unknown) expected instances. We demonstrate our approach for the case of two typical input instances. We first propose a measure of similarity of instances with respect to an objective. This concept allows us to assess whether instances are indeed typical. Based on this concept, we then choose a solution randomly among all solutions that are near-optimum for both instances. We show that the exact notion of near-optimum is intertwined with the proposed measure of similarity. Furthermore, we show that our measure of similarity also allows us to make formal statements about the expected quality of the computed solution: if the given instances are not similar, or are too noisy, our approach will detect this. We demonstrate for a few optimization problems and real-world data that our approach works well not only in theory, but also in practice. This work was supported by the Swiss National Science Foundation (SNF) under grant 200021_138117/1.
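The core idea of the abstract — pick a solution uniformly at random among all solutions that are near-optimum for both given instances — can be sketched for small, enumerable solution spaces. This is an illustrative sketch, not the paper's algorithm: the function name, the `(1 + eps)`-style notion of "near-optimum", and the cost-function interface are assumptions for the example; the paper derives the appropriate approximation threshold from its similarity measure rather than taking a fixed `eps`.

```python
import random

def joint_near_optimal_sample(solutions, cost1, cost2, eps, rng=random):
    """Sample uniformly among solutions near-optimal for BOTH instances.

    solutions: iterable of candidate solutions (small enough to enumerate)
    cost1, cost2: cost functions of the two input instances (minimization)
    eps: slack defining "near-optimum" as cost <= (1 + eps) * optimum
    """
    sols = list(solutions)
    opt1 = min(cost1(s) for s in sols)  # optimum of instance 1
    opt2 = min(cost2(s) for s in sols)  # optimum of instance 2
    # Keep only solutions that are near-optimal on both instances.
    joint = [s for s in sols
             if cost1(s) <= (1 + eps) * opt1
             and cost2(s) <= (1 + eps) * opt2]
    if not joint:
        # Mirrors the paper's diagnostic: if no solution is good for both,
        # the instances are too dissimilar (or too noisy) for this slack.
        raise ValueError("no jointly near-optimal solution for this eps")
    return rng.choice(joint)
```

For example, with two noisy cost assignments over three candidate solutions, the intersection of the two near-optimal sets can be a strict subset of either one, and the sampled solution hedges against the noise in both measurements.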