Optimization in the presence of uncertainty
J. Buhmann, Matúš Mihalák, Rastislav Šrámek, P. Widmayer
CTIT technical reports series, published 2012-01-01. DOI: 10.3929/ETHZ-A-007315774
Citations: 6
Abstract
We study optimization in the presence of uncertainty, such as noise in measurements, and advocate a novel approach to deal with it. The main difference to any existing approach is that we do not assume any knowledge about the nature of the uncertainty (such as, for instance, a probability distribution). Instead, we are given several instances of the same optimization problem as input, and, assuming they are typical w.r.t. the uncertainty, we make use of them in order to compute a solution that is good for the sample instances as well as for future (unknown) expected instances. We demonstrate our approach for the case of two typical input instances. We first propose a measure of similarity of instances with respect to an objective. This concept allows us to assess whether instances are indeed typical. Based on this concept, we then choose a solution randomly among all solutions that are near-optimum for both instances. We show that the exact notion of near-optimum is intertwined with the proposed measure of similarity. Furthermore, we show that our measure of similarity also allows us to make formal statements about the expected quality of the computed solution: if the given instances are not similar, or are too noisy, our approach will detect this. We demonstrate for a few optimization problems and real-world data that our approach works well not only in theory, but also in practice.

This work was supported by the Swiss National Science Foundation (SNF) under the grant 200021_138117/1.
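The core selection step described in the abstract, choosing a solution uniformly at random among all solutions that are near-optimum for both given instances, can be sketched as follows. This is an illustrative reading, not the paper's actual implementation: the function names, the multiplicative `(1 + eps)` notion of "near-optimum", and the explicit enumeration of candidate solutions are all assumptions made for the sake of a small, runnable example.

```python
import random

def joint_near_optima(solutions, cost1, cost2, eps):
    """Solutions within a (1 + eps) factor of the optimum for BOTH instances.

    `solutions` is a finite list of candidate solutions; `cost1` and `cost2`
    give a solution's objective value under instance 1 and instance 2
    (minimization, positive costs). All names here are illustrative.
    """
    best1 = min(cost1(s) for s in solutions)
    best2 = min(cost2(s) for s in solutions)
    return [s for s in solutions
            if cost1(s) <= (1 + eps) * best1
            and cost2(s) <= (1 + eps) * best2]

def pick_solution(solutions, cost1, cost2, eps, rng=random):
    """Sample uniformly among the jointly near-optimal solutions.

    If no solution is near-optimal for both instances at this eps, the
    instances are too dissimilar (or too noisy) for this tolerance --
    mirroring the abstract's claim that such cases are detectable.
    """
    candidates = joint_near_optima(solutions, cost1, cost2, eps)
    if not candidates:
        raise ValueError("no jointly near-optimal solution at this eps")
    return rng.choice(candidates)
```

In the paper, the appropriate value of the near-optimality tolerance is tied to the proposed similarity measure between the two instances; the sketch above simply takes it as a parameter.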