{"title":"深度学习凸向量优化问题的有效前沿","authors":"Zachary Feinstein, Birgit Rudloff","doi":"10.1007/s10898-024-01408-x","DOIUrl":null,"url":null,"abstract":"<p>In this paper, we design a neural network architecture to approximate the weakly efficient frontier of convex vector optimization problems (CVOP) satisfying Slater’s condition. The proposed machine learning methodology provides both an inner and outer approximation of the weakly efficient frontier, as well as an upper bound to the error at each approximated efficient point. In numerical case studies we demonstrate that the proposed algorithm is effectively able to approximate the true weakly efficient frontier of CVOPs. This remains true even for large problems (i.e., many objectives, variables, and constraints) and thus overcoming the curse of dimensionality.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":"19 1","pages":""},"PeriodicalIF":1.8000,"publicationDate":"2024-05-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deep learning the efficient frontier of convex vector optimization problems\",\"authors\":\"Zachary Feinstein, Birgit Rudloff\",\"doi\":\"10.1007/s10898-024-01408-x\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>In this paper, we design a neural network architecture to approximate the weakly efficient frontier of convex vector optimization problems (CVOP) satisfying Slater’s condition. The proposed machine learning methodology provides both an inner and outer approximation of the weakly efficient frontier, as well as an upper bound to the error at each approximated efficient point. In numerical case studies we demonstrate that the proposed algorithm is effectively able to approximate the true weakly efficient frontier of CVOPs. This remains true even for large problems (i.e., many objectives, variables, and constraints) and thus overcoming the curse of dimensionality.</p>\",\"PeriodicalId\":15961,\"journal\":{\"name\":\"Journal of Global Optimization\",\"volume\":\"19 1\",\"pages\":\"\"},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2024-05-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Global Optimization\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10898-024-01408-x\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Mathematics\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Global Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10898-024-01408-x","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Mathematics","Score":null,"Total":0}
Deep learning the efficient frontier of convex vector optimization problems
In this paper, we design a neural network architecture to approximate the weakly efficient frontier of convex vector optimization problems (CVOPs) satisfying Slater's condition. The proposed machine learning methodology provides both an inner and an outer approximation of the weakly efficient frontier, as well as an upper bound on the error at each approximated efficient point. In numerical case studies we demonstrate that the proposed algorithm effectively approximates the true weakly efficient frontier of CVOPs. This remains true even for large problems (i.e., with many objectives, variables, and constraints), thus overcoming the curse of dimensionality.
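To make the setting concrete: a CVOP asks to minimize a vector-valued convex objective f: R^n -> R^q over a convex feasible set, and a feasible point is weakly efficient if no other feasible point strictly improves every objective simultaneously. The following is a minimal, illustrative Python (PyTorch) sketch of the general idea of learning such a frontier with a neural network; it uses a simple weighted-sum scalarization on a toy bi-objective problem and is NOT the authors' architecture (which additionally yields inner and outer approximations and per-point error bounds). All problem data, network sizes, and training settings below are arbitrary choices for illustration.

# Illustrative sketch (assumed setup, not the paper's method): train a small
# network mapping scalarization weights w on the simplex to approximately
# weakly efficient points of the toy CVOP
#   minimize (f1(x), f2(x)) = (||x - a||^2, ||x - b||^2) over x in R^2,
# via the weighted-sum scalarization w1*f1(x) + w2*f2(x).
import torch
import torch.nn as nn

torch.manual_seed(0)
a = torch.tensor([0.0, 0.0])           # anchor of objective f1 (toy data)
b = torch.tensor([1.0, 1.0])           # anchor of objective f2 (toy data)

def objectives(x):
    """Stack the two convex objectives f1, f2 for a batch of decisions x."""
    f1 = ((x - a) ** 2).sum(dim=1)
    f2 = ((x - b) ** 2).sum(dim=1)
    return torch.stack([f1, f2], dim=1)

# Network mapping a weight vector w on the 1-simplex to a decision x(w).
net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    w1 = torch.rand(256, 1)            # sample weights uniformly on the simplex
    w = torch.cat([w1, 1.0 - w1], dim=1)
    x = net(w)                         # candidate decision for each weight
    loss = (w * objectives(x)).sum(dim=1).mean()   # weighted-sum scalarization
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, net(w) traces an approximation of the weakly efficient
# frontier as w ranges over the simplex; for this toy problem the exact
# scalarized minimizers lie on the segment between a and b.
with torch.no_grad():
    grid = torch.linspace(0.0, 1.0, 11)
    w_grid = torch.stack([grid, 1.0 - grid], dim=1)
    print(objectives(net(w_grid)))

For this toy problem the exact weighted-sum minimizer is x(w) = w1*a + w2*b, so the learned frontier can be checked against the closed-form solution; for general CVOPs such a closed form is unavailable, which is where a learned parametrization of the frontier becomes useful.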
About the journal:
The Journal of Global Optimization publishes carefully refereed papers that encompass theoretical, computational, and applied aspects of global optimization. While the focus is on original research contributions dealing with the search for global optima of non-convex, multi-extremal problems, the journal’s scope covers optimization in the widest sense, including nonlinear, mixed integer, combinatorial, stochastic, robust, multi-objective optimization, computational geometry, and equilibrium problems. Relevant works on data-driven methods and optimization-based data mining are of special interest.
In addition to papers covering theory and algorithms of global optimization, the journal publishes significant papers on numerical experiments, new testbeds, and applications in engineering, management, and the sciences. Applications of particular interest include healthcare, computational biochemistry, energy systems, telecommunications, and finance. Apart from full-length articles, the journal features short communications on both open and solved global optimization problems. It also offers reviews of relevant books and publishes special issues.