{"title":"多目标优化的增量下降法","authors":"I. F. D. Oliveira, R. Takahashi","doi":"10.1080/10556788.2022.2124989","DOIUrl":null,"url":null,"abstract":"ABSTRACT Multi-objective steepest descent, under the assumption of lower-bounded objective functions with L-Lipschitz continuous gradients, requires gradient and function computations to produce a measure of proximity to critical conditions akin to in the single-objective setting, where m is the number of objectives considered. We reduce this to with a multi-objective incremental approach that has a computational cost that does not grow with the number of objective functions m.","PeriodicalId":124811,"journal":{"name":"Optimization Methods and Software","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"An incremental descent method for multi-objective optimization\",\"authors\":\"I. F. D. Oliveira, R. Takahashi\",\"doi\":\"10.1080/10556788.2022.2124989\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"ABSTRACT Multi-objective steepest descent, under the assumption of lower-bounded objective functions with L-Lipschitz continuous gradients, requires gradient and function computations to produce a measure of proximity to critical conditions akin to in the single-objective setting, where m is the number of objectives considered. We reduce this to with a multi-objective incremental approach that has a computational cost that does not grow with the number of objective functions m.\",\"PeriodicalId\":124811,\"journal\":{\"name\":\"Optimization Methods and Software\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-05-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Optimization Methods and Software\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/10556788.2022.2124989\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization Methods and Software","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/10556788.2022.2124989","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An incremental descent method for multi-objective optimization
ABSTRACT Multi-objective steepest descent, under the assumption of lower-bounded objective functions with L-Lipschitz continuous gradients, requires $\mathcal{O}(m\epsilon^{-2})$ gradient and function computations to produce a measure of proximity to critical conditions akin to $\|\nabla f(x)\| \leq \epsilon$ in the single-objective setting, where m is the number of objectives considered. We reduce this to $\mathcal{O}(\epsilon^{-2})$ with a multi-objective incremental approach whose computational cost does not grow with the number of objective functions m.
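The abstract does not spell out the paper's update rule, but the incremental idea it describes can be illustrated with a minimal sketch: instead of computing all m gradients per iteration (as multi-objective steepest descent does), each iteration uses the gradient of a single objective, cycled in order, so the per-iteration cost is independent of m. The function name `incremental_descent`, the cyclic objective order, and the diminishing step size below are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def incremental_descent(gradients, x0, step=0.1, iters=1000):
    """Hypothetical sketch of incremental multi-objective descent.

    Each iteration evaluates the gradient of ONE objective (cycled
    in round-robin order), so per-iteration cost does not grow with
    the number of objectives m. A diminishing step size is used here;
    the paper's actual step-size rule may differ.
    """
    x = np.asarray(x0, dtype=float)
    m = len(gradients)
    for k in range(iters):
        g = gradients[k % m](x)               # gradient of a single objective
        x = x - (step / np.sqrt(k + 1)) * g   # diminishing step size
    return x

# Example: two convex quadratics with minimizers at 0 and 1; the
# iterates drift toward the Pareto set, the segment between them.
f1 = lambda x: 0.5 * np.dot(x, x)
f2 = lambda x: 0.5 * np.dot(x - 1.0, x - 1.0)
g1 = lambda x: x
g2 = lambda x: x - 1.0

x_final = incremental_descent([g1, g2], x0=np.array([3.0, -2.0]))
print(x_final, f1(x_final), f2(x_final))
```

In this toy run the iterates settle inside [0, 1] in each coordinate, where no single step can decrease both objectives at once, which is the multi-objective analogue of the small-gradient criticality measure mentioned in the abstract.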