Optimization Methods & Software: Latest Articles

A gradient descent akin method for constrained optimization: algorithms and applications
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2024-01-16, DOI: 10.1080/10556788.2023.2285450
Long Chen, Kai-Uwe Bletzinger, Nicolas R. Gauger, Yinyu Ye
Abstract: We present a 'gradient descent akin' method (GDAM) for constrained optimization problems, i.e. the search direction is computed using a linear combination of the negative and normalized objective an...
Citations: 0
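The abstract is cut off before the direction formula, so the exact GDAM update is not reproduced here. Purely as an illustration of a "gradient-descent-like" constrained step, the Python sketch below mixes the normalized negative objective gradient with a normalized pull back towards feasibility; the mixing weight `gamma`, the step size, and the toy problem are all assumptions, not the published algorithm.

```python
import numpy as np

def gd_akin_step(x, grad_f, grad_c, c, gamma=0.4, step=1e-2):
    """Illustrative 'gradient-descent-like' step for min f(x) s.t. c(x) <= 0.

    The direction mixes the normalized negative objective gradient with a
    normalized push back towards feasibility whenever the constraint is
    violated. This is only a sketch inspired by the title/abstract, not the
    published GDAM update.
    """
    gf = grad_f(x)
    d = -gamma * gf / (np.linalg.norm(gf) + 1e-12)
    if c(x) > 0:                        # constraint violated: add a feasibility pull
        gc = grad_c(x)
        d -= (1.0 - gamma) * gc / (np.linalg.norm(gc) + 1e-12)
    return x + step * d

# Toy problem: minimize ||x||^2 subject to x0 + x1 >= 1, i.e. c(x) = 1 - x0 - x1 <= 0.
f_grad = lambda x: 2.0 * x
c = lambda x: 1.0 - x[0] - x[1]
c_grad = lambda x: np.array([-1.0, -1.0])

x = np.array([2.0, -1.0])
for _ in range(2000):
    x = gd_akin_step(x, f_grad, c_grad, c)
print(x)   # hovers near the constrained minimizer (0.5, 0.5)
```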
On a Frank-Wolfe approach for abs-smooth functions
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2024-01-16, DOI: 10.1080/10556788.2023.2296985
Timo Kreimeier, Sebastian Pokutta, Andrea Walther, Zev Woodstock
Abstract: We propose an algorithm which appears to be the first bridge between the fields of conditional gradient methods and abs-smooth optimization. Our problem setting is motivated by various applications...
Citations: 0
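For readers unfamiliar with conditional gradient methods, the sketch below shows the textbook Frank-Wolfe iteration on a smooth least-squares objective over a box, with the standard 2/(k+2) step size. It illustrates only the classical smooth setting, not the abs-smooth extension developed in the paper; the data `A` and `b` are made up for the example.

```python
import numpy as np

def frank_wolfe_box(grad, x0, lower, upper, iters=200):
    """Classical Frank-Wolfe (conditional gradient) method over a box.

    At each iteration a linear minimization oracle picks the box vertex that
    minimizes <grad f(x), s>, and the iterate moves towards it with the
    standard 2/(k+2) step. Textbook smooth setting only.
    """
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        s = np.where(g > 0, lower, upper)     # LMO: lower bound where g > 0, else upper
        gamma = 2.0 / (k + 2.0)
        x = (1 - gamma) * x + gamma * s
    return x

# Example: minimize ||Ax - b||^2 over the box [-1, 1]^5 (A, b are invented here).
rng = np.random.default_rng(0)
A = rng.normal(size=(10, 5))
b = rng.normal(size=10)
grad = lambda x: 2 * A.T @ (A @ x - b)
print(frank_wolfe_box(grad, np.zeros(5), -np.ones(5), np.ones(5)))
```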
PersA-FL: personalized asynchronous federated learning
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2024-01-11, DOI: 10.1080/10556788.2023.2280056
Mohammad Taha Toghani, Soomin Lee, César A. Uribe
Abstract: We study the personalized federated learning problem under asynchronous updates. In this problem, each client seeks to obtain a personalized model that simultaneously outperforms local and global m...
Citations: 0
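As rough orientation only, here is a generic simulation of asynchronous federated averaging followed by a local fine-tuning (personalization) pass. Everything in it, from the quadratic client losses to the staleness-damped mixing weight and all constants, is an illustrative assumption; it is not the PersA-FL algorithm from the paper.

```python
import numpy as np

# Generic sketch: asynchronous server averaging + per-client fine-tuning.
rng = np.random.default_rng(1)
dim, n_clients = 3, 4
targets = [rng.normal(size=dim) for _ in range(n_clients)]   # each client's "data"
local_grad = lambda w, t: w - t                               # grad of 0.5*||w - t||^2

w_global = np.zeros(dim)
for _ in range(200):
    i = rng.integers(n_clients)                 # a client finishes at a random time
    staleness = rng.integers(0, 5)              # simulated age of its model copy
    w_local = w_global.copy()
    for _ in range(5):                          # a few local gradient steps
        w_local -= 0.1 * local_grad(w_local, targets[i])
    alpha = 0.5 / (1 + staleness)               # damp stale contributions
    w_global = (1 - alpha) * w_global + alpha * w_local

# Personalization: each client fine-tunes the global model on its own data.
personal = []
for t in targets:
    w = w_global.copy()
    for _ in range(10):
        w -= 0.1 * local_grad(w, t)
    personal.append(w)
print(w_global, personal[0])
```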
An ADMM based method for underdetermined box-constrained integer least squares problems
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2023-12-28, DOI: 10.1080/10556788.2023.2285492
Xiao-Wen Chang, Tianchi Ma
Abstract: To solve underdetermined box-constrained integer least squares (UBILS) problems, we propose an integer-constrained alternating direction method of multipliers (IADMM), which can be much more accura...
Citations: 0
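The sketch below is a generic ADMM-style splitting heuristic for min ||Ax - y||^2 with x integer-valued in a box: a least-squares x-update with a proximal term, a round-and-clip z-update, and a scaled dual update. It only illustrates the kind of splitting the title refers to; the actual IADMM of the paper may differ, and the toy data are invented.

```python
import numpy as np

def admm_bils(A, y, lower, upper, rho=1.0, iters=100):
    """Generic ADMM heuristic for min ||Ax - y||^2, x integer in [lower, upper].

    Splitting: x handles the least-squares term, z the integer-box constraint,
    u is the scaled dual. The z-update is a simple round-and-clip projection.
    Textbook-style sketch, not the IADMM algorithm from the paper.
    """
    m, n = A.shape
    z = np.zeros(n)
    u = np.zeros(n)
    M = A.T @ A + rho * np.eye(n)
    Aty = A.T @ y
    for _ in range(iters):
        x = np.linalg.solve(M, Aty + rho * (z - u))     # least squares + proximal term
        z = np.clip(np.round(x + u), lower, upper)      # project onto the integer box
        u = u + x - z                                   # scaled dual update
    return z

# Tiny underdetermined example (m < n); data are made up for illustration.
rng = np.random.default_rng(2)
A = rng.normal(size=(3, 6))
x_true = rng.integers(0, 4, size=6).astype(float)
y = A @ x_true
print(admm_bils(A, y, lower=0, upper=3))
```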
Customized Douglas-Rachford splitting methods for structured inverse variational inequality problems
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2023-11-24, DOI: 10.1080/10556788.2023.2278092
Y. N. Jiang, X. J. Cai, D. R. Han, J. F. Yang
Abstract: Recently, structured inverse variational inequality (SIVI) problems have attracted much attention. In this paper, we propose new splitting methods to solve SIVI problems by employing the idea of th...
Citations: 0
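For reference, the classical Douglas-Rachford splitting for a sum of two convex functions, applied to a small soft-thresholding example, is sketched below. The customized variants in the paper target structured inverse variational inequalities and are not reproduced here; `prox_f`, `prox_g`, `b` and `lam` in the example are assumptions.

```python
import numpy as np

def douglas_rachford(prox_f, prox_g, y0, iters=200):
    """Classical Douglas-Rachford splitting for min_x f(x) + g(x).

    Iteration: x = prox_f(y); z = prox_g(2x - y); y <- y + z - x. Under standard
    convexity assumptions the x-iterates converge to a minimizer. This is the
    textbook scheme, not the customized variants for inverse VIs in the paper.
    """
    y = y0.copy()
    for _ in range(iters):
        x = prox_f(y)
        z = prox_g(2 * x - y)
        y = y + z - x
    return prox_f(y)

# Example: min 0.5*||x - b||^2 + lam*||x||_1 (b and lam made up for illustration).
b = np.array([3.0, -0.2, 0.7, -2.5])
lam = 1.0
prox_f = lambda v: (v + b) / 2.0                                   # prox of 0.5*||x - b||^2
prox_g = lambda v: np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)   # soft-thresholding
print(douglas_rachford(prox_f, prox_g, np.zeros(4)))               # approx [2, 0, 0, -1.5]
```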
Dual formulation of the sparsity constrained optimization problem: application to classification
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2023-11-21, DOI: 10.1080/10556788.2023.2278091
M. Gaudioso, G. Giallombardo, J.-B. Hiriart-Urruty
Abstract: We tackle the sparsity constrained optimization problem by resorting to the polyhedral k-norm as a valid tool to emulate the ℓ0-pseudo-norm. The main novelty of the approach is the use of the dual of t...
Citations: 0
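The polyhedral k-norm mentioned in the abstract is the sum of the k largest absolute entries of a vector, and the gap between ||x||_1 and the k-norm vanishes exactly when x has at most k nonzeros, which is how it emulates the ℓ0 pseudo-norm. A minimal numerical check (the vector x below is an invented example):

```python
import numpy as np

def k_norm(x, k):
    """Polyhedral k-norm: the sum of the k largest absolute entries of x."""
    return np.sort(np.abs(x))[::-1][:k].sum()

def sparsity_gap(x, k):
    """||x||_1 - k_norm(x): zero exactly when x has at most k nonzero entries,
    which is why the k-norm can emulate the l0 constraint ||x||_0 <= k."""
    return np.abs(x).sum() - k_norm(x, k)

x = np.array([0.0, 3.0, 0.0, -1.5, 0.2])
print(k_norm(x, 2))        # 4.5
print(sparsity_gap(x, 2))  # 0.2 > 0, so x has more than 2 nonzeros
print(sparsity_gap(x, 3))  # 0.0, so x has at most 3 nonzeros
```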
Inexact tensor methods and their application to stochastic convex optimization
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2023-11-17, DOI: 10.1080/10556788.2023.2261604
Artem Agafonov, Dmitry Kamzolov, Pavel Dvurechensky, Alexander Gasnikov, Martin Takáč
Abstract: We propose general non-accelerated [The results for non-accelerated methods first appeared in December 2020 in the preprint (A. Agafonov, D. Kamzolov, P. Dvurechensky, and A. Gasnikov, Inexact tens...
Citations: 9
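Tensor methods use higher-order Taylor models; the second-order (p = 2) member of the family is cubic-regularized Newton. The sketch below shows one such step with an inexact (finite-difference) Hessian and an inexactly minimized model, which conveys the "inexact" theme only. The paper's general scheme, its accelerated variants, and the stochastic setting are not reproduced, and all constants are assumptions.

```python
import numpy as np

def inexact_cubic_newton_step(grad, hess_approx, x, M=10.0, inner_iters=50, lr=0.05):
    """One cubic-regularized Newton step with an inexact Hessian.

    The model m(s) = <g, s> + 0.5*s^T H s + (M/6)*||s||^3 is itself minimized
    only approximately, by a few gradient steps on s. Second-order (p = 2)
    sketch in the spirit of inexact tensor methods, not the paper's scheme.
    """
    g = grad(x)
    H = hess_approx(x)
    s = np.zeros_like(x)
    for _ in range(inner_iters):
        model_grad = g + H @ s + 0.5 * M * np.linalg.norm(s) * s
        s -= lr * model_grad
    return x + s

# Example on f(x) = sum(exp(x_i) - x_i); the Hessian is approximated by finite differences.
f_grad = lambda x: np.exp(x) - 1.0

def fd_hessian(x, eps=1e-3):
    """Inexact Hessian via central finite differences of the gradient."""
    n = x.size
    H = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        H[:, j] = (f_grad(x + e) - f_grad(x - e)) / (2 * eps)
    return H

x = np.array([2.0, -1.0, 0.5])
for _ in range(20):
    x = inexact_cubic_newton_step(f_grad, fd_hessian, x)
print(x)   # should move towards the minimizer at the origin
```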
A hybrid direct search and projected simplex gradient method for convex constrained minimization
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2023-11-15, DOI: 10.1080/10556788.2023.2263618
A. L. Custódio, E. H. M. Krulikovski, M. Raydan
Abstract: We propose a new Derivative-free Optimization (DFO) approach for solving convex constrained minimization problems. The feasible set is assumed to be the non-empty intersection of a finite collectio...
Citations: 0
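A simplex gradient is a least-squares fit of a linear model to function values sampled around the current point. The sketch below computes one from coordinate samples and takes a projected step onto a box; it illustrates the two ingredients named in the title in the simplest possible way and is not the hybrid method of the paper. The sample radius, step size, and toy problem are assumptions.

```python
import numpy as np

def simplex_gradient(f, x, h=1e-2):
    """Simplex gradient of f at x: least-squares fit of a linear model to the
    function values at x and at the points x + h*e_i (a coordinate simplex)."""
    n = x.size
    S = h * np.eye(n)                                   # sample directions as columns
    df = np.array([f(x + S[:, i]) - f(x) for i in range(n)])
    g, *_ = np.linalg.lstsq(S.T, df, rcond=None)
    return g

def projected_simplex_gradient_step(f, x, lower, upper, step=0.1):
    """One derivative-free step: move along the negative simplex gradient and
    project back onto the box [lower, upper]."""
    g = simplex_gradient(f, x)
    return np.clip(x - step * g, lower, upper)

# Example: minimize (x0 - 2)^2 + (x1 + 1)^2 over the box [0, 1]^2.
f = lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2
x = np.array([0.2, 0.8])
for _ in range(100):
    x = projected_simplex_gradient_step(f, x, 0.0, 1.0)
print(x)   # should approach the boundary point (1, 0)
```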
Inducing strong convergence into the asymptotic behaviour of proximal splitting algorithms in Hilbert spaces
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2018-04-10, eCollection Date: 2019-01-01, DOI: 10.1080/10556788.2018.1457151
Radu Ioan Boţ, Ernö Robert Csetnek, Dennis Meier
Abstract: Proximal splitting algorithms for monotone inclusions (and convex optimization problems) in Hilbert spaces share the common feature of guaranteeing, in general, only weak convergence of the generated sequences to a solution. In order to achieve strong convergence, one usually needs to impose more restrictive properties on the involved operators, such as strong monotonicity (respectively, strong convexity for optimization problems). In this paper, we propose a modified Krasnosel'skiĭ-Mann algorithm in connection with the determination of a fixed point of a nonexpansive mapping and show strong convergence of the iteratively generated sequence to the minimal norm solution of the problem. Relying on this, we derive a forward-backward and a Douglas-Rachford algorithm, both endowed with Tikhonov regularization terms, which generate iterates that strongly converge to the minimal norm solution of the set of zeros of the sum of two maximally monotone operators. Furthermore, we formulate strongly convergent primal-dual algorithms of forward-backward and Douglas-Rachford type for highly structured monotone inclusion problems involving parallel-sums and compositions with linear operators. The resulting iterative schemes are particularized to the solving of convex minimization problems. The theoretical results are illustrated by numerical experiments on the split feasibility problem in infinite-dimensional spaces.
Citations: 33
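The abstract's theme, forcing strong convergence to the minimal-norm solution by adding a Tikhonov-type anchoring term, can be illustrated with the classical Halpern iteration x_{k+1} = (1 - a_k) T(x_k) with anchor 0, which converges strongly to the fixed point of a nonexpansive T closest to the origin. This is a standard related scheme shown only to illustrate the idea, not the modified Krasnosel'skiĭ-Mann algorithm of the paper; the projections and sets in the example are assumptions.

```python
import numpy as np

def halpern_minimal_norm(T, x0, iters=2000):
    """Halpern/Tikhonov-type iteration x_{k+1} = (1 - a_k) * T(x_k) with anchor 0
    and a_k = 1/(k+2). For a nonexpansive T this converges strongly to the fixed
    point of T of minimal norm. Classical scheme, not the paper's algorithm.
    """
    x = x0.copy()
    for k in range(iters):
        a = 1.0 / (k + 2.0)
        x = (1.0 - a) * T(x)
    return x

# T = composition of projections onto a half-space and a box; its fixed-point set
# is the intersection of the two sets, so the limit is its minimal-norm point.
proj_halfspace = lambda x: x + max(0.0, (2.0 - x[0] - x[1]) / 2.0) * np.ones(2)  # x0 + x1 >= 2
proj_box = lambda x: np.clip(x, 0.0, 3.0)                                        # [0, 3]^2
T = lambda x: proj_halfspace(proj_box(x))
print(halpern_minimal_norm(T, np.array([3.0, -2.0])))   # approx (1, 1)
```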
Self-consistent gradient flow for shape optimization
IF 2.2, Mathematics (CAS Q3)
Optimization Methods & Software, Pub Date: 2017-07-04, Epub Date: 2016-05-01, DOI: 10.1080/10556788.2016.1171864
D. Kraft
Abstract: We present a model for image segmentation and describe a gradient-descent method for level-set based shape optimization. It is commonly known that gradient-descent methods converge slowly due to zig-zag movement. This can also be observed for our problem, especially when sharp edges are present in the image. We interpret this in our specific context to gain a better understanding of the involved difficulties. One way to overcome slow convergence is the use of second-order methods. For our situation, they require derivatives of the potentially noisy image data and are thus undesirable. Hence, we propose a new method that can be interpreted as a self-consistent gradient flow and does not need any derivatives of the image data. It works very well in practice and leads to a far more efficient optimization algorithm. A related idea can also be used to describe the mean-curvature flow of a mean-convex surface. For this, we formulate a mean-curvature Eikonal equation, which allows a numerical propagation of the mean-curvature flow of a surface without explicit time stepping.
Citations: 4
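As background for the level-set setting of the paper, the sketch below performs a plain explicit update of the level-set equation phi_t + F*|grad phi| = 0 on a 2-D grid with a constant normal speed. It is a textbook level-set step with central differences and no upwinding, not the self-consistent gradient flow or the mean-curvature Eikonal propagation proposed in the paper; the grid size, speed, and time step are assumptions.

```python
import numpy as np

def level_set_step(phi, speed, dt=0.1, h=1.0):
    """One explicit step of phi_t + F*|grad phi| = 0 on a periodic 2-D grid.

    Central differences, no upwinding: illustration only. With F > 0 the front
    moves outward, so the region {phi < 0} grows.
    """
    gx = (np.roll(phi, -1, axis=0) - np.roll(phi, 1, axis=0)) / (2 * h)
    gy = (np.roll(phi, -1, axis=1) - np.roll(phi, 1, axis=1)) / (2 * h)
    grad_norm = np.sqrt(gx**2 + gy**2) + 1e-12
    return phi - dt * speed * grad_norm

# Signed-distance initialization: a circle of radius 10 on a 64x64 grid.
n = 64
Y, X = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
phi = np.sqrt((X - n / 2) ** 2 + (Y - n / 2) ** 2) - 10.0

for _ in range(20):
    phi = level_set_step(phi, speed=1.0)   # constant outward speed grows the circle
print((phi < 0).sum())                     # the enclosed area increases
```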