{"title":"Stochastic Augmented Lagrangian Method in Riemannian Shape Manifolds","authors":"Caroline Geiersbach, Tim Suchan, Kathrin Welker","doi":"10.1007/s10957-024-02488-1","DOIUrl":"https://doi.org/10.1007/s10957-024-02488-1","url":null,"abstract":"<p>In this paper, we present a stochastic augmented Lagrangian approach on (possibly infinite-dimensional) Riemannian manifolds to solve stochastic optimization problems with a finite number of deterministic constraints. We investigate the convergence of the method, which is based on a stochastic approximation approach with random stopping combined with an iterative procedure for updating Lagrange multipliers. The algorithm is applied to a multi-shape optimization problem with geometric constraints and demonstrated numerically.</p>","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.9,"publicationDate":"2024-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142186671","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Regularized and Structured Tensor Total Least Squares Methods with Applications","authors":"Feiyang Han, Yimin Wei, Pengpeng Xie","doi":"10.1007/s10957-024-02507-1","DOIUrl":"https://doi.org/10.1007/s10957-024-02507-1","url":null,"abstract":"<p>Total least squares (TLS), also known as errors-in-variables in statistical analysis, is an effective method for solving linear equations when noise is present not only in the observation data but also in the mapping operator. In addition, Tikhonov regularization is widely used for ill-posed problems. Moreover, the structure of the mapping operator plays a crucial role in solving the TLS problem. Tensor operators have advantages in characterizing such models, which motivates building the corresponding theory for tensor TLS. This paper proposes tensor regularized TLS and structured tensor TLS methods for solving ill-conditioned and structured tensor equations, respectively, adopting a tensor-tensor product. Properties of these approaches and algorithms for computing their solutions are also presented and proved. Based on these methods, some applications in image and video deblurring are explored. Numerical examples illustrate the effectiveness of our methods compared with some existing methods.</p>","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.9,"publicationDate":"2024-08-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142186672","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Continuous Equality Knapsack with Probit-Style Objectives","authors":"Jamie Fravel, Robert Hildebrand, Laurel Travis","doi":"10.1007/s10957-024-02503-5","DOIUrl":"https://doi.org/10.1007/s10957-024-02503-5","url":null,"abstract":"<p>We study continuous, equality knapsack problems with uniform separable, non-convex objective functions that are continuous, antisymmetric about a point, and have concave and convex regions. For example, this model captures a simple allocation problem with the goal of optimizing an expected value where the objective is a sum of cumulative distribution functions of identically distributed normal distributions (i.e., a sum of inverse probit functions). We prove structural results of this model under general assumptions and provide two algorithms for efficient optimization: (1) running in linear time and (2) running in a constant number of operations given preprocessing of the objective function.</p>","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.9,"publicationDate":"2024-08-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141935095","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On Quasiconvex Multiobjective Optimization and Variational Inequalities Using Greenberg–Pierskalla Based Generalized Subdifferentials","authors":"Shashi Kant Mishra, Vivek Laha, Mohd Hassan","doi":"10.1007/s10957-024-02505-3","DOIUrl":"https://doi.org/10.1007/s10957-024-02505-3","url":null,"abstract":"","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2024-08-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141924464","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Observer-Based Feedback-Control for the Stabilization of a Class of Parabolic Systems","authors":"Imene Aicha Djebour, Karim Ramdani, J. Valein","doi":"10.1007/s10957-024-02496-1","DOIUrl":"https://doi.org/10.1007/s10957-024-02496-1","url":null,"abstract":"","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2024-08-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141929544","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimal Actuator Location of the Norm Optimal Controls for Degenerate Parabolic Equations","authors":"Yuanhang Liu, Weijia Wu, Donghui Yang","doi":"10.1007/s10957-024-02498-z","DOIUrl":"https://doi.org/10.1007/s10957-024-02498-z","url":null,"abstract":"<p>This paper focuses on investigating the optimal actuator location for achieving minimum norm controls in the context of approximate controllability for degenerate parabolic equations. We propose a formulation of the optimization problem that encompasses both the actuator location and its associated minimum norm control. Specifically, we transform the problem into a two-person zero-sum game problem, resulting in the development of four equivalent formulations. Finally, we establish the crucial result that the solution to the relaxed optimization problem serves as an optimal actuator location for the classical problem.</p>","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.9,"publicationDate":"2024-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141935096","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Most Iterations of Projections Converge","authors":"Daylen K. Thimm","doi":"10.1007/s10957-024-02504-4","DOIUrl":"https://doi.org/10.1007/s10957-024-02504-4","url":null,"abstract":"<p>Consider three closed linear subspaces <span>(C_1, C_2,)</span> and <span>(C_3)</span> of a Hilbert space <i>H</i> and the orthogonal projections <span>(P_1, P_2)</span> and <span>(P_3)</span> onto them. Halperin showed that a point in <span>(C_1\\cap C_2 \\cap C_3)</span> can be found by iteratively projecting any point <span>(x_0 \\in H)</span> onto all the sets in a periodic fashion. The limit point is then the projection of <span>(x_0)</span> onto <span>(C_1\\cap C_2 \\cap C_3)</span>. Nevertheless, a non-periodic projection order may lead to a non-convergent projection series, as shown by Kopecká, Müller, and Paszkiewicz. This raises the question of how many projection orders in <span>(\\{1,2,3\\}^{\\mathbb{N}})</span> are “well behaved” in the sense that they lead to a convergent projection series. Melo, da Cruz Neto, and de Brito provided a necessary and sufficient condition under which the projection series converges and showed that the “well behaved” projection orders form a large subset in the sense of having full product measure. We show that the set of “well behaved” projection orders is also large from a topological viewpoint: it contains a dense <span>(G_\\delta)</span> subset with respect to the product topology. Furthermore, we analyze why the proof of the measure-theoretic case cannot be directly adapted to the topological setting.</p>","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.9,"publicationDate":"2024-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141935102","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Specifying and Solving Robust Empirical Risk Minimization Problems Using CVXPY","authors":"Eric Luxenberg, Dhruv Malik, Yuanzhi Li, Aarti Singh, Stephen Boyd","doi":"10.1007/s10957-024-02491-6","DOIUrl":"https://doi.org/10.1007/s10957-024-02491-6","url":null,"abstract":"<p>We consider robust empirical risk minimization (ERM), where model parameters are chosen to minimize the worst-case empirical loss when each data point varies over a given convex uncertainty set. In some simple cases, such problems can be expressed in an analytical form. In general the problem can be made tractable via dualization, which turns a min-max problem into a min-min problem. Dualization requires expertise and is tedious and error-prone. We demonstrate how CVXPY can be used to automate this dualization procedure in a user-friendly manner. Our framework allows practitioners to specify and solve robust ERM problems with a general class of convex losses, capturing many standard regression and classification problems. Users can easily specify any complex uncertainty set that is representable via disciplined convex programming (DCP) constraints.</p>","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.9,"publicationDate":"2024-08-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141935014","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Variance Reduction Techniques for Stochastic Proximal Point Algorithms","authors":"Cheik Traoré, Vassilis Apidopoulos, Saverio Salzo, Silvia Villa","doi":"10.1007/s10957-024-02502-6","DOIUrl":"https://doi.org/10.1007/s10957-024-02502-6","url":null,"abstract":"<p>In the context of finite sums minimization, variance reduction techniques are widely used to improve the performance of state-of-the-art stochastic gradient methods. Their practical impact is clear, as well as their theoretical properties. Stochastic proximal point algorithms have been studied as an alternative to stochastic gradient algorithms since they are more stable with respect to the choice of the step size. However, their variance-reduced versions are not as well studied as the gradient ones. In this work, we propose the first unified study of variance reduction techniques for stochastic proximal point algorithms. We introduce a generic stochastic proximal-based algorithm that can be specified to give the proximal version of SVRG, SAGA, and some of their variants. For this algorithm, in the smooth setting, we provide several convergence rates for the iterates and the objective function values, which are faster than those of the vanilla stochastic proximal point algorithm. More specifically, for convex functions, we prove a sublinear convergence rate of <i>O</i>(1/<i>k</i>). In addition, under the Polyak-Łojasiewicz condition, we obtain linear convergence rates. Finally, our numerical experiments demonstrate the advantages of the proximal variance reduction methods over their gradient counterparts in terms of the stability with respect to the choice of the step size in most cases, especially for difficult problems.</p>","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.9,"publicationDate":"2024-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141882639","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Convergence Analysis of a New Forward-Reflected-Backward Algorithm for Four Operators Without Cocoercivity","authors":"Yu Cao, Yuanheng Wang, Habib ur Rehman, Yekini Shehu, Jen-Chih Yao","doi":"10.1007/s10957-024-02501-7","DOIUrl":"https://doi.org/10.1007/s10957-024-02501-7","url":null,"abstract":"<p>In this paper, we propose a new splitting algorithm to find the zero of a monotone inclusion problem that features the sum of three maximal monotone operators and a Lipschitz continuous monotone operator in Hilbert spaces. We prove that the sequence of iterates generated by our proposed splitting algorithm converges weakly to the zero of the considered inclusion problem under mild conditions on the iterative parameters. Several splitting algorithms in the literature are recovered as special cases of our proposed algorithm. Another interesting feature of our algorithm is that one forward evaluation of the Lipschitz continuous monotone operator is utilized at each iteration. Numerical results are given to support the theoretical analysis.</p>","PeriodicalId":50100,"journal":{"name":"Journal of Optimization Theory and Applications","volume":null,"pages":null},"PeriodicalIF":1.9,"publicationDate":"2024-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141882507","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}