S. Kesornprom, P. Cholamjiak. Results in Nonlinear Analysis, published 2022-12-30. DOI: 10.53006/rna.1143531
A double proximal gradient method with new linesearch for solving convex minimization problem with application to data classification
In this paper, we propose a new proximal gradient method for a convex minimization problem in real Hilbert spaces. We suggest a new linesearch that does not require knowledge of the Lipschitz constant, and we relax the conditions on the inertial term, which speeds up convergence. Moreover, we prove the weak convergence of the proposed method under suitable conditions. Numerical experiments on data classification are reported to show its efficiency.
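The paper's specific linesearch and inertial conditions are not reproduced in this abstract, so the sketch below is only a generic proximal gradient method with a standard backtracking linesearch (in the Beck–Teboulle style) applied to an ℓ1-regularized least-squares problem. It illustrates the general idea the abstract describes, namely choosing the step size adaptively so that no Lipschitz constant is needed; all function names and parameters are hypothetical, not the authors'.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (applied componentwise).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_linesearch(A, b, lam=0.1, step0=1.0, beta=0.5, iters=200):
    """Generic proximal gradient for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    with a backtracking linesearch so the Lipschitz constant of the
    gradient is never computed (illustrative, not the paper's linesearch)."""
    n = A.shape[1]
    x = np.zeros(n)
    f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2   # smooth part
    grad = lambda x: A.T @ (A @ x - b)                   # its gradient
    for _ in range(iters):
        t = step0
        g = grad(x)
        while True:
            # Proximal step with the current trial step size t.
            z = soft_threshold(x - t * g, t * lam)
            d = z - x
            # Sufficient-decrease test on the smooth part; if it fails,
            # shrink the step and try again (no Lipschitz constant used).
            if f(z) <= f(x) + g @ d + (0.5 / t) * (d @ d):
                break
            t *= beta
        x = z
    return x
```

In a data-classification setting, the same template applies with the least-squares term replaced by a logistic loss; only `f` and `grad` change, while the proximal step and the linesearch loop stay the same.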