A new class of memory gradient methods with inexact line searches

Zhenjun Shi

J. Num. Math., April 2005. DOI: 10.1515/1569395054069008
This paper presents a new class of memory gradient methods with inexact line searches for unconstrained minimization problems. At each iteration, the methods use more information from previous iterates than other methods to generate a search direction, and they use an inexact line search to select a step size. The new methods are proved to be globally convergent under mild conditions, and their convergence rate is investigated in some special cases. Numerical experiments show that the new algorithms converge more stably than other line search methods and are effective for solving large-scale unconstrained minimization problems.
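The abstract does not give the paper's specific direction and step-size formulas, but the general scheme it describes — build the search direction from the current gradient plus previous iterates, then pick the step size with an inexact (Armijo-type) line search — can be sketched generically. The memory weight `beta` below is a Fletcher–Reeves-style illustration, not the rule proposed in the paper, and the test function is a toy 2-D quadratic chosen for the example:

```python
# Hypothetical sketch of a memory gradient method with an Armijo-type
# inexact line search, on the toy problem f(x, y) = x^2 + 10*y^2.
# The direction formula (beta) is an illustrative FR-like choice,
# NOT the specific update proposed in the paper.

def f(x):
    return x[0] ** 2 + 10.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

def memory_gradient(x0, tol=1e-8, max_iter=500, rho=0.5, sigma=1e-4):
    x = list(x0)
    g = grad(x)
    d = [-c for c in g]                       # first step: steepest descent
    for _ in range(max_iter):
        if dot(g, g) < tol ** 2:              # stop when ||g|| < tol
            break
        # Armijo backtracking: largest alpha = rho^m satisfying
        #   f(x + alpha*d) <= f(x) + sigma * alpha * <g, d>
        alpha, gd = 1.0, dot(g, d)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > f(x) + sigma * alpha * gd:
            alpha *= rho
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # memory term: mix the new negative gradient with the old direction
        beta = dot(g_new, g_new) / max(dot(g, g), 1e-16)
        d = [-gc + beta * dc for gc, dc in zip(g_new, d)]
        # safeguard: fall back to steepest descent if d is not a descent direction
        if dot(g_new, d) >= 0.0:
            d = [-c for c in g_new]
        g = g_new
    return x
```

The safeguard step reflects the role the global-convergence analysis plays in such methods: whatever the memory weight, the accepted direction must remain a descent direction for the Armijo condition to be satisfiable.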