A descent hybrid conjugate gradient method based on the memoryless BFGS update

Publisher
Springer Journals
Copyright
Copyright © 2018 by Springer Science+Business Media, LLC, part of Springer Nature
Subject
Computer Science; Numeric Computing; Algorithms; Algebra; Theory of Computation; Numerical Analysis
ISSN
1017-1398
eISSN
1572-9265
DOI
10.1007/s11075-018-0479-1

Abstract

In this work, we present a new hybrid conjugate gradient method based on a convex hybridization of the DY and HS+ conjugate gradient update parameters, adapting a quasi-Newton philosophy. The hybridization parameter is computed by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. Furthermore, a significant property of our proposed method is that it ensures sufficient descent independent of the accuracy of the line search. The global convergence of the proposed method is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that our proposed method is in general superior to classic conjugate gradient methods in terms of efficiency and robustness.
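To make the construction concrete, the following minimal sketch shows the generic form of a hybrid DY/HS+ direction, d_{k+1} = -g_{k+1} + beta_hyb * d_k, where beta_hyb is a convex combination of the standard Dai-Yuan and nonnegative Hestenes-Stiefel parameters. The function name and the fixed hybridization weight theta are illustrative assumptions; the paper instead computes the weight at each iteration by minimizing the distance to the self-scaling memoryless BFGS direction, a formula not reproduced here.

```python
# Minimal sketch (not the paper's exact scheme): one hybrid DY/HS+
# conjugate gradient direction with a user-supplied weight theta.
import numpy as np

def hybrid_cg_direction(g_new, g_old, d_old, theta):
    """d_{k+1} = -g_{k+1} + beta_hyb * d_k, where
    beta_hyb = (1 - theta) * beta_HS+ + theta * beta_DY, theta in [0, 1]."""
    y = g_new - g_old                 # gradient change y_k
    denom = d_old @ y                 # curvature term d_k^T y_k
    if abs(denom) < 1e-12:            # safeguard: restart along steepest descent
        return -g_new
    beta_dy = (g_new @ g_new) / denom            # Dai-Yuan parameter
    beta_hsp = max((g_new @ y) / denom, 0.0)     # HS+ (truncated Hestenes-Stiefel)
    beta_hyb = (1.0 - theta) * beta_hsp + theta * beta_dy
    return -g_new + beta_hyb * d_old
```

Under the Wolfe conditions, d_k^T y_k > 0 holds automatically, so the denominator is safe; the paper's contribution is the adaptive, quasi-Newton-based choice of the weight, which also yields the sufficient descent property claimed in the abstract.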

Journal

Numerical Algorithms
Springer Journals

Published: Feb 1, 2018
