
A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization


4OR, Volume 11 (4) – Mar 12, 2013


References (34)

Publisher
Springer Journals
Copyright
Copyright © 2013 by Springer-Verlag Berlin Heidelberg
Subject
Economics / Management Science; Operations Research/Decision Theory; Optimization; Industrial and Production Engineering
ISSN
1619-4500
eISSN
1614-2411
DOI
10.1007/s10288-013-0233-4
Publisher site
See Article on Publisher Site

Abstract

In order to propose a scaled conjugate gradient method, the memoryless BFGS preconditioned conjugate gradient method suggested by Shanno and the spectral conjugate gradient method suggested by Birgin and Martínez are hybridized following Andrei’s approach. Since the proposed method is designed based on a revised form of a modified secant equation suggested by Zhang et al., one of its interesting features is that it uses the available function values in addition to the gradient values. It is shown that, for uniformly convex objective functions, the search directions of the method fulfill the sufficient descent condition, which leads to global convergence. Numerical comparisons of the method with an efficient scaled conjugate gradient method proposed by Andrei, made on a set of unconstrained optimization test problems from the CUTEr collection, show the efficiency of the proposed modified scaled conjugate gradient method in the sense of the performance profile introduced by Dolan and Moré.
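The ingredients named in the abstract can be illustrated with a minimal sketch. The code below is NOT the authors' exact algorithm: it combines a generic scaled memoryless BFGS direction (in the spirit of Shanno's preconditioning with a spectral scaling parameter, as in Birgin and Martínez) with a Zhang-et-al.-style modified secant vector that incorporates function values. The specific scaling choice, safeguards, and line search are illustrative assumptions.

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def armijo(f, x, d, g, a=1.0, c=1e-4, rho=0.5):
    # Simple backtracking line search (illustrative; the paper's method
    # would use its own line search conditions).
    while f(x + a * d) > f(x) + c * a * (g @ d) and a > 1e-12:
        a *= rho
    return a

def scaled_memoryless_bfgs_direction(g, s, y, theta):
    # d = -H g, where H is the scaled memoryless BFGS matrix built from
    # the most recent pair (s, y) and a scaling parameter theta:
    #   H = theta*I - theta*(s y^T + y s^T)/(y^T s)
    #       + (1 + theta*(y^T y)/(y^T s)) * (s s^T)/(y^T s)
    ys = y @ s
    if ys <= 1e-12:                       # safeguard: steepest descent
        return -g
    Hg = (theta * g
          - (theta * (y @ g) / ys) * s
          - (theta * (s @ g) / ys) * y
          + (1.0 + theta * (y @ y) / ys) * ((s @ g) / ys) * s)
    return -Hg

def scaled_cg(f, grad, x0, tol=1e-5, max_iter=2000):
    x, g = x0.copy(), grad(x0)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        a = armijo(f, x, d, g)
        x_new = x + a * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Zhang-et-al.-style modified secant vector using function values
        # (an illustrative form, not necessarily the paper's revised one):
        #   y* = y + (theta_k / s^T s) s,
        #   theta_k = 6*(f_k - f_{k+1}) + 3*(g_k + g_{k+1})^T s
        theta_k = 6.0 * (f(x) - f(x_new)) + 3.0 * (g + g_new) @ s
        y_mod = y + (theta_k / (s @ s)) * s
        ys = y_mod @ s
        if ys > 1e-12:
            theta = (s @ s) / ys          # spectral (BB-like) scaling
            d = scaled_memoryless_bfgs_direction(g_new, s, y_mod, theta)
        else:
            d = -g_new
        x, g = x_new, g_new
    return x, k

x_star, iters = scaled_cg(rosenbrock, rosenbrock_grad,
                          np.array([-1.2, 1.0]))
```

On the Rosenbrock function this sketch drives the iterates toward the minimizer (1, 1); the safeguard falls back to steepest descent whenever the modified curvature condition y*ᵀs > 0 fails, which is what guarantees a descent direction in this simplified setting.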

Journal

4OR, Springer Journals

Published: Mar 12, 2013
