A double parameter self-scaling memoryless BFGS method for unconstrained optimization


Computational and Applied Mathematics, Volume 39 (3) – Jun 2, 2020



Publisher
Springer Journals
Copyright
Copyright © SBMAC - Sociedade Brasileira de Matemática Aplicada e Computacional 2020
ISSN
0101-8205
eISSN
1807-0302
DOI
10.1007/s40314-020-01157-z

Abstract

A double parameter self-scaling memoryless BFGS method for unconstrained optimization is presented. In this method, the first two terms of the self-scaling memoryless BFGS matrix are scaled with one positive parameter, while the third term is scaled with another positive parameter. The first parameter, which scales the first two terms, is determined so as to cluster the eigenvalues of the memoryless BFGS matrix. The second parameter, which scales the third term, is computed as a preconditioner to the Hessian of the minimizing function, combined with the minimization of the conjugacy condition, in order to shift the large eigenvalues of the self-scaling memoryless BFGS matrix to the left. The stepsize is determined by the Wolfe line search conditions. The global convergence of the method is proved under the assumption that the minimizing function is uniformly convex. Preliminary computational experiments on a set of 80 unconstrained optimization test functions show that this algorithm is more efficient and more robust than the self-scaling BFGS updates of Oren and Luenberger and of Oren and Spedicato. With respect to the CPU time metric, CG-DESCENT is the top performer. Comparisons with L-BFGS show that our algorithm is more efficient.
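To make the structure described in the abstract concrete, the following is a minimal sketch of how a two-parameter scaled memoryless BFGS search direction could be computed. It assumes the standard three-term memoryless BFGS matrix, with the first two terms (the identity and the rank-two correction) scaled by delta and the third term written as (s sᵀ)/(yᵀs) and scaled by gamma. The abstract does not give the paper's formulas for the two parameters, so they are left as caller-supplied inputs; all function and variable names are illustrative, not the authors'.

```python
import numpy as np

def ssml_bfgs_direction(g, s, y, delta, gamma, eps=1e-12):
    """Sketch of a double parameter self-scaling memoryless BFGS direction.

    Computes d = -H g with
        H = delta * (I - (s y^T + y s^T) / (y^T s)) + gamma * (s s^T) / (y^T s),
    i.e. the first two terms of the memoryless BFGS matrix scaled by `delta`
    and the third term scaled by `gamma` (both parameters are assumptions here,
    supplied by the caller). Only inner products are formed, so the cost is O(n).
    """
    ys = y @ s
    if ys <= eps:          # curvature safeguard: fall back to steepest descent
        return -g
    sg = s @ g             # s^T g
    yg = y @ g             # y^T g
    Hg = delta * (g - (yg * s + sg * y) / ys) + gamma * (sg / ys) * s
    return -Hg


# Illustrative use inside one iteration; with delta = 1 and
# gamma = 1 + (y^T y)/(y^T s), H reduces to the standard (unscaled)
# memoryless BFGS matrix. The stepsize along d would then be chosen by a
# Wolfe line search, as stated in the abstract.
g = np.array([1.0, -2.0, 0.5])    # current gradient
s = np.array([0.1, 0.2, -0.1])    # previous step s_k = x_{k+1} - x_k
y = np.array([0.3, 0.5, -0.2])    # gradient difference y_k = g_{k+1} - g_k
ys = y @ s
d = ssml_bfgs_direction(g, s, y, delta=1.0, gamma=1.0 + (y @ y) / ys)
```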

Journal

Computational and Applied Mathematics, Springer Journals

Published: Jun 2, 2020
