A new self-scaling memoryless quasi-Newton update for unconstrained optimization


4OR , Volume OnlineFirst – May 23, 2023


References (38)

Publisher
Springer Journals
Copyright
Copyright © The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
ISSN
1619-4500
eISSN
1614-2411
DOI
10.1007/s10288-023-00544-6

Abstract

Based on the augmented version of the quasi-Newton method proposed by Aminifard et al. (Appl. Numer. Math. 167:187–201, 2021), a new scaling parameter for the self-scaling memoryless BFGS update formula is proposed. The idea is to cluster the eigenvalues of the search direction matrix by minimizing the difference between its largest and smallest eigenvalues. The sufficient descent property is proved for uniformly convex functions, and the global convergence of the proposed algorithm is established for both uniformly convex and general nonlinear objective functions. Numerical experiments on a set of test functions from the CUTEr collection show that the proposed method is efficient. In addition, the proposed algorithm is effectively applied to the salt-and-pepper noise elimination problem.
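For readers unfamiliar with the family of methods the abstract refers to, the following is a minimal sketch of a generic self-scaling memoryless BFGS search direction. It uses the classical Oren–Luenberger spectral scaling as the default scaling parameter; this is an assumption for illustration only and is not the eigenvalue-clustering parameter derived in the paper itself.

```python
import numpy as np

def ssml_bfgs_direction(g, s, y, tau=None):
    """Compute d = -H g, where H is the memoryless BFGS update of the
    scaled identity tau*I, built from the most recent step s = x_{k+1} - x_k
    and gradient difference y = g_{k+1} - g_k (requires s^T y > 0).

    With H = (I - rho*s*y^T) (tau*I) (I - rho*y*s^T) + rho*s*s^T and
    rho = 1/(s^T y), the product -H g expands to the three terms below,
    so no matrix is ever formed explicitly.
    """
    sy = s @ y
    if sy <= 0:
        raise ValueError("curvature condition s^T y > 0 is violated")
    if tau is None:
        # Oren-Luenberger spectral scaling (illustrative default; the
        # paper proposes its own parameter chosen to cluster eigenvalues)
        tau = sy / (y @ y)
    rho = 1.0 / sy
    sg, yg = s @ g, y @ g
    return (-tau * g
            + tau * rho * (yg * s + sg * y)
            - (rho + tau * rho**2 * (y @ y)) * sg * s)
```

Because H is positive definite whenever s^T y > 0 and tau > 0, the resulting d satisfies d^T g < 0, i.e. it is a descent direction; this is the property whose stronger "sufficient descent" form the paper proves for its scaling parameter.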

Journal

4OR (Springer Journals)

Published: May 23, 2023

Keywords: Unconstrained optimization; Self-scaling; Quasi-Newton; Noise elimination problem; 90C34; 90C40
