W. Zhou, Li Zhang (2006)
A nonlinear conjugate gradient method based on the MBFGS secant condition. Optimization Methods and Software, 21
M. Arazm, S. Babaie-Kafaki, R. Ghanbari (2017)
An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions. Glasnik Matematički, 52
Gaohang Yu, Jinhong Huang, Yi Zhou (2010)
A descent spectral conjugate gradient method for impulse noise removal. Appl. Math. Lett., 23
N. Andrei (2020)
A double parameter self-scaling memoryless BFGS method for unconstrained optimization. Computational and Applied Mathematics, 39
Gonglin Yuan, Zhan Wang, Pengyuan Li (2020)
A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems. Calcolo, 57
S. Babaie-Kafaki (2013)
A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization. 4OR, 11
Gonglin Yuan, Mengxiang Zhang, Yingjie Zhou (2022)
Adaptive scaling damped BFGS method without gradient Lipschitz continuity. Applied Mathematics Letters
S. Oren, D. Luenberger (1974)
Self-Scaling Variable Metric (SSVM) Algorithms. Management Science, 20
Z. Aminifard, S. Babaie-Kafaki, Saeide Ghafoori (2021)
An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing. Applied Numerical Mathematics, 167
I. Livieris, V. Tampakas, P. Pintelas (2018)
A descent hybrid conjugate gradient method based on the memoryless BFGS update. Numerical Algorithms, 79
S. Babaie-Kafaki, Z. Aminifard (2019)
Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length. Numerical Algorithms
J. Dennis, Henry Wolkowicz (1993)
Sizing and least-change secant methods. SIAM Journal on Numerical Analysis, 30
N. Gould, D. Orban, P. Toint (2003)
CUTEr and SifDec: A constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw., 29
Kaori Sugiki, Yasushi Narushima, H. Yabe (2012)
Globally Convergent Three-Term Conjugate Gradient Methods that Use Secant Conditions and Generate Descent Search Directions for Unconstrained Optimization. Journal of Optimization Theory and Applications, 153
Zengxin Wei, Guoyin Li, L. Qi (2006)
New quasi-Newton methods for unconstrained optimization problems. Appl. Math. Comput., 175
Gonglin Yuan, Junyu Lu, Zhan Wang (2020)
The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems. Applied Numerical Mathematics
R. Byrd, J. Nocedal (1989)
A tool for the analysis of Quasi-Newton methods with application to unconstrained minimization. SIAM Journal on Numerical Analysis, 26
S.S. Oren, E. Spedicato (1976)
Optimal conditioning of self-scaling variable metric algorithms. Math. Program., 10
Gonglin Yuan, Zhou Sheng, Bopeng Wang, Wujie Hu, C. Li (2018)
The global convergence of a modified BFGS method for nonconvex functions. J. Comput. Appl. Math., 327
Wenyu Li, Yan Liu, Jie Yang, Wei Wu (2018)
A New Conjugate Gradient Method with Smoothing $L_{1/2}$ Regularization Based on a Modified Secant Equation for Training Neural Networks. Neural Processing Letters, 48
N. Andrei (2017)
Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization. Optimization Methods and Software, 32
Elizabeth Dolan, Jorge Moré (2001)
Benchmarking optimization software with performance profiles. Mathematical Programming, 91
A.R. Heravi, G.A. Hodtani (2018)
A new correntropy-based conjugate gradient backpropagation algorithm for improving training in neural networks. IEEE Trans. Neural Netw. Learn. Syst., 29
Najib Ullah, J. Sabi’u, A. Shah (2021)
A derivative-free scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method for solving a system of monotone nonlinear equations. Numerical Linear Algebra with Applications, 28
S. Babaie-Kafaki (2016)
A modified scaling parameter for the memoryless BFGS updating formula. Numerical Algorithms, 72
N. Andrei (2020)
New conjugate gradient algorithms based on self-scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method. Calcolo, 57
S. Babaie-Kafaki, N. Mirhoseini, Z. Aminifard (2022)
A descent extension of a modified Polak–Ribière–Polyak method with application in image restoration problem. Optimization Letters, 17
J.Z. Zhang, N.Y. Deng, L.H. Chen (1999)
New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl., 102
Ming Li, Hongwei Liu, Zexian Liu (2018)
A new family of conjugate gradient methods for unconstrained optimization. Journal of Applied Mathematics and Computing, 58
Yuhong Dai, C. Kou (2013)
A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search. SIAM J. Optim., 23
J. Nocedal, S. Wright (1999)
Numerical Optimization. Springer, New York
A. Liao (1997)
Modifying the BFGS method. Oper. Res. Lett., 20
Based on the augmented version of the quasi-Newton method proposed by Aminifard et al. (App. Num. Math. 167:187–201, 2021), a new scaling parameter for the self-scaling memoryless BFGS update formula is proposed. The idea is to cluster the eigenvalues of the search direction matrix, the parameter being obtained by minimizing the difference between the largest and the smallest eigenvalues of that matrix. The sufficient descent property is proved for uniformly convex functions, and the global convergence of the proposed algorithm is established for both uniformly convex and general nonlinear objective functions. Numerical experiments on a set of test functions from the CUTEr collection show that the proposed method is efficient. In addition, the proposed algorithm is effectively applied to the salt-and-pepper noise elimination problem.
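The computational core described in the abstract, forming a self-scaling memoryless BFGS search direction from the most recent step and gradient-difference pair, can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's method: it applies the BFGS update to the scaled identity theta*I without forming the matrix, uses the classical Oren–Spedicato scaling theta = s'y / y'y as a stand-in for the eigenvalue-gap-minimizing parameter derived in the paper, and the function name and safeguards are hypothetical.

import numpy as np

def ssml_bfgs_direction(g_new, s, y, theta=None):
    # Self-scaling memoryless BFGS direction d = -H g_new, where H is the
    # BFGS update of theta*I with the pair (s, y):
    #   H = theta*(I - rho*(s y' + y s') + rho^2*(y'y) s s') + rho*s s',
    #   rho = 1/(s'y).
    # Only inner products are used; the matrix H is never formed.
    sy = float(s @ y)
    if sy <= 1e-12:
        # Curvature condition fails: fall back to steepest descent
        # (safeguard chosen for this sketch, not taken from the paper).
        return -g_new
    if theta is None:
        # Oren-Spedicato scaling, assumed here as a stand-in for the
        # eigenvalue-gap-minimizing parameter of the paper.
        theta = sy / float(y @ y)
    rho = 1.0 / sy
    sg, yg = float(s @ g_new), float(y @ g_new)
    Hg = theta * (g_new - rho * (yg * s + sg * y)
                  + rho**2 * float(y @ y) * sg * s) + rho * sg * s
    return -Hg

In a line-search loop one would take s = x_{k+1} - x_k and y = g_{k+1} - g_k; at the first iteration, where no pair is available, the direction reduces to the steepest descent step.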
4OR – Springer Journals
Published: May 23, 2023
Keywords: Unconstrained optimization; Self-scaling; Quasi-Newton; Noise elimination problem; 90C34; 90C40