This paper presents a two-parameter scaling memoryless Broyden–Fletcher–Goldfarb–Shanno (BFGS) method for solving systems of monotone nonlinear equations. The optimal values of the two scaling parameters are obtained by minimizing a measure function involving all the eigenvalues of the memoryless BFGS matrix; these values are also useful in analyzing the behavior of quasi-Newton methods on ill-conditioned problems. The algorithm can be described as a combination of the hyperplane projection technique and the memoryless BFGS method. Global convergence of the method is established. To demonstrate the validity and efficiency of the scheme, numerical results on a set of test problems are reported and compared with existing methods.
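The overall structure the abstract describes — a derivative-free line search, a hyperplane projection step in the style of Solodov and Svaiter, and a memoryless BFGS-type search direction — can be sketched as follows. This is an illustrative sketch only, not the paper's exact algorithm: it uses a single Oren–Luenberger-style scaling parameter `theta` in place of the paper's two optimally chosen parameters, and the tolerances and line-search constants are assumed values.

```python
import numpy as np

def solve_monotone(F, x0, tol=1e-6, max_iter=500, sigma=1e-4, rho=0.5):
    """Hyperplane-projection method for monotone F(x) = 0 with a
    scaled memoryless BFGS-type direction (illustrative sketch)."""
    x = x0.astype(float)
    Fx = F(x)
    d = -Fx                                  # initial steepest-descent-like step
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            return x
        # Derivative-free backtracking line search (Solodov-Svaiter type):
        # find t with -F(x + t d)^T d >= sigma * t * ||d||^2.
        t = 1.0
        while True:
            z = x + t * d
            Fz = F(z)
            if -Fz @ d >= sigma * t * np.linalg.norm(d) ** 2 or t < 1e-12:
                break
            t *= rho
        nz = np.linalg.norm(Fz)
        if nz <= tol:
            return z
        # Project x onto the hyperplane {v : F(z)^T (v - z) = 0},
        # which separates x from the solution set of the monotone equation.
        x_new = x - ((Fz @ (x - z)) / nz ** 2) * Fz
        F_new = F(x_new)
        # Memoryless BFGS-type direction d = -H g with spectral scaling theta
        # (the paper instead optimizes two scaling parameters via a
        # measure function of the eigenvalues of H).
        s, y, g = x_new - x, F_new - Fx, F_new
        sy = s @ y
        if sy > 1e-12:
            theta = sy / (y @ y)             # Oren-Luenberger scaling
            d = (-theta * g
                 + theta * ((g @ y) * s + (g @ s) * y) / sy
                 - (1 + theta * (y @ y) / sy) * ((g @ s) / sy) * s)
        else:
            d = -F_new                       # safeguard when curvature fails
        x, Fx = x_new, F_new
    return x
```

The projection step is what lets the method handle monotone equations without any merit function: monotonicity guarantees the solution set lies on the far side of the hyperplane defined by `F(z)`, so each projected iterate is strictly closer to every solution.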
Numerical Linear Algebra With Applications – Wiley
Published: Oct 1, 2021
Keywords: global convergence