Scientific Journal Of King Faisal University: Basic and Applied Sciences


A Sufficient Condition for the Global Convergence of Conjugate Gradient Methods for Solving Unconstrained Optimisation Problems

Osman O.O. Yousif, Awad Abdelrahman, Mogtaba Mohammed and Mohammed A. Saleh

Abstract

Due to their remarkable convergence properties and practical performance, conjugate gradient (CG) methods are widely used for solving unconstrained optimisation problems, especially large-scale ones. From the 1950s until now, many studies have been carried out to improve existing CG methods and to propose new ones. In this paper, we present a condition that guarantees the global convergence of CG methods when they are applied under the exact line search. Based on this condition, we make a minor modification to the CG methods of Polak-Ribière-Polyak (PRP) and Hestenes-Stiefel (HS) to propose new modified methods. Furthermore, to support the theoretical proof of global convergence of the modified methods in practical computation, a numerical experiment was carried out comparing the proposed methods with other well-known CG methods. The new modified methods required the fewest iterations and the shortest computation time to solve the problems, and they solved the highest percentage of the test problems successfully. Hence, we conclude that they can be used successfully for solving unconstrained optimisation problems.
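For reference, the classical (unmodified) quantities that the abstract refers to are standard and can be recalled as follows; the modified PRP and HS coefficients and the sufficient condition itself are stated in the full paper, not here. A CG method generates iterates

x_{k+1} = x_k + \alpha_k d_k, \qquad d_k = \begin{cases} -g_k, & k = 0, \\ -g_k + \beta_k d_{k-1}, & k \ge 1, \end{cases}

where g_k = \nabla f(x_k) and the exact line search chooses \alpha_k = \arg\min_{\alpha > 0} f(x_k + \alpha d_k), which implies g_{k+1}^{\top} d_k = 0. The classical coefficients are

\beta_k^{PRP} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}}, \qquad \beta_k^{HS} = \frac{g_k^{\top}(g_k - g_{k-1})}{d_{k-1}^{\top}(g_k - g_{k-1})}.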
Keywords
Unconstrained optimisation problems; conjugate gradient methods; exact line search; global convergence
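As an illustrative sketch only, and not the authors' modified method, the following Python code implements a generic nonlinear CG loop with the classical PRP coefficient, approximating the exact line search by numerically minimising the objective along the search direction; the names cg_prp, f and grad are hypothetical choices for this example.

import numpy as np
from scipy.optimize import minimize_scalar

def cg_prp(f, grad, x0, tol=1e-6, max_iter=1000):
    # Generic nonlinear CG with the classical PRP coefficient (illustrative sketch).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Approximate the exact line search: minimise f(x + alpha*d) over alpha > 0.
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 1e3), method='bounded').x
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Classical PRP coefficient; the paper modifies this formula (and the HS one).
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: the two-dimensional Rosenbrock test function.
f = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                           200.0 * (x[1] - x[0]**2)])
print(cg_prp(f, grad, [-1.2, 1.0]))

A full benchmark would typically run such a loop over a standard test collection (e.g., Andrei, 2008) and compare solvers via performance profiles (Dolan and Moré, 2002).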


References

Abdelrahman, A., Yousif, O.O.O., Mogtaba, M. and Elbahir, M.K. (2021). Global convergence of nonlinear conjugate gradient coefficients with inexact line search. Scientific Journal of King Faisal University: Basic and Applied Sciences, 22(2), 86–91. DOI: 10.37575/b/sci/210058
Abubakar, A.B., Malik, M., Kumam, P., Mohammad, H., Sun, M., Ibrahim, A.H. and Kiri, A.I. (2022). A Liu-Storey-type conjugate gradient method for unconstrained minimization problem with application in motion control. Journal of King Saud University–Science, 34(4), 101923.
Al-Baali, M. (1985). Descent property and global convergence of the Fletcher-Reeves method with inexact line search. IMA Journal of Numerical Analysis, 5(1), 121–4.
Andrei, N. (2008). An unconstrained optimization test functions collection. Advanced Modeling and Optimization, 10(1), 147–61.
Dolan, E.D. and Moré, J.J. (2002). Benchmarking optimization software with performance profiles. Mathematical Programming, 91(2), 201–13.
Dai, Y.H. and Yuan, Y. (1999). A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization, 10(1), 177–82.
Dai, Z. (2016). Comments on a new class of nonlinear conjugate gradient coefficients with global convergence properties. Applied Mathematics and Computation, 276(n/a), 297–300.
Fletcher, R. (1980). Practical Methods of Optimization: Vol. 1 Unconstrained Optimization. USA: John Wiley & Sons.
Fletcher, R., and Reeves, C.M. (1964). Function minimization by conjugate gradients. The Computer Journal, 7(2), 149–54.
Gilbert, J.C. and Nocedal, J. (1992). Global convergence properties of conjugate gradient methods for optimization. SIAM Journal on Optimization, 2(1), 21–42.
Hestenes, M.R. and Stiefel, E. (1952). Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards, 49(6), 409–36.
Liu, Y. and Storey, C. (1991). Efficient generalized conjugate gradient algorithms, part 1: theory. Journal of Optimization Theory and Applications, 69(1), 129–37.
Rivaie, M., Mamat, M. and Abashar, A. (2015). A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches. Applied Mathematics and Computation, 268(n/a), 1152–63.
Rivaie, M., Mamat, M., June, L.W. and Mohd, I. (2012). A new class of nonlinear conjugate gradient coefficients with global convergence properties. Applied Mathematics and Computation, 218(22), 11323–32.
Polak, E. and Ribière, G. (1969). Note on the convergence of conjugate direction methods. Mathematical Modeling and Numerical Analysis, 3(1), 35–43.
Polyak, B.T. (1969). The conjugate gradient method in extremal problems. USSR Computational Mathematics and Mathematical Physics, 9(4), 94–112.
Powell, M.J.D. (1984). Nonconvex minimization calculations and the conjugate gradient method. In: Lecture Notes in Mathematics, 1066(n/a), 122–41.
Wei, Z., Li, G. and Qi, L. (2006). New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems. Applied Mathematics and Computation, 179(2), 407–30.
Wolfe, P. (1969). Convergence conditions for ascent methods. SIAM Review, 11(2), 226–35.
Wolfe, P. (1971). Convergence conditions for ascent methods. II: Some corrections. SIAM Review, 13(2), 185–8.
Wei, Z., Yao, S. and Liu, L. (2006). The convergence properties of some new conjugate gradient methods. Applied Mathematics and Computation, 183(2), 1341–50.
Yousif, O.O.O. (2020). The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search. Applied Mathematics and Computation, 367(n/a), 124777.
Yuan, G., Wei, Z. and Lu, X. (2017). Global convergence of BFGS and PRP methods under a modified weak Wolfe–Powell line search. Applied Mathematical Modelling, 47(n/a), 811–25.
Zoutendijk, G. (1970). Nonlinear programming, computational methods. In: J. Abadie (ed.) Integer and Nonlinear Programming. Amsterdam: North-Holland.