IJPAM: Volume 92, No. 2 (2014)


Tahar Bouali$^1$, Yamina Laskri$^2$
$^{1,2}$Department of Mathematics
Badji Mokhtar University
Annaba 23000, ALGERIA

Abstract. In this paper, an efficient new nonlinear conjugate gradient method is proposed for unconstrained optimization problems. The method possesses the following property: the sufficient descent condition $g_{k}^{T}d_{k}=-\left\Vert g_{k}\right\Vert ^{2}$ holds without any line search. Under the strong Wolfe non-monotone line search, we prove the global convergence of the FR method for strongly convex functions.

The numerical experiments show that the FR method is especially efficient.
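For readers unfamiliar with the method under study, the classical Fletcher-Reeves (FR) nonlinear conjugate gradient iteration can be sketched as follows. This is a generic illustration only: it uses a simple backtracking Armijo line search with a steepest-descent restart as a stand-in for the strong Wolfe non-monotone line search analyzed in the paper, and the test function is an arbitrary strongly convex quadratic, not one of the paper's experiments.

```python
import numpy as np

def fr_conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=1000):
    """Fletcher-Reeves (FR) nonlinear conjugate gradient sketch.

    Illustrative only: a backtracking Armijo line search replaces the
    strong Wolfe non-monotone line search used in the paper's analysis.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction is steepest descent, so g^T d = -||g||^2
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:
            d = -g  # safeguard: restart if d is not a descent direction
        # Backtracking Armijo line search (stand-in for strong Wolfe)
        alpha, rho, c1 = 1.0, 0.5, 1e-4
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # FR update parameter: beta_k = ||g_{k+1}||^2 / ||g_k||^2
        beta_fr = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta_fr * d
        x, g = x_new, g_new
    return x

# Usage: minimize a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = fr_conjugate_gradient(f, grad, np.zeros(2))
```

On strongly convex problems such as this quadratic, the iterate converges to the unique minimizer, consistent with the convergence setting considered in the paper.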

Received: December 22, 2013

AMS Subject Classification: 65K10, 90C30

Key Words and Phrases: conjugate gradient, sufficient descent, non-monotone line search, global convergence, unconstrained optimization


DOI: 10.12732/ijpam.v92i2.7

International Journal of Pure and Applied Mathematics
ISSN printed version: 1311-8080
ISSN on-line version: 1314-3395
Year: 2014
Volume: 92
Issue: 2
Pages: 225 - 242

This work is licensed under the Creative Commons Attribution International License (CC BY).