
A NEW HYBRID CONJUGATE GRADIENT METHOD
AND ITS GLOBAL CONVERGENCE FOR
UNCONSTRAINED OPTIMIZATION

Shaogang Li$^1$, Zhongbo Sun$^2$
$^1$School of Mathematics and Computational Science
Guilin University of Electronic Technology
Guilin, 541004, P.R. CHINA
e-mail: [email protected]
$^2$Department of Mathematical Education
College of Humanities and Sciences
Northeast Normal University
Changchun, 130117, P.R. CHINA
e-mail: [email protected]


Abstract. In this paper, a new hybrid conjugate gradient method is proposed for solving unconstrained optimization problems. The parameter $\beta_k$ is computed as a convex combination of the parameters $\beta_k^{FR}$ and $\beta_k^{*}$, i.e. $\beta_k^{N}=(1-\theta_k)\beta_k^{FR}+\theta_k\beta_k^{*}$. The parameter $\theta_k$ is computed in such a way that the search direction of the conjugate gradient algorithm satisfies the Newton equation $\nabla^2f(x^{k+1})s^k=y^k$, and the resulting direction is a sufficient descent direction at every iteration. The theoretical analysis shows that the algorithm is globally convergent under some suitable conditions. Numerical results show that the new algorithm is effective for unconstrained optimization problems.
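The hybrid update described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' algorithm: the backtracking Armijo line search, the second parameter $\beta_k^{*}$ (passed in as `beta_star_fn`), and the choice of $\theta_k$ (passed in as `theta_fn` and clamped to $[0,1]$) are all placeholders, since the paper's exact formulas are not given in this excerpt. Only the convex combination $\beta_k^{N}=(1-\theta_k)\beta_k^{FR}+\theta_k\beta_k^{*}$ and the standard Fletcher-Reeves formula are taken as stated.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta_fn, beta_star_fn, tol=1e-6, max_iter=1000):
    """Sketch of a hybrid conjugate gradient iteration with
    beta_N = (1 - theta) * beta_FR + theta * beta_star."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search -- a standard stand-in,
        # not the line-search rule used in the paper.
        alpha, rho, c = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g     # s^k and y^k from the Newton equation
        beta_fr = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves parameter
        beta_star = beta_star_fn(g, g_new, d)   # paper's second formula (placeholder)
        theta = min(max(theta_fn(s, y, g_new, d), 0.0), 1.0)  # clamp to [0, 1]
        beta = (1.0 - theta) * beta_fr + theta * beta_star
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:         # safeguard: restart if descent is lost
            d = -g_new
        x, g = x_new, g_new
    return x
```

For example, on a convex quadratic one might supply the Polak-Ribiere formula as a stand-in for $\beta_k^{*}$ and a constant $\theta_k=0.5$; both choices are hypothetical and serve only to exercise the convex combination.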

Received: April 6, 2010

AMS Subject Classification: 90C30

Key Words and Phrases: hybrid conjugate gradient method, large scale matrix, quasi-Newton matrix, sufficient descent direction

Source: International Journal of Pure and Applied Mathematics
ISSN: 1311-8080
Year: 2010
Volume: 63
Issue: 3