A new non-linear conjugate gradient method based on the Dai-Liao and Kafaki-Ghanbari methods

Based on the Dai-Liao and Kafaki-Ghanbari methods, a new non-linear conjugate gradient method is proposed. Under proper conditions, it is briefly shown that the proposed method possesses the descent property and generates conjugate directions. We also show that the suggested method with the Wolfe line search conditions is globally convergent. Numerical results illustrate that the suggested method can efficiently solve the test problems and is therefore promising.


Introduction
Recently, due to their strong global convergence properties and low memory requirements, conjugate gradient (CG) methods have become an attractive choice for efficiently solving large-scale unconstrained optimization problems [4]. We refer to the excellent survey [8] for a review of recent advances in this area.
The conjugacy condition is an important ingredient of CG methods. The search directions in CG methods are often selected in such a way that, when the method is applied to minimize a strongly convex quadratic function, two successive directions are conjugate with respect to the Hessian of the quadratic function, provided no round-off error exists. That is to say, minimizing a convex quadratic function over the subspace spanned by a set of mutually conjugate directions is equivalent to minimizing this function along each conjugate direction in turn. For a general nonlinear function, however, the search directions in most methods fail to satisfy the conjugacy condition. This observation motivates the search for efficient conjugacy conditions for solving unconstrained problems [8].
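The conjugacy property described above can be verified numerically on a small quadratic. The following Python sketch (our own illustration, not code from the paper) uses an exact line search and the Fletcher-Reeves parameter to generate two successive CG directions and checks that they are conjugate with respect to the Hessian:

```python
import numpy as np

# For a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, two
# successive exact-line-search CG directions satisfy d_{k+1}^T A d_k = 0.
A = np.array([[4.0, 1.0], [1.0, 3.0]])   # SPD Hessian (illustrative choice)
b = np.array([1.0, 2.0])
x = np.zeros(2)
g = A @ x - b
d0 = -g                                  # first direction: steepest descent
alpha = -(g @ d0) / (d0 @ A @ d0)        # exact line search step on a quadratic
x1 = x + alpha * d0
g1 = A @ x1 - b
beta = (g1 @ g1) / (g @ g)               # Fletcher-Reeves beta
d1 = -g1 + beta * d0                     # second direction
print(np.isclose(d1 @ A @ d0, 0.0))      # True: d1 and d0 are A-conjugate
```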
Consider the following unconstrained optimization problem:
\min_{x \in \mathbb{R}^n} f(x), \quad (1)
where f : \mathbb{R}^n \to \mathbb{R} is a smooth nonlinear function and its gradient g(x) = \nabla f(x) is available. The iterative formula of a CG method is given by
x_{k+1} = x_k + \alpha_k d_k, \quad (2)
where d_k is a search direction updated by
d_{k+1} = -g_{k+1} + \beta_k d_k, \quad d_0 = -g_0, \quad (3)
and the step size \alpha_k > 0 is commonly chosen to satisfy certain line search conditions [11]. Among them, the so-called Wolfe conditions have attracted special attention in the convergence analyses and the implementations of CG methods, requiring that
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k, \quad (4)
g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k, \quad (5)
where 0 < \delta < \sigma < 1. The stronger version of the Wolfe line search conditions consists of (4) and
|g(x_k + \alpha_k d_k)^T d_k| \le -\sigma g_k^T d_k, \quad (6)
and is often imposed on the line search.
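The strong Wolfe conditions (4) and (6) can be tested directly for a candidate step size. The Python sketch below is our own illustration under standard parameter choices (delta and sigma values are ours, not the paper's):

```python
import numpy as np

def strong_wolfe(f, grad, x, d, alpha, delta=1e-4, sigma=0.9):
    """Check the strong Wolfe conditions for step size alpha along
    direction d, with 0 < delta < sigma < 1 (a sketch; parameter
    defaults are common textbook choices)."""
    g = grad(x)
    x_new = x + alpha * d
    # Sufficient decrease condition (4).
    armijo = f(x_new) <= f(x) + delta * alpha * (g @ d)
    # Strong curvature condition (6).
    curvature = abs(grad(x_new) @ d) <= -sigma * (g @ d)
    return armijo and curvature

# Example on the quadratic f(x) = 0.5 x^T x, whose minimizer is the origin.
f = lambda x: 0.5 * (x @ x)
grad = lambda x: x
x = np.array([1.0, 2.0])
d = -grad(x)                  # steepest-descent direction
print(strong_wolfe(f, grad, x, d, alpha=1.0))  # True: the unit step is exact here
```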
More recently, many researchers highlighted two properties in designing new CG methods, the first is the conjugacy condition and the second is the descent property, which play a crucial role in obtaining global convergence and nice actual performance.
In order to accelerate the CG method, the conjugacy condition is often utilized to make the order of accuracy in the approximation of the curvature of the function as high as possible. By modifying the HS method, Dai and Liao [3] proposed the following Dai-Liao (DL) conjugacy condition:
d_{k+1}^T y_k = -t\, g_{k+1}^T s_k, \quad t \ge 0,
where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
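The DL parameter \beta_k^{DL} = g_{k+1}^T y_k / (d_k^T y_k) - t\, g_{k+1}^T s_k / (d_k^T y_k) enforces this condition exactly, which can be checked numerically. The following Python sketch is our own illustration (random test vectors, not data from the paper):

```python
import numpy as np

def beta_dl(g_new, g_old, d, s, t):
    """Dai-Liao CG parameter:
    beta^DL = g_{k+1}^T y_k / (d_k^T y_k) - t * g_{k+1}^T s_k / (d_k^T y_k),
    with y_k = g_{k+1} - g_k and s_k = x_{k+1} - x_k."""
    y = g_new - g_old
    return (g_new @ y - t * (g_new @ s)) / (d @ y)

# Check the DL conjugacy condition d_{k+1}^T y_k = -t g_{k+1}^T s_k
# for arbitrary vectors (the identity holds by construction).
rng = np.random.default_rng(0)
g_new, g_old, d, s = (rng.standard_normal(3) for _ in range(4))
t = 0.5
y = g_new - g_old
d_new = -g_new + beta_dl(g_new, g_old, d, s, t) * d
print(np.isclose(d_new @ y, -t * (g_new @ s)))  # True
```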

Kirkuk University Journal /Scientific Studies (KUJSS)
The paper is organized as follows. Section 2 describes the suggested method and its properties. In Section 3 the global convergence analysis for the proposed method is discussed. Section 4 is devoted to providing numerical results.

Derivation of the new CG method (AK1)
The aim of this section is to derive a new conjugate gradient method, denoted AK1 (after Aynur and Khalil), by combining the Dai-Liao and Kafaki-Ghanbari CG methods.

Consider the search direction given by Dai and Liao:
d_{k+1} = -g_{k+1} + \beta_k^{DL} d_k, \quad \beta_k^{DL} = \frac{g_{k+1}^T y_k}{d_k^T y_k} - t\, \frac{g_{k+1}^T s_k}{d_k^T y_k}.
It is remarkable that the numerical performance of the DL method depends strongly on the parameter t, for which there is no optimal choice [1]. There have been attempts to find an ideal value for t. We suggest the following value for t.
Therefore, if we substitute the above value of t into the DL method, we obtain the new search direction (AK1). The suggested AK1 algorithm can be defined as follows:
Step (1): Select a starting point x_0 and positive values for \delta and \sigma. Set d_0 = -g_0 and k = 0.
Step (2): Test for convergence. If \|g_k\| \le \epsilon, then stop, x_k is optimal; otherwise go to Step (3).
Step (3): Compute a step size \alpha_k satisfying the Wolfe conditions (4) and (5).
Step (4): Update the variables as x_{k+1} = x_k + \alpha_k d_k.
Step (5): Compute the search direction d_{k+1} by the AK1 formula, set k = k + 1, and go to Step (2).
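The loop structure of the steps above can be sketched in Python for a DL-type direction. This is our own minimal illustration, not the paper's implementation: the paper's adaptive choice of t is not reproduced (t is left as a user parameter), and an exact line search on a quadratic test function stands in for the Wolfe search:

```python
import numpy as np

def dl_cg(A, b, x0, t=1.0, eps=1e-8, max_iter=200):
    """Sketch of a Dai-Liao-type CG loop on the quadratic
    f(x) = 0.5 x^T A x - b^T x (gradient g = A x - b). The exact step
    alpha = -(g^T d)/(d^T A d) replaces the Wolfe search, and t is a
    fixed user-supplied parameter (an assumption of this sketch)."""
    x = x0.copy()
    g = A @ x - b
    d = -g                                  # Step 1: d_0 = -g_0
    for k in range(max_iter):
        if np.linalg.norm(g) <= eps:        # Step 2: convergence test
            break
        alpha = -(g @ d) / (d @ (A @ d))    # Step 3: exact step on a quadratic
        x_new = x + alpha * d               # Step 4: update the iterate
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        beta = (g_new @ y - t * (g_new @ s)) / (d @ y)
        d = -g_new + beta * d               # Step 5: new DL-type direction
        x, g = x_new, g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])      # SPD test Hessian
b = np.array([1.0, 2.0])
x_star = dl_cg(A, b, np.zeros(2))
print(np.allclose(A @ x_star, b))           # True: the stationary point solves Ax = b
```

On a quadratic with exact line search, g_{k+1}^T s_k = 0, so the method reduces to HS and terminates in at most n iterations.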

Proof:
The proof is by induction.
The proof is complete.

Convergence analysis
Assume the following:
(1) The level set S = \{x \in \mathbb{R}^n : f(x) \le f(x_0)\} is bounded.
(2) In a neighborhood N of S, the function f is continuously differentiable and its gradient is Lipschitz continuous; that is, there exists a constant L > 0 such that \|g(x) - g(y)\| \le L \|x - y\| for all x, y \in N.
Under these assumptions on f, there exists a constant \gamma > 0 such that \|g(x)\| \le \gamma for all x \in S. Observe that the assumption that the function f is bounded below is weaker than the usual assumption that the level set is bounded.
From the above relation we get:

Numerical results and comparisons
In this section, we report some numerical results on 75 nonlinear unconstrained test problems.
The following CG methods in the form of (2) and (3) are compared with respect to the total number of iterations (iter), the total number of function and gradient evaluations (fg), and the total time required for solving the test problems.

Conclusions
In this paper we have developed a new conjugate gradient method, based on the Dai-Liao and Kafaki-Ghanbari CG methods, which generates sufficient descent search directions. Under suitable assumptions, our method has been shown to converge globally.