A three-term conjugate gradient algorithm based on the Dai-Liao and Powell symmetric methods

Based on the Dai-Liao and Powell symmetric methods, we develop a new three-term conjugate gradient method for solving large-scale unconstrained optimization problems. The suggested method satisfies both the descent condition and the conjugacy condition. For uniformly convex functions, under standard assumptions, the global convergence of the algorithm is proved. Finally, some numerical results for the proposed method are given.


Introduction
Conjugate gradient (CG) methods comprise a class of unconstrained optimization algorithms characterized by low memory requirements and strong global convergence properties [3], which has made them popular among engineers and mathematicians engaged in solving large-scale problems of the form

min f(x), x ∈ R^n,   (1)

where f : R^n → R is a smooth nonlinear function whose gradient g(x) = ∇f(x) is available. The iterative formula of a CG method is given by

x_{k+1} = x_k + α_k d_k,   (2)

in which α_k is a step-length to be computed by a line search procedure and d_k is the search direction defined by

d_0 = −g_0,  d_{k+1} = −g_{k+1} + β_k d_k,   (3)

where β_k is a scalar known as the conjugate gradient parameter. The step-length α_k is usually chosen to satisfy certain line search conditions [15]. Among these, the Wolfe conditions

f(x_k + α_k d_k) ≤ f(x_k) + δ α_k g_k^T d_k,   (4)

g(x_k + α_k d_k)^T d_k ≥ σ g_k^T d_k,   (5)

where 0 < δ < σ < 1, are needed to ensure convergence and to enhance stability.
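As a quick illustration, the Wolfe conditions (4) and (5) can be checked numerically. The following Python sketch is not from the paper; the function names, test problem, and parameter values (δ = 10⁻⁴, σ = 0.9) are illustrative assumptions:

```python
import numpy as np

def wolfe_conditions(f, grad, x, d, alpha, delta=1e-4, sigma=0.9):
    """Check the Wolfe conditions (4)-(5) for step alpha along direction d.
    delta and sigma are illustrative defaults with 0 < delta < sigma < 1."""
    g = grad(x)
    x_new = x + alpha * d
    sufficient_decrease = f(x_new) <= f(x) + delta * alpha * (g @ d)
    curvature = grad(x_new) @ d >= sigma * (g @ d)
    return bool(sufficient_decrease and curvature)

# Example on the quadratic f(x) = 0.5 ||x||^2 from the point x = (1, 2).
f = lambda x: 0.5 * (x @ x)
grad = lambda x: x
x = np.array([1.0, 2.0])
d = -grad(x)                      # steepest-descent (hence descent) direction
print(wolfe_conditions(f, grad, x, d, alpha=1.0))
```

For this quadratic the unit step lands exactly at the minimizer, so both conditions hold; an overly long step violates the sufficient decrease condition (4).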
For nonlinear conjugate gradient methods, the pure conjugacy condition is [11]

d_{k+1}^T y_k = 0,   (7)

where y_k = g_{k+1} − g_k. The extension of the conjugacy condition was studied by Perry [11]. He tried to accelerate the conjugate gradient method by incorporating second-order information into it. Specifically, writing the quasi-Newton direction as

d_{k+1} = −H_{k+1} g_{k+1},   (8)

he used the secant condition

H_{k+1} y_k = s_k,   (9)

where s_k = x_{k+1} − x_k. By (8) and (9), the relation

d_{k+1}^T y_k = −g_{k+1}^T H_{k+1} y_k = −g_{k+1}^T s_k   (10)

holds. Taking this relation into account, Perry replaced the conjugacy condition (7) by the condition (10). Dai and Liao [5] generalized condition (10) to

d_{k+1}^T y_k = −t g_{k+1}^T s_k,   (11)

where t ≥ 0 is a scalar. In the case t = 0, (11) reduces to the usual conjugacy condition (7); in the case t = 1, (11) becomes Perry's condition (10). To ensure that the search direction d_{k+1} satisfies condition (11), substituting (3) into (11) gives the Dai-Liao formula

β_k^DL = g_{k+1}^T (y_k − t s_k) / (d_k^T y_k).   (12)

We note that the case t = 1 reduces to the Perry formula

β_k^P = g_{k+1}^T (y_k − s_k) / (d_k^T y_k),   (13)

and, if t = 0, β^DL reduces to the Hestenes-Stiefel parameter β^HS = g_{k+1}^T y_k / (d_k^T y_k). The approach of Dai and Liao (DL) has received special attention from many researchers. In several efforts, modified secant equations have been applied to modify the DL method. It is remarkable that the numerical performance of the DL method strongly depends on the parameter t, for which there is no optimal choice [2].
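To make the derivation concrete, the following sketch (with hypothetical vectors, not data from the paper) computes the Dai-Liao parameter (12) and verifies that the resulting direction satisfies the conjugacy condition (11) exactly, which follows by direct substitution of (3) into (11):

```python
import numpy as np

def beta_dl(g_new, y, s, d, t):
    """Dai-Liao parameter (12): beta = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k).
    t = 1 recovers Perry's formula (13); t = 0 recovers the HS formula."""
    return g_new @ (y - t * s) / (d @ y)

# The resulting direction d_{k+1} = -g_{k+1} + beta d_k satisfies the
# Dai-Liao condition (11): d_{k+1}^T y_k = -t g_{k+1}^T s_k.
g_new = np.array([0.5, -1.0, 0.3])
g_old = np.array([1.0, 2.0, -0.5])
d = np.array([-1.0, -2.0, 0.5])
s = 0.1 * d                       # s_k = alpha_k d_k with alpha_k = 0.1
y = g_new - g_old
t = 1.5
d_new = -g_new + beta_dl(g_new, y, s, d, t) * d
print(np.isclose(d_new @ y, -t * (g_new @ s)))   # prints True
```

The identity holds for any t, which is exactly why (12) is the natural choice of β_k enforcing (11).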
This paper is organized as follows. In Section 2 we briefly review three-term conjugate gradient methods. In Section 3, the proposed algorithm is stated. The properties and convergence results of the new method are given in Section 4. Numerical results and conclusions are presented in Sections 5 and 6, respectively.

Three-term conjugate gradient (CG) methods
Recently, many researchers have studied three-term conjugate gradient methods. For example, Narushima, Yabe and Ford [10] proposed a wider class of three-term conjugate gradient methods (called 3TCG) which always satisfy the sufficient descent condition. Shanno [14] used the well-known BFGS quasi-Newton method to obtain the following three-term CG method:

d_{k+1} = −g_{k+1} − [(1 + y_k^T y_k / s_k^T y_k)(s_k^T g_{k+1} / s_k^T y_k) − y_k^T g_{k+1} / s_k^T y_k] s_k + (s_k^T g_{k+1} / s_k^T y_k) y_k.
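Shanno's direction can equivalently be written as d_{k+1} = −H_{k+1} g_{k+1}, where H_{k+1} is the BFGS update of the identity matrix built from the pair (s_k, y_k). The sketch below (the helper name and vectors are mine, not the paper's; the three-term formula is my reconstruction of Shanno's memoryless-BFGS direction and should be verified against [14]) cross-checks the two forms:

```python
import numpy as np

def shanno_direction(g_new, s, y):
    """Three-term form of Shanno's memoryless-BFGS CG direction:
    d = -g - [(1 + y'y/s'y)(s'g)/(s'y) - (y'g)/(s'y)] s + [(s'g)/(s'y)] y.
    Reconstruction of the formula attributed to [14]; verify before use."""
    sty = s @ y
    a = (s @ g_new) / sty
    b = (y @ g_new) / sty
    return -g_new - ((1.0 + (y @ y) / sty) * a - b) * s + a * y

# Cross-check: the same direction equals -H g, where H is the BFGS update
# of the identity using (s, y). Vectors below are hypothetical test data.
g_new = np.array([0.4, -1.2, 0.7])
s = np.array([0.5, -0.3, 0.2])
y = np.array([1.0, 0.4, -0.6])
rho = 1.0 / (s @ y)
I = np.eye(3)
H = (I - rho * np.outer(s, y)) @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
print(np.allclose(shanno_direction(g_new, s, y), -H @ g_new))   # prints True
```

The equivalence explains why Shanno's method behaves like a memoryless quasi-Newton method while storing only a few vectors.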
Furthermore, Liu and Xu [9] generalized the Perry conjugate gradient algorithm (13), formulating the search directions in a corresponding three-term form.

A modified three-term conjugate gradient (CG) method
The aim of this section is to develop a modified three-term conjugate gradient method, which we call AKTCG, by combining the Powell symmetric (PS) method (15) with the Dai-Liao (DL) CG method given by (3) and (12). Consider the search direction given by Dai in (17); substituting into it, we obtain the direction (18). Equating equations (15) and (18) then determines the parameters of the new search direction. Note that, if the line search is exact, i.e. g_{k+1}^T d_k = 0, the formulas simplify accordingly. In the following we summarize our AKTCG algorithm.
Step 1: Given an initial point x_0 and a tolerance ε > 0, set d_0 = −g_0 and k = 0.
Step 2: If ‖g_k‖ ≤ ε, stop.
Step 3: Compute a step-length α_k satisfying the Wolfe conditions (4) and (5).
Step 4: Update the variables as x_{k+1} = x_k + α_k d_k.
Step 5: Compute the search direction d_{k+1} as in (20).
Step 6: Set k := k + 1 and go to Step 2.
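Since the displayed formulas for the AKTCG coefficients did not survive in the text, the following Python loop is only a structural sketch of a three-term CG iteration of this kind: it uses the Dai-Liao β of (12), an illustrative third-term weight, and a plain Armijo backtracking in place of a full Wolfe line search. All of these substitutions are assumptions for illustration, not the paper's actual method:

```python
import numpy as np

def three_term_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=500):
    """Structural sketch of a three-term CG loop (not the AKTCG formulas).
    Direction: d = -g_new + beta * d - theta * y, with beta the Dai-Liao
    parameter (12) and theta an illustrative weight; a descent safeguard
    restarts with -g_new whenever the trial direction is not descent."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing only the Armijo condition.
        alpha, delta = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + delta * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        dty = d @ y
        if abs(dty) > 1e-12:
            beta = g_new @ (y - t * s) / dty    # Dai-Liao beta (12)
            theta = (g_new @ d) / dty           # illustrative third-term weight
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new                          # restart
        if g_new @ d >= 0:                      # safeguard: keep descent
            d = -g_new
        x, g = x_new, g_new
    return x

x_star = three_term_cg(lambda x: x @ x, lambda x: 2.0 * x, np.array([3.0, -4.0]))
print(np.linalg.norm(x_star))
```

On this simple quadratic the loop drives the gradient norm to zero; the descent safeguard mirrors the role of the sufficient descent condition in the analysis of Section 4.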

Convergence analysis
Assume the following.

The level set S = {x ∈ R^n : f(x) ≤ f(x_0)} is contained in a neighborhood N in which f is continuously differentiable and bounded below, and the gradient g is Lipschitz continuous; that is, there exists a constant L > 0 such that

‖g(x) − g(y)‖ ≤ L ‖x − y‖ for all x, y ∈ N.

Observe that the assumption that the function f is bounded below is weaker than the usual assumption that the level set is bounded. Although the search directions generated by (20) are always descent directions, to ensure convergence of the algorithm we need to constrain the choice of the step length α_k. The following proposition shows that the Wolfe line search always gives a lower bound for the step length α_k. Based on the above assumptions, we shall show that our method satisfies the conjugacy condition and the sufficient descent condition, and that it is globally convergent under the Wolfe line search conditions. In Theorems 1 and 2 below we prove that our algorithm satisfies the conjugacy condition and the sufficient descent condition

g_k^T d_k ≤ −c ‖g_k‖², for all k,

for some constant c > 0.
Proof: The proof is by induction.
for all x on the line segment connecting x_k and x_{k+1}.   (24)

Numerical results and comparisons
In this section, we report some numerical results obtained with an implementation of the AKTCG algorithm. The code of the AKTCG algorithm is written in Fortran and compiled with f77 (default compiler settings), using the test environment taken from N. Andrei's web page. We selected 80 large-scale unconstrained optimization test functions in the generalized or extended form presented in [1]. From the performance profiles in Figures (1), (2) and (3) we see that AKTCG is the top performer.

Conclusion

In this paper, we have proposed a three-term conjugate gradient method based on the DL and PS methods which generates sufficient descent and conjugate directions. Our method has been shown to converge globally. In our numerical experiments, we confirmed the effectiveness of the proposed method by using performance profiles.
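The performance profiles used for the comparison above follow the standard Dolan-Moré methodology: for each tolerance τ, one plots the fraction of problems a solver handles within a factor τ of the best solver's cost. A minimal sketch of how such profiles are computed (the cost matrix below is hypothetical, not the paper's data):

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profile. T[i, j] is the cost (e.g. CPU time
    or iterations) of solver j on problem i, np.inf marking a failure.
    Returns, for each tau in taus, the fraction of problems each solver
    solves within a factor tau of the best cost on that problem."""
    ratios = T / T.min(axis=1, keepdims=True)
    return np.array([[np.mean(ratios[:, j] <= tau) for j in range(T.shape[1])]
                     for tau in taus])

# Hypothetical costs: 4 problems, 2 solvers (inf = failure of solver 0).
T = np.array([[1.0, 2.0],
              [3.0, 3.0],
              [np.inf, 5.0],
              [2.0, 1.0]])
rho = performance_profile(T, taus=[1.0, 2.0])
print(rho)
```

At τ = 1 the profile reads off how often each solver is strictly the best (ties count for both); as τ grows it measures robustness, with failed runs never counted.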