A modified conjugate gradient method is presented for solving unconstrained optimization problems, which possesses the following properties: (i) the sufficient descent property is satisfied without any line search; (ii) the search direction automatically lies in a trust region; (iii) the Zoutendijk condition holds for the Wolfe-Powell line search technique; (iv) this method inherits an important property of the well-known Polak-Ribiere-Polyak (PRP) method: the tendency to turn towards the steepest descent direction if a small step is generated away from the solution, which prevents a sequence of tiny steps. The global convergence and the linear convergence rate of the given method are established. Numerical results show that this method is promising.
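Property (iv) can be seen directly in the classical PRP update: a tiny step makes consecutive gradients nearly equal, so the PRP parameter vanishes and the direction collapses to steepest descent. A minimal sketch of the standard (unmodified) PRP direction, for reference only; the abstract's modified formula is not given here:

```python
import numpy as np

def prp_direction(g_new, g_old, d_old):
    """Classical Polak-Ribiere-Polyak direction update:
        beta_k = g_new^T (g_new - g_old) / ||g_old||^2
        d_new  = -g_new + beta_k * d_old
    When g_new is close to g_old (a tiny step), beta_k is close to 0
    and d_new falls back to the steepest descent direction -g_new.
    """
    beta = g_new @ (g_new - g_old) / (g_old @ g_old)
    return -g_new + beta * d_old
```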

A modified BFGS method is proposed for unconstrained optimization. Global convergence, and superlinear convergence for convex functions, are established under suitable assumptions. Numerical results show that this method is promising.
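For context, the textbook BFGS update of the inverse-Hessian approximation (not the specific modification of this paper) can be sketched as:

```python
import numpy as np

def bfgs_update(H, s, y):
    """Standard BFGS update of the inverse-Hessian approximation H,
    with s = x_{k+1} - x_k and y = g_{k+1} - g_k (requires s^T y > 0):
        H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,  rho = 1/(y^T s)
    The result is symmetric and satisfies the secant equation H+ y = s.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```

Modified BFGS methods of the kind described typically alter `y` or add a correction term so that positive definiteness and convergence hold under weaker assumptions; the curvature condition sᵀy > 0 is what the standard update needs.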

Abstract It is well known that the search direction plays a main role in line search methods. In this paper, we propose a new search direction together with the Wolfe line search technique and a nonmonotone line search technique for solving unconstrained optimization problems. The given methods possess the sufficient descent property without carrying out any line search rule. Convergence results are established under suitable conditions. For numerical results, analysis of one probability s...

This paper gives a modified PRP method which possesses global convergence for nonconvex functions and an R-linear convergence rate for uniformly convex functions. Furthermore, the presented method has the sufficient descent property and automatically remains in a trust region without carrying out any line search technique. Numerical results indicate that the new method is promising for the given test problems.

We propose a rank-one fitting method for solving unconstrained optimization problems. This method possesses two favorable properties: (a) the updated matrices are guaranteed to be symmetric and positive definite; (b) global convergence for non-convex functions is established under suitable conditions. Numerical results are reported.

It is well known that the sufficient descent condition is very important to the global convergence of the nonlinear conjugate gradient method. In this paper, some modified conjugate gradient methods which possess this property are presented. The global convergence of these proposed methods with the weak Wolfe–Powell (WWP) line search rule is established for nonconvex functions under suitable conditions. Numerical results are reported.

In this paper, a two-point stepsize gradient method combined with a new nonmonotone line search technique is proposed. We establish its global convergence without assuming Lipschitz continuity of the objective function.
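The two-point stepsize referred to here is commonly the Barzilai–Borwein rule α_k = sᵀs / sᵀy built from the previous two iterates. A bare sketch without the paper's nonmonotone line search safeguard, so this plain version is only reliable on well-behaved problems such as convex quadratics:

```python
import numpy as np

def bb_gradient_method(grad, x0, alpha0=1.0, iters=50):
    """Plain two-point stepsize gradient iteration with the
    Barzilai-Borwein rule alpha_k = s^T s / (s^T y), where
    s = x_k - x_{k-1} and y = g_k - g_{k-1}.
    No nonmonotone safeguard is included here, so this bare version
    is only reliable on well-behaved problems (e.g. convex quadratics).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if abs(s @ y) > 1e-12:           # skip update at (near) stationarity
            alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x

# example: quadratic f(x) = 0.5 x^T A x with gradient A x
A = np.diag([1.0, 10.0])
x_star = bb_gradient_method(lambda v: A @ v, [5.0, 3.0])
```

The nonmonotone line search added in the paper is exactly what makes such an iteration safe beyond this setting, since the BB step can temporarily increase the objective.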

Abstract In this paper, a new conjugate gradient formula $\beta_k^*$ is given to compute the search directions for unconstrained optimization problems. General convergence results for the proposed formula with several line searches, such as the exact line search, the Wolfe–Powell line search and the Grippo–Lucidi line search, are discussed. Under these line searches and some assumptions, the global convergence properties of the given methods are established. The given formula satisfies $\beta_k^* \geqslant 0$ and has the simi...

In this paper, we propose a modified Polak-Ribiere-Polyak (PRP) conjugate gradient method. An attractive property of the proposed method is that the direction it generates is always a descent direction for the objective function. This property is independent of the line search used. Moreover, if an exact line search is used, the method reduces to the ordinary PRP method. Under appropriate conditions, we show that the modified PRP method with an Armijo-type line search is globally convergen...
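One common way to obtain a PRP-type direction that is always descent, independent of the line search, is a three-term correction that enforces g_kᵀd_k = -‖g_k‖². The sketch below is an illustrative form of this idea, not necessarily the exact update of the paper:

```python
import numpy as np

def descent_modified_prp(g_new, g_old, d_old):
    """Three-term PRP-type direction engineered so that
    g_new^T d_new = -||g_new||^2 holds for any line search:
        y     = g_new - g_old
        beta  = g_new^T y / ||g_old||^2      (the PRP parameter)
        theta = g_new^T d_old / ||g_old||^2
        d_new = -g_new + beta * d_old - theta * y
    The beta*(g^T d_old) and theta*(g^T y) terms cancel exactly,
    leaving g_new^T d_new = -||g_new||^2.
    """
    y = g_new - g_old
    denom = g_old @ g_old
    beta = (g_new @ y) / denom
    theta = (g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y
```

Note also that under an exact line search g_newᵀd_old = 0, so θ = 0 and the update reduces to ordinary PRP, matching the reduction property stated in the abstract.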

Last author: Liqun Qi (Hong Kong Polytechnic University)


Abstract We propose new conjugate gradient formulas for computing the search directions for unconstrained optimization problems. The new formulas turn out to be the conjugate descent formula if exact line searches are made. Some formulas possess the sufficient descent property without any line searches. General convergence results for the proposed formulas with the weak Wolfe–Powell conditions are studied. We prove that some of the formulas with the steplength technique which ensures the Zoutend...
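For reference, Fletcher's conjugate descent (CD) parameter mentioned above is β_k = ‖g_{k+1}‖² / (−d_kᵀg_k); under an exact line search −d_kᵀg_k = ‖g_k‖², so it coincides with the Fletcher–Reeves value. A minimal sketch:

```python
import numpy as np

def cd_beta(g_new, g_old, d_old):
    """Fletcher's conjugate descent (CD) parameter:
        beta_k = ||g_new||^2 / (-d_old^T g_old)
    Under an exact line search -d_old^T g_old = ||g_old||^2,
    so beta_k reduces to the Fletcher-Reeves value.
    """
    return (g_new @ g_new) / (-(d_old @ g_old))
```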

The main work of this paper focuses on identifying the heat flux in the inverse problem of a two-dimensional nonhomogeneous parabolic equation, which has wide applications in industrial fields such as steel-making and continuous casting. Firstly, the existence of the weak solution of the inverse problem is discussed. With the help of the forward solution and the dual equation, this paper proves the Lipschitz continuity of the cost function and derives the Lipschitz constant. Furthermore, in order to accelerat...

Second author: Xiaoliang Wang (Dalian University of Technology)

Last author: Zhou Sheng (Nanjing University of Aeronautics and Astronautics)


There are two open problems for nonconvex functions under the weak Wolfe–Powell line search in unconstrained optimization. The first is the global convergence of the Polak–Ribiere–Polyak conjugate gradient algorithm, and the second is the global convergence of the Broyden–Fletcher–Goldfarb–Shanno quasi-Newton method. Many scholars have shown that these methods can fail to converge, even under an exact line search. Two circle counterexamples were proposed to generate the nonconvergence of t...

Second author: Xiaoliang Wang (Dalian University of Technology)

Last author: Zhou Sheng (Nanjing University of Aeronautics and Astronautics)


It is well known that conjugate gradient algorithms are widely applied in many practical fields, for instance, engineering problems and finance models, as they are straightforward and characterized by a simple structure and low storage requirements. However, challenging problems remain, such as the convergence of PRP algorithms for nonconvex functions under an inexact line search, obtaining sufficient descent for all conjugate gradient methods, and other theoretical properties regarding global convergence and the ...

Abstract In this paper, a nonlinear conjugate gradient algorithm is presented. The given algorithm possesses the following properties: (i) the sufficient descent property is satisfied; (ii) the trust region feature holds; (iii) global convergence is proved for nonconvex functions; (iv) applications to the image restoration problem demonstrate the performance of the proposed algorithm. Note that properties (i) and (ii) are obtained without any other special conditions.

Nonlinear conjugate gradient (CG) methods are widely used for solving large-scale unconstrained optimization problems, and many studies have been devoted to developing and improving these methods. In this paper, we aim to study the global convergence of the BBB conjugate gradient method with an exact line search.

Last author: Yuning Yang (Guangxi University)


Abstract Powell (1984) and Dai (2003) each constructed a counterexample showing that the Polak–Ribiere–Polyak (PRP) conjugate gradient algorithm fails to converge globally for nonconvex functions even when an exact line search technique is used, which implies a similar failure under the weak Wolfe–Powell (WWP) inexact line search technique. Does another inexact line search technique exist that can ensure global convergence for nonconvex functions? This paper gives a positive answer to this q...
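The weak Wolfe–Powell conditions referred to throughout these abstracts are f(x+αd) ≤ f(x) + δα gᵀd (sufficient decrease) and ∇f(x+αd)ᵀd ≥ σ gᵀd (curvature) with 0 < δ < σ < 1. A small checker; the parameter values δ = 10⁻⁴ and σ = 0.9 are common defaults, not taken from any of these papers:

```python
import numpy as np

def satisfies_wwp(f, grad, x, d, alpha, delta=1e-4, sigma=0.9):
    """Check the weak Wolfe-Powell (WWP) conditions for stepsize alpha:
        f(x + a d) <= f(x) + delta * a * g^T d   (sufficient decrease)
        grad(x + a d)^T d >= sigma * g^T d       (curvature)
    with 0 < delta < sigma < 1.
    """
    g_d = grad(x) @ d
    x_new = x + alpha * d
    suff = f(x_new) <= f(x) + delta * alpha * g_d
    curv = grad(x_new) @ d >= sigma * g_d
    return bool(suff and curv)
```

The curvature condition is what rules out arbitrarily small steps; a step that decreases f but leaves the directional derivative nearly unchanged is rejected.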