# Strong and Weak Convergence of the Modified Proximal Point Algorithms in Hilbert Space

Xinkuan Chai^1, Bo Li^2, and Yisheng Song^1 (email author)

**2010**:240450

**DOI:** 10.1155/2010/240450

© Xinkuan Chai et al. 2010

**Received:** 26 October 2009

**Accepted:** 10 December 2009

**Published:** 12 January 2010

## Abstract

For a monotone operator $A$ on a Hilbert space $H$, we shall show weak convergence of Rockafellar's proximal point algorithm to some zero of $A$, and strong convergence of the perturbed version of Rockafellar's algorithm to $P_{A^{-1}0}u$, under some relaxed conditions, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$. Moreover, our proof techniques are simpler than those of some existing results.

## 1. Introduction

Throughout this paper, let $H$ be a real Hilbert space with inner product $\langle\cdot,\cdot\rangle$ and norm $\|\cdot\|$, and let $I$ be the identity operator on $H$. We shall denote by $\mathbb{N}$ the set of all positive integers, by $A^{-1}0$ the set of all zeros of $A$, that is, $A^{-1}0=\{x\in D(A): 0\in Ax\}$, and by $F(T)$ the set of all fixed points of $T$, that is, $F(T)=\{x\in D(T): Tx=x\}$. When $\{x_n\}$ is a sequence in $H$, then $x_n\to x$ (resp., $x_n\rightharpoonup x$, $x_n\stackrel{*}{\rightharpoonup}x$) will denote strong (resp., weak, weak$^*$) convergence of the sequence $\{x_n\}$ to $x$.

Let $A\subset H\times H$ be an operator with domain $D(A)$ and range $R(A)$ in $H$. Recall that $A$ is said to be *monotone* if

$$\langle x-y,\, u-v\rangle\geq 0\quad\text{for all } (x,u),\,(y,v)\in A.$$

A monotone operator $A$ is said to be *maximal monotone* if $A$ is monotone and $R(I+rA)=H$ for all $r>0$.

In fact, the theory of monotone operators is very important in nonlinear analysis and is connected with the theory of differential equations. It is well known (see [1]) that many physically significant problems can be modeled by initial-value problems of the form

$$x'(t)+Ax(t)=0,\qquad x(0)=x_0,$$

where $A$ is a monotone operator in an appropriate space. Typical examples where such evolution equations occur can be found in the heat, wave, or Schrödinger equations. On the other hand, a variety of problems, including convex programming and variational inequalities, can be formulated as finding a zero of a monotone operator. The problem of finding a solution $x$ with $0\in Ax$ has therefore been investigated by many researchers; see, for example, Bruck [2], Rockafellar [3], Brézis and Lions [4], Reich [5, 6], Nevanlinna and Reich [7], Bruck and Reich [8], Jung and Takahashi [9], Khang [10], Minty [11], Xu [12], and others. Some of them dealt with weak convergence, and others proved strong convergence theorems by imposing strong assumptions on $A$.

One popular method of solving $0\in Ax$ is the proximal point algorithm of Rockafellar [3], which is recognized as a powerful and successful algorithm for finding a zero of a monotone operator. Starting from any initial guess $x_0\in H$, this proximal point algorithm generates a sequence $\{x_n\}$ given by

$$x_{n+1}=J_{c_n}x_n+e_n,$$

where $J_{c_n}=(I+c_nA)^{-1}$ for all $n\geq 0$ is the resolvent of $A$ on the space $H$ and $\{e_n\}$ is a sequence of computational errors. Rockafellar [3] proved the weak convergence of his algorithm (1.3) provided that the regularization sequence $\{c_n\}$ remains bounded away from zero and the error sequence $\{e_n\}$ satisfies the condition $\sum_{n=0}^{\infty}\|e_n\|<\infty$. Güler's example [13], however, shows that in an infinite-dimensional Hilbert space, Rockafellar's algorithm (1.3) has only weak convergence. Recently, several authors have proposed modifications of Rockafellar's proximal point algorithm (1.3) so as to obtain strong convergence. For example, Solodov and Svaiter [14] and Kamimura and Takahashi [15] studied a modified proximal point algorithm with an additional projection at each step of the iteration. Lehdili and Moudafi [16] obtained the convergence of the sequence $\{x_n\}$ generated by the algorithm

$$x_{n+1}=J_{c_n}^{A_n}x_n,$$
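As an illustration only (our example, not from the paper), Rockafellar's iteration can be run for the simplest maximal monotone operator, $A=\nabla f$ with $f(x)=\tfrac12(x-b)^2$ on $\mathbb{R}$, where the resolvent has a closed form and the unique zero of $A$ is $b$:

```python
# Illustration only (not from the paper): Rockafellar's proximal point
# iteration x_{n+1} = J_{c_n} x_n + e_n for the maximal monotone operator
# A = grad f with f(x) = (x - b)^2 / 2 on R, so Ax = x - b, A^{-1}0 = {b},
# and the resolvent has the closed form J_c x = (x + c*b) / (1 + c).
b = 2.0                                # the unique zero of A

def resolvent(x, c):                   # J_c = (I + cA)^{-1}
    return (x + c * b) / (1.0 + c)

x = 10.0                               # initial guess x_0
for n in range(60):
    c_n = 1.0                          # c_n bounded away from zero
    e_n = 0.0                          # errors taken to be zero in this
                                       # sketch; the theorem allows
                                       # summable nonzero e_n
    x = resolvent(x, c_n) + e_n

print(abs(x - b))                      # distance to the zero of A
```

With $c_n\equiv 1$ the distance to $b$ is halved at every step, so after 60 iterations the iterate is numerically indistinguishable from the zero.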

where $A_n=\mu_nI+A$ is viewed as a Tikhonov regularization of $A$. Using the technique of variational distance, Lehdili and Moudafi [16] were able to prove convergence theorems for the algorithm (1.4) and its perturbed version, under certain conditions imposed on the sequences $\{c_n\}$ and $\{\mu_n\}$. For a maximal monotone operator $A$, Xu [12] and Song and Yang [17] used the technique of nonexpansive mappings to obtain convergence theorems for $\{x_n\}$ defined by the perturbed version of the algorithm (1.4):

$$x_{n+1}=J_{c_n}\big(t_nu+(1-t_n)x_n+e_n\big).$$

In this paper, under more relaxed conditions on the sequences $\{c_n\}$ and $\{t_n\}$, we shall show that the sequence $\{x_n\}$ generated by (1.5) converges strongly to $P_{A^{-1}0}u$ (where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$) and that the sequence generated by (1.3) converges weakly to some $v\in A^{-1}0$. Moreover, our proof techniques are simpler than those of Lehdili and Moudafi [16], Xu [12], and Song and Yang [17].

## 2. Preliminaries and Basic Results

Let $A$ be a monotone operator on $H$. For $r>0$, we use $J_r$ and $A_r$ to denote the resolvent and the Yosida approximation of $A$, respectively. Namely,

$$J_rx=(I+rA)^{-1}x,\qquad A_rx=\frac{x-J_rx}{r}.$$

For $r>0$ and $x\in R(I+rA)$, the following facts are well known. For more details, see [18, pages 369–400] or [3, 19].

(i) $A_rx\in AJ_rx$;

(ii) $F(J_r)=A^{-1}0$;

(iii) $J_r$ is a single-valued nonexpansive mapping for each $r>0$ (i.e., $\|J_rx-J_ry\|\leq\|x-y\|$ for all $x,y\in R(I+rA)$).

In the rest of this paper, it is always assumed that $A^{-1}0$ is nonempty, so that the metric projection $P_{A^{-1}0}$ from $H$ onto $A^{-1}0$ is well defined. It is known that $P_{A^{-1}0}$ is nonexpansive and is characterized by the following inequality: given $x\in H$ and $v\in A^{-1}0$; then $v=P_{A^{-1}0}x$ if and only if

$$\langle x-v,\, y-v\rangle\leq 0\quad\text{for all } y\in A^{-1}0.$$
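As an illustration only (our example, not the paper's), the resolvent facts above can be checked numerically for the monotone operator $Ax=x^3$ on $\mathbb{R}$: the equation $y+ry^3=x$ has a unique solution (single-valuedness of $J_r$), $J_r$ is nonexpansive, and the Yosida approximation satisfies $A_rx=A(J_rx)$. The helper `J` below is our own bisection-based implementation.

```python
# Numerical sanity check (illustration only) for A x = x^3 on R:
# J_r x solves y + r*y^3 = x; the left-hand side is strictly increasing
# in y, so the resolvent is single-valued.  We check nonexpansiveness of
# J_r and the identity A_r x = (x - J_r x)/r = (J_r x)^3 = A(J_r x).
import random

def J(x, r):
    lo, hi = -abs(x) - 1.0, abs(x) + 1.0    # bracket the unique root
    for _ in range(200):                    # bisection
        mid = (lo + hi) / 2.0
        if mid + r * mid ** 3 < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

random.seed(0)
r = 0.7
for _ in range(100):
    x, y = random.uniform(-5, 5), random.uniform(-5, 5)
    assert abs(J(x, r) - J(y, r)) <= abs(x - y) + 1e-9   # nonexpansive
    A_r = (x - J(x, r)) / r                              # Yosida approximation
    assert abs(A_r - J(x, r) ** 3) < 1e-6                # A_r x = A(J_r x)

print("resolvent checks passed")
```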

In order to facilitate our investigation in the next section, we list a useful lemma.

Lemma 2.1 (see Xu [20]).

Let $\{s_n\}$ be a sequence of nonnegative real numbers satisfying

$$s_{n+1}\leq(1-\alpha_n)s_n+\alpha_n\beta_n+\gamma_n,\quad n\geq 0,$$

where $\{\alpha_n\}$, $\{\beta_n\}$, and $\{\gamma_n\}$ satisfy the conditions: (i) $\{\alpha_n\}\subset[0,1]$ and $\sum_{n=0}^{\infty}\alpha_n=\infty$; (ii) either $\limsup_{n\to\infty}\beta_n\leq 0$ or $\sum_{n=0}^{\infty}|\alpha_n\beta_n|<\infty$; (iii) $\gamma_n\geq 0$ for all $n\geq 0$ and $\sum_{n=0}^{\infty}\gamma_n<\infty$. Then $s_n$ converges to zero as $n\to\infty$.
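The lemma can be illustrated numerically (the parameter choices below are ours, not the paper's): with $\alpha_n=1/n$, $\beta_n=1/\sqrt{n}$, and $\gamma_n=1/n^2$, all three conditions hold, and the recursion drives $s_n$ to zero even though the sequence need not be monotone.

```python
# Illustration of Lemma 2.1 (our own parameter choices):
# s_{n+1} <= (1 - a_n) s_n + a_n*b_n + g_n with
#   a_n = 1/n       (condition (i):  a_n in [0,1], sum a_n = infinity),
#   b_n = 1/sqrt(n) (condition (ii): limsup b_n <= 0 since b_n -> 0),
#   g_n = 1/n^2     (condition (iii): g_n >= 0, sum g_n < infinity).
s = 5.0
for n in range(1, 200001):
    a_n = 1.0 / n
    b_n = 1.0 / n ** 0.5
    g_n = 1.0 / n ** 2
    s = (1.0 - a_n) * s + a_n * b_n + g_n

print(s)                 # small; s_n -> 0 as n -> infinity
```

Asymptotically the iterate tracks roughly $2/\sqrt{n}$ here, so after $2\times 10^5$ steps it is well below $10^{-2}$.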

## 3. Strong Convergence Theorems

Let $A$ be a monotone operator on a Hilbert space $H$. Then $J_r=(I+rA)^{-1}$ is a single-valued nonexpansive mapping from $R(I+rA)$ to $D(A)$. When $C$ is a nonempty closed convex subset of $H$ such that $\overline{D(A)}\subset C\subset R(I+rA)$ for all $r>0$ (here $\overline{D(A)}$ is the closure of $D(A)$), then we have $t_nu+(1-t_n)x_n\in C$ for $u,x_n\in C$ and all $t_n\in[0,1]$, and hence the following iteration is well defined:

$$x_{n+1}=J_{c_n}\big(t_nu+(1-t_n)x_n\big),\qquad u,x_0\in C,\ n\geq 0.$$

Next we will show strong convergence of $\{x_n\}$ defined by (3.1) to a zero of $A$. To reach this objective, we always assume $A^{-1}0\neq\emptyset$ in the sequel.
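A minimal numerical sketch (ours, not the paper's, and assuming the reconstructed form of iteration (3.1)): take $A=N_C$, the normal cone operator of $C=[-1,1]\subset\mathbb{R}$. Then $A^{-1}0=C$ and $J_c=P_C$ for every $c>0$, so the iterates should converge strongly to the zero of $A$ nearest the anchor $u$, that is, to $P_{A^{-1}0}u$.

```python
# Sketch (assumes the reconstructed form of iteration (3.1)):
# x_{n+1} = J_{c_n}(t_n*u + (1 - t_n)*x_n) with A the normal cone
# operator of C = [-1, 1], for which J_c = P_C for every c > 0 and
# A^{-1}0 = C.  The expected strong limit is P_C(u).
def proj_C(x):                       # the resolvent J_c = P_C here
    return max(-1.0, min(1.0, x))

u, x = 3.0, -0.5                     # anchor u and initial value x_0
for n in range(1, 10001):
    t_n = 1.0 / n                    # t_n -> 0 and sum t_n = infinity
    x = proj_C(t_n * u + (1.0 - t_n) * x)

print(x)                             # the limit P_C(u) = 1.0
```

Note that although every point of $[-1,1]$ is a zero of $A$, the anchored iteration singles out the particular zero $P_{A^{-1}0}u=1$.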

Theorem 3.1.

Let $A$ be a monotone operator on a Hilbert space $H$ with $A^{-1}0\neq\emptyset$. Assume that $C$ is a nonempty closed convex subset of $H$ such that $\overline{D(A)}\subset C\subset R(I+rA)$ for all $r>0$, and for an anchor point $u\in C$ and an initial value $x_0\in C$, $\{x_n\}$ is iteratively defined by (3.1). If $\{t_n\}$ and $\{c_n\}$ satisfy

then the sequence $\{x_n\}$ converges strongly to $P_{A^{-1}0}u$, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$.

Proof.

The proof consists of the following steps:

Step 1.

So, the sequences $\{x_n\}$, $\{t_nu+(1-t_n)x_n\}$, and $\{J_{c_n}(t_nu+(1-t_n)x_n)\}$ are bounded.

Step 2.

Step 3.

Step 4.

where the sequences involved satisfy the conditions of Lemma 2.1. So, an application of Lemma 2.1 to (3.11) yields the desired result.

Theorem 3.2.

Then the sequence $\{x_n\}$ converges strongly to $P_{A^{-1}0}u$, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$.

Proof.

From the proof of Theorem 3.1, we can observe that Steps 1, 3, and 4 still hold. So we only need to show Step 2.

Corollary 3.3.

Let $\{t_n\}$ and $\{c_n\}$ be as in Theorem 3.1 or 3.2. Suppose that $A$ is a maximal monotone operator on $H$ and that for $u,x_0\in H$, $\{x_n\}$ is defined by (3.1). Then the sequence $\{x_n\}$ converges strongly to $P_{A^{-1}0}u$, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$.

Proof.

Since $A$ is maximal monotone, $A$ is monotone and satisfies the condition $R(I+rA)=H$ for all $r>0$. Putting $C=H$, the desired result is reached.

Corollary 3.4.

Let $\{t_n\}$ and $\{c_n\}$ be as in Theorem 3.1 or 3.2. Suppose that $A$ is a monotone operator on $H$ satisfying the condition $\overline{D(A)}\subset R(I+rA)$ for all $r>0$ and that for $u,x_0\in\overline{D(A)}$, $\{x_n\}$ is defined by (3.1). If $\overline{D(A)}$ is convex, then the sequence $\{x_n\}$ converges strongly to $P_{A^{-1}0}u$, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$.

Proof.

Taking $C=\overline{D(A)}$ and following Theorem 3.1 or 3.2, we easily obtain the desired result.

## 4. Weak Convergence Theorems

For a monotone operator $A$, if $\overline{D(A)}\subset R(I+rA)$ for all $r>0$ and $x_0\in\overline{D(A)}$, then the iteration (1.3) is well defined. Next we will show weak convergence of $\{x_n\}$ under some assumptions.
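A finite-dimensional illustration (ours, not the paper's; in $\mathbb{R}$ weak and strong convergence coincide): for the normal cone operator $A=N_C$ of $C=[-1,1]$, every point of $C$ is a zero and $J_c=P_C$ for all $c>0$, so the unperturbed iteration converges to *some* zero that depends entirely on $x_0$, in contrast with the anchored algorithm of Section 3, which selects $P_{A^{-1}0}u$.

```python
# Sketch: iteration x_{n+1} = J_{c_n} x_n (errors e_n = 0) for A = N_C,
# C = [-1, 1], where J_c = P_C for every c > 0 and A^{-1}0 = C.
# The limit is a zero of A, but which zero depends on the starting point.
def proj_C(x):
    return max(-1.0, min(1.0, x))

limits = []
for x0 in (-3.0, -0.5, 0.25, 7.0):
    x = x0
    for _ in range(50):
        x = proj_C(x)            # x_{n+1} = J_{c_n} x_n
    limits.append(x)

print(limits)                    # [-1.0, -0.5, 0.25, 1.0]
```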

Theorem 4.1.

then the sequence $\{x_n\}$ converges weakly to some $v\in A^{-1}0$.

Proof.

As $\{x_n\}$ is bounded, it is weakly sequentially compact by the reflexivity of $H$, and hence we may assume that there exists a subsequence $\{x_{n_k}\}$ of $\{x_n\}$ such that $x_{n_k}\rightharpoonup v$. Using the proof technique of Step 3 in Theorem 3.1, we must have $v\in A^{-1}0$.

Adding up the above two equations, we must have $\|v-w\|^2\leq 0$. So, $v=w$.

In summary, we have proved that $\{x_n\}$ is weakly sequentially compact and that each of its cluster points in the weak topology equals $v$. Hence, $\{x_n\}$ converges weakly to $v$. The proof is complete.

Theorem 4.2.

then the sequence $\{x_n\}$ converges weakly to some $v\in A^{-1}0$.

Proof.

The remainder of the proof is the same as that of Theorem 4.1, so we omit it.

## Declarations

### Acknowledgments

The authors are grateful to the anonymous referee for his/her valuable suggestions, which helped to improve this manuscript. This work is supported by the Youth Science Foundation of Henan Normal University (2008qk02) and by the Natural Science Research Projects (Basic Research Project) of the Education Department of Henan Province (2009B110011, 2009B110001).

## References

1. Zeidler E: *Nonlinear Functional Analysis and Its Applications, Part II: Monotone Operators*. Springer, Berlin, Germany; 1985.
2. Bruck RE Jr.: A strongly convergent iterative solution of $0\in U(x)$ for a maximal monotone operator $U$ in Hilbert space. *Journal of Mathematical Analysis and Applications* 1974, **48**: 114–126. doi:10.1016/0022-247X(74)90219-4
3. Rockafellar RT: Monotone operators and the proximal point algorithm. *SIAM Journal on Control and Optimization* 1976, **14**(5): 877–898. doi:10.1137/0314056
4. Brézis H, Lions P-L: Produits infinis de résolvantes [Infinite products of resolvents]. *Israel Journal of Mathematics* 1978, **29**(4): 329–345. doi:10.1007/BF02761171
5. Reich S: Weak convergence theorems for nonexpansive mappings in Banach spaces. *Journal of Mathematical Analysis and Applications* 1979, **67**(2): 274–276. doi:10.1016/0022-247X(79)90024-6
6. Reich S: Strong convergence theorems for resolvents of accretive operators in Banach spaces. *Journal of Mathematical Analysis and Applications* 1980, **75**(1): 287–292. doi:10.1016/0022-247X(80)90323-6
7. Nevanlinna O, Reich S: Strong convergence of contraction semigroups and of iterative methods for accretive operators in Banach spaces. *Israel Journal of Mathematics* 1979, **32**(1): 44–58. doi:10.1007/BF02761184
8. Bruck RE, Reich S: A general convergence principle in nonlinear functional analysis. *Nonlinear Analysis: Theory, Methods & Applications* 1980, **4**(5): 939–950. doi:10.1016/0362-546X(80)90006-1
9. Jung JS, Takahashi W: Dual convergence theorems for the infinite products of resolvents in Banach spaces. *Kodai Mathematical Journal* 1991, **14**(3): 358–365. doi:10.2996/kmj/1138039461
10. Khang DB: On a class of accretive operators. *Analysis* 1990, **10**(1): 1–16.
11. Minty GJ: On the monotonicity of the gradient of a convex function. *Pacific Journal of Mathematics* 1964, **14**: 243–247.
12. Xu H-K: A regularization method for the proximal point algorithm. *Journal of Global Optimization* 2006, **36**(1): 115–125. doi:10.1007/s10898-006-9002-7
13. Güler O: On the convergence of the proximal point algorithm for convex minimization. *SIAM Journal on Control and Optimization* 1991, **29**(2): 403–419. doi:10.1137/0329022
14. Solodov MV, Svaiter BF: Forcing strong convergence of proximal point iterations in a Hilbert space. *Mathematical Programming, Series A* 2000, **87**(1): 189–202.
15. Kamimura S, Takahashi W: Strong convergence of a proximal-type algorithm in a Banach space. *SIAM Journal on Optimization* 2002, **13**(3): 938–945. doi:10.1137/S105262340139611X
16. Lehdili N, Moudafi A: Combining the proximal algorithm and Tikhonov regularization. *Optimization* 1996, **37**(3): 239–252. doi:10.1080/02331939608844217
17. Song Y, Yang C: A note on a paper "A regularization method for the proximal point algorithm". *Journal of Global Optimization* 2009, **43**(1): 171–174. doi:10.1007/s10898-008-9279-9
18. Aubin J-P, Ekeland I: *Applied Nonlinear Analysis*. Pure and Applied Mathematics (New York). John Wiley & Sons, New York, NY, USA; 1984.
19. Takahashi W: *Nonlinear Functional Analysis—Fixed Point Theory and Its Applications*. Yokohama Publishers, Yokohama, Japan; 2000.
20. Xu H-K: Strong convergence of an iterative method for nonexpansive and accretive operators. *Journal of Mathematical Analysis and Applications* 2006, **314**(2): 631–643. doi:10.1016/j.jmaa.2005.04.082
21. Liu Q: Iterative sequences for asymptotically quasi-nonexpansive mappings with error member. *Journal of Mathematical Analysis and Applications* 2001, **259**(1): 18–24. doi:10.1006/jmaa.2000.7353

## Copyright

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.