Iterative algorithms with regularization for hierarchical variational inequality problems and convex minimization problems
Fixed Point Theory and Applications volume 2013, Article number: 284 (2013)
Abstract
In this paper, we consider a variational inequality problem which is defined over the intersection of the set of fixed points of a ζ-strictly pseudocontractive mapping, the set of fixed points of a nonexpansive mapping and the set of solutions of a minimization problem. We propose an iterative algorithm with regularization to solve such a variational inequality problem and study the strong convergence of the sequence generated by the proposed algorithm. The results of this paper improve and extend several known results in the literature.
1 Introduction
Let H be a real Hilbert space with inner product ⟨·,·⟩ and norm ‖·‖, let C be a nonempty closed convex subset of H, and let f : C → ℝ be a convex and continuously Fréchet differentiable functional. We consider the following minimization problem (MP):
min { f(x) : x ∈ C }.  (1.1)
We denote by Ξ the set of minimizers of problem (1.1), and we assume that Ξ ≠ ∅. The gradient-projection algorithm (GPA) is one of the most elegant methods to solve the minimization problem (1.1). The convergence of the sequence generated by the GPA depends on the behavior of the gradient ∇f. If ∇f is strongly monotone and Lipschitz continuous, then the sequence generated by the GPA converges strongly to the unique solution of MP (1.1). However, if the gradient ∇f is assumed to be only Lipschitz continuous, then the sequence generated by the GPA converges only weakly when H is infinite-dimensional (a counterexample is given in [1]). Since the Lipschitz continuity of the gradient ∇f implies that it is actually inverse strongly monotone (ism) [2], its complement can be an averaged mapping (that is, it can be expressed as a proper convex combination of the identity mapping and a nonexpansive mapping) [1]. Consequently, the GPA can be rewritten as the composite of a projection and an averaged mapping, which is again an averaged mapping. This shows that averaged mappings play an important role in the GPA. Very recently, Xu [1] used averaged mappings to study the convergence analysis of the GPA, which is an operator-oriented approach. He showed that the sequence generated by the GPA converges in norm to a minimizer of MP (1.1), which is also a unique solution of a particular type of variational inequality problem (VIP). It is worth emphasizing that regularization, in particular the traditional Tikhonov regularization, is usually used to solve ill-posed optimization problems. The advantage of a regularization method is its possible strong convergence to the minimum-norm solution of the optimization problem. In [1], Xu introduced a hybrid gradient-projection algorithm with regularization and proved the strong convergence of the generated sequence to the minimum-norm solution of MP (1.1).
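As a concrete illustration of the GPA described above, the following Python sketch minimizes a convex quadratic over a box by alternating a gradient step with a projection. The matrix A, vector b, the box C = [0, 1]^n, and the step size lam are hypothetical choices for this demo, not data from the paper.

```python
import numpy as np

# Gradient-projection algorithm (GPA) sketch:
#   x_{k+1} = P_C(x_k - lam * grad_f(x_k))
# for f(x) = 0.5 * ||A x - b||^2 over the box C = [0, 1]^n.

def project_box(x, lo=0.0, hi=1.0):
    """Metric projection onto the box [lo, hi]^n (coordinatewise clipping)."""
    return np.clip(x, lo, hi)

def gpa(A, b, x0, lam, iters=500):
    x = x0
    for _ in range(iters):
        grad = A.T @ (A @ x - b)          # gradient of f at x
        x = project_box(x - lam * grad)   # gradient step, then projection
    return x

A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([3.0, -1.0])
# Step size lam must lie in (0, 2/L) with L = ||A^T A||; here L = 4.
x_star = gpa(A, b, np.zeros(2), lam=0.4)
# The unconstrained minimizer is (1.5, -1); since A is diagonal, the
# constrained minimizer is its coordinatewise clip onto the box, (1, 0).
print(x_star)
```

With only Lipschitz continuity of the gradient this iteration is guaranteed to converge weakly in general Hilbert spaces, which is precisely the gap that the regularized schemes discussed in this paper address.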
Some iterative algorithms with or without regularization for MP (1.1) are proposed and analyzed in [3–5] for finding a common element of the set of solutions of MP (1.1) and the set of fixed points of a nonexpansive mapping.
On the other hand, the theory of variational inequalities [6, 7] has emerged as an important tool to study a wide class of problems arising in science, engineering and the social sciences. If the underlying set in the formulation of a variational inequality problem is the set of fixed points of a mapping, more precisely, of a nonexpansive mapping, then the variational inequality problem is called a hierarchical variational inequality problem. For further details on hierarchical variational inequalities, we refer to [8–11] and the references therein.
In this paper, we consider a variational inequality problem which is defined over the intersection of the set of fixed points of a ζ-strictly pseudocontractive mapping, the set of fixed points of a nonexpansive mapping and the set of solutions of MP (1.1). We propose an iterative algorithm with regularization to solve such a variational inequality problem and study the strong convergence of the sequence generated by the proposed algorithm. The results of this paper improve and extend several known results in the literature.
2 Preliminaries and formulations
Throughout the paper, unless otherwise specified, we use the following assumptions and notations. Let H be a real Hilbert space whose inner product and norm are denoted by ⟨·,·⟩ and ‖·‖, respectively. Let C be a nonempty closed convex subset of H. We write x_n → x (respectively, x_n ⇀ x) to indicate that the sequence {x_n} converges strongly (respectively, weakly) to x. Moreover, we use ω_w(x_n) to denote the weak ω-limit set of the sequence {x_n}, that is,
ω_w(x_n) := {x ∈ H : x_{n_i} ⇀ x for some subsequence {x_{n_i}} of {x_n}}.
The metric (or nearest point) projection from H onto C is the mapping P_C : H → C which assigns to each point x ∈ H the unique point P_C x ∈ C satisfying
‖x − P_C x‖ = inf { ‖x − y‖ : y ∈ C }.
Some important properties of projections are gathered in the following proposition.
Proposition 2.1 For given x ∈ H and z ∈ C, we have:
(a) z = P_C x if and only if ⟨x − z, y − z⟩ ≤ 0 for all y ∈ C;
(b) z = P_C x if and only if ‖x − z‖² ≤ ‖x − y‖² − ‖y − z‖² for all y ∈ C;
(c) ⟨P_C x − P_C y, x − y⟩ ≥ ‖P_C x − P_C y‖² for all x, y ∈ H, which implies that P_C is nonexpansive and monotone.
Definition 2.1 A mapping T : C → C is said to be
(a) ζ-strictly pseudocontractive if there exists a constant ζ ∈ [0, 1) such that
‖Tx − Ty‖² ≤ ‖x − y‖² + ζ‖(I − T)x − (I − T)y‖² for all x, y ∈ C.
If ζ = 0, then it is called nonexpansive;
(b) firmly nonexpansive if 2T − I is nonexpansive, or equivalently,
⟨x − y, Tx − Ty⟩ ≥ ‖Tx − Ty‖² for all x, y ∈ C;
alternatively, T is firmly nonexpansive if and only if T can be expressed as
T = (1/2)(I + S),
where S : C → C is a nonexpansive mapping.
It can be easily seen that the projection mappings are firmly nonexpansive. It is clear that T is ζ-strictly pseudocontractive if and only if
⟨Tx − Ty, x − y⟩ ≤ ‖x − y‖² − ((1 − ζ)/2)‖(I − T)x − (I − T)y‖² for all x, y ∈ C.
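The firm nonexpansiveness of projections mentioned above can be checked numerically. The sketch below samples random pairs of points and verifies the defining inequality ⟨x − y, Px − Py⟩ ≥ ‖Px − Py‖² for the coordinatewise clipping projection onto a box; the set C = [0, 1]^3 and the sampling scheme are hypothetical choices for this illustration.

```python
import numpy as np

# Numerical sanity check: the metric projection P onto the closed convex
# set C = [0, 1]^3 (coordinatewise clipping) is firmly nonexpansive:
#   <x - y, Px - Py> >= ||Px - Py||^2   for all x, y in H.

rng = np.random.default_rng(0)

def P(x):
    return np.clip(x, 0.0, 1.0)

ok = True
for _ in range(1000):
    x = rng.normal(size=3) * 3.0
    y = rng.normal(size=3) * 3.0
    lhs = np.dot(x - y, P(x) - P(y))          # inner-product side
    rhs = np.dot(P(x) - P(y), P(x) - P(y))    # squared-norm side
    ok &= bool(lhs >= rhs - 1e-12)            # tolerance for rounding
print(ok)
```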
Definition 2.2 Let T be a nonlinear operator with domain D(T) ⊆ H and range R(T) ⊆ H.
(a) T is said to be monotone if
⟨x − y, Tx − Ty⟩ ≥ 0 for all x, y ∈ D(T);
(b) given a number β > 0, T is said to be β-strongly monotone if
⟨x − y, Tx − Ty⟩ ≥ β‖x − y‖² for all x, y ∈ D(T);
(c) given a number ν > 0, T is said to be ν-inverse strongly monotone (ν-ism) if
⟨x − y, Tx − Ty⟩ ≥ ν‖Tx − Ty‖² for all x, y ∈ D(T).
Clearly,
- if T is nonexpansive, then I − T is monotone;
- a projection P_C is 1-ism;
- if T is a ζ-strictly pseudocontractive mapping, then I − T is ((1 − ζ)/2)-inverse strongly monotone.
Definition 2.3 [1]
A mapping T : H → H is said to be an averaged mapping if it can be written as the average of the identity I and a nonexpansive mapping, that is,
T = (1 − α)I + αS,
where α ∈ (0, 1) and S : H → H is a nonexpansive mapping. More precisely, when the last equality holds, we say that T is α-averaged. Thus, firmly nonexpansive mappings (in particular, projections) are (1/2)-averaged maps.
Proposition 2.2 [12]
Let T : H → H be a given mapping.
(a) T is nonexpansive if and only if the complement I − T is (1/2)-ism.
(b) If T is ν-ism, then for γ > 0, γT is (ν/γ)-ism.
(c) T is averaged if and only if the complement I − T is ν-ism for some ν > 1/2. Indeed, for α ∈ (0, 1), T is α-averaged if and only if I − T is (1/(2α))-ism.
Proposition 2.3 [13] Let S, T, V : H → H be given operators.
(a) If T = (1 − α)S + αV for some α ∈ (0, 1), and if S is averaged and V is nonexpansive, then T is averaged.
(b) T is firmly nonexpansive if and only if the complement I − T is firmly nonexpansive.
(c) If T = (1 − α)S + αV for some α ∈ (0, 1), and if S is firmly nonexpansive and V is nonexpansive, then T is averaged.
(d) The composite of finitely many averaged mappings is averaged, that is, if each of the mappings {T_i}_{i=1}^N is averaged, then so is the composite T_1 ⋯ T_N. In particular, if T_1 is α_1-averaged and T_2 is α_2-averaged, where α_1, α_2 ∈ (0, 1), then the composite T_1 T_2 is α-averaged, where α = α_1 + α_2 − α_1 α_2.
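The composition rule for averaged mappings stated above (with parameter α = α_1 + α_2 − α_1α_2) can be illustrated numerically for linear mappings, where nonexpansiveness is equivalent to having operator norm at most 1. The rotation angles and averagedness parameters below are arbitrary choices for this demo.

```python
import numpy as np

# Build T_i = (1 - a_i) I + a_i S_i from nonexpansive rotations S_i,
# then check that S = (T1 T2 - (1 - a) I) / a with a = a1 + a2 - a1*a2
# is nonexpansive, i.e. T1 T2 is a-averaged.

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])   # rotations have operator norm 1

a1, a2 = 0.3, 0.6
I = np.eye(2)
T1 = (1 - a1) * I + a1 * rotation(1.0)
T2 = (1 - a2) * I + a2 * rotation(-0.7)

a = a1 + a2 - a1 * a2                    # predicted averagedness parameter
S = (T1 @ T2 - (1 - a) * I) / a          # candidate nonexpansive part
# For a linear map, nonexpansive <=> spectral norm <= 1.
print(np.linalg.norm(S, 2) <= 1 + 1e-12)
```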
Lemma 2.1 [[14], Proposition 2.1]
Let C be a nonempty closed convex subset of a real Hilbert space H, and let T : C → C be a mapping.
(a) If T is a ζ-strictly pseudocontractive mapping, then T satisfies the Lipschitz condition
‖Tx − Ty‖ ≤ ((1 + ζ)/(1 − ζ))‖x − y‖ for all x, y ∈ C.
(b) If T is a ζ-strictly pseudocontractive mapping, then the mapping I − T is semiclosed at 0, that is, if {x_n} is a sequence in C such that x_n ⇀ x̃ weakly and (I − T)x_n → 0 strongly, then (I − T)x̃ = 0.
(c) If T is a ζ-(quasi-)strict pseudocontraction, then the fixed point set Fix(T) of T is closed and convex, so that the projection P_{Fix(T)} is well defined.
The following lemma is an immediate consequence of the properties of the inner product.
Lemma 2.2 In a real Hilbert space H, we have
‖x + y‖² ≤ ‖x‖² + 2⟨y, x + y⟩ for all x, y ∈ H.
The following elementary result on real sequences is quite well known.
Lemma 2.3 [15]
Let {s_n} be a sequence of nonnegative real numbers such that
s_{n+1} ≤ (1 − α_n)s_n + α_n β_n + γ_n, n ≥ 0,
where {α_n}, {β_n} and {γ_n} satisfy the following conditions:
(i) {α_n} ⊂ [0, 1] and Σ_{n=0}^∞ α_n = ∞;
(ii) either limsup_{n→∞} β_n ≤ 0 or Σ_{n=0}^∞ α_n|β_n| < ∞;
(iii) γ_n ≥ 0 for all n ≥ 0, and Σ_{n=0}^∞ γ_n < ∞.
Then lim_{n→∞} s_n = 0.
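The mechanism behind Lemma 2.3 can be seen in a small numerical experiment. The concrete parameter sequences below are hypothetical choices satisfying conditions (i)-(iii): a_n = 1/(n+1) is not summable, b_n = 1/n tends to 0, and g_n = 1/n² is summable; the recursion then drives s_n toward 0.

```python
# Numerical illustration of the recursion in Lemma 2.3:
#   s_{n+1} <= (1 - a_n) s_n + a_n b_n + g_n
# run here with equality, starting from s_0 = 1.

s = 1.0
for n in range(1, 200001):
    a_n = 1.0 / (n + 1)          # sum a_n = infinity: drives s_n down
    b_n = 1.0 / n                # limsup b_n <= 0 holds since b_n -> 0
    g_n = 1.0 / n**2             # summable perturbation
    s = (1 - a_n) * s + a_n * b_n + g_n
print(s < 1e-2)
```

A rough calculation shows s_n decays on the order of (log n)/n here, so after 200000 steps the iterate is far below the 0.01 threshold.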
Lemma 2.4 [10]
Let C be a nonempty closed convex subset of a real Hilbert space H, and let T : C → C be a ζ-strictly pseudocontractive mapping. Let γ and δ be two nonnegative real numbers such that (γ + δ)ζ ≤ γ. Then
‖γ(x − y) + δ(Tx − Ty)‖ ≤ (γ + δ)‖x − y‖ for all x, y ∈ C.
The following lemma appeared implicitly in the paper of Reinermann [16].
Lemma 2.5 [16]
Let H be a real Hilbert space. Then, for all x, y ∈ H and λ ∈ [0, 1],
‖λx + (1 − λ)y‖² = λ‖x‖² + (1 − λ)‖y‖² − λ(1 − λ)‖x − y‖².
Let C be a nonempty closed convex subset of a real Hilbert space H, and let A : C → H be a monotone mapping. The variational inequality problem (VIP) is to find x* ∈ C such that
⟨Ax*, x − x*⟩ ≥ 0 for all x ∈ C.
The solution set of the VIP is denoted by VI(C, A). It is well known that
x* ∈ VI(C, A) if and only if x* = P_C(x* − λAx*) for every λ > 0.
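The well-known equivalence between the VIP and the projection fixed-point equation can be verified in a one-dimensional example. The set C = [0, 2] and the monotone mapping A(x) = x − 3 below are hypothetical choices: since A < 0 throughout C, the solution of the VIP is the right endpoint x* = 2, and the fixed-point equation holds for every step size λ > 0.

```python
# Check numerically: x* solves the VIP over C = [0, 2] with
# A(x) = x - 3 (monotone) iff x* = P_C(x* - lam * A(x*)) for lam > 0.

def P(x):                         # metric projection onto C = [0, 2]
    return min(max(x, 0.0), 2.0)

def A(x):
    return x - 3.0

# A(2) = -1, and A(x*)(x - x*) = -(x - 2) >= 0 for all x in [0, 2],
# so x* = 2 solves the VIP.
x_star = 2.0
lams = [0.1, 0.5, 1.0, 5.0]
print(all(abs(P(x_star - lam * A(x_star)) - x_star) < 1e-12
          for lam in lams))
```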
A set-valued mapping V : H → 2^H is called monotone if, for all x, y ∈ H, u ∈ Vx and v ∈ Vy imply that ⟨x − y, u − v⟩ ≥ 0. A monotone set-valued mapping V is called maximal if its graph is not properly contained in the graph of any other monotone set-valued mapping. It is known that a monotone set-valued mapping V is maximal if and only if, for any (x, u) ∈ H × H, ⟨x − y, u − v⟩ ≥ 0 for every (y, v) in the graph of V implies that u ∈ Vx. Let A : C → H be a monotone and Lipschitz continuous mapping, and let N_C v be the normal cone to C at v ∈ C, that is,
N_C v = {w ∈ H : ⟨v − u, w⟩ ≥ 0 for all u ∈ C}.
Define
Vv = Av + N_C v if v ∈ C, and Vv = ∅ if v ∉ C.
Lemma 2.6 [17]
Let A : C → H be a monotone mapping. Then
(i) V is maximal monotone;
(ii) v ∈ V⁻¹0 if and only if v ∈ VI(C, A).
Throughout the paper, we denote by Fix(T) the set of fixed points of a mapping T; recall that Ξ denotes the set of minimizers of MP (1.1). We also assume that the underlying set is nonempty, closed and convex.
Let S and V be nonexpansive mappings, and let T be a ζ-strictly pseudocontractive mapping such that Fix(T) ∩ Fix(S) ∩ Ξ ≠ ∅. In this paper, we consider and study the following hierarchical variational inequality problem, which is defined on Fix(T) ∩ Fix(S) ∩ Ξ.
We denote by Ω the solution set of problem (2.1). It is not difficult to verify that solving (2.1) is equivalent to the fixed point problem of finding a point such that
where stands for the metric projection onto the closed convex set .
Problem (2.1) contains the hierarchical variational inequality problems considered and studied in [8, 18, 19] and the references therein.
By using the definition of the normal cone to , we have the mapping :
and we readily prove that (2.1) is equivalent to the variational inequality
By combining the hybrid gradient-projection method of Xu [1] and the two-step method of Yao et al. [11], we introduce the following three-step iterative algorithm:
where the mapping involved is a ρ-contraction and the parameter sequences satisfy the conditions stated below. It is proved that, under appropriate assumptions, the iterative sequence generated above converges strongly to an element of Ω.
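To convey the role of regularization in a scheme such as (2.2), the following simplified sketch (not the paper's exact three-step algorithm) runs a Tikhonov-regularized gradient-projection iteration x_{n+1} = P_C(x_n − λ(∇f(x_n) + α_n x_n)) with a vanishing regularization parameter α_n. The toy objective, the box C, and all parameter values are assumptions made for this illustration; the point is that the vanishing Tikhonov term selects the minimum-norm minimizer.

```python
import numpy as np

# Toy problem: f(x) = 0.5 * (x1 + x2 - 1)^2 over C = [0, 1]^2.
# The minimizer set is the whole segment {x1 + x2 = 1} inside C;
# its minimum-norm element is (0.5, 0.5). With alpha_n = 1/n -> 0
# (and sum alpha_n = infinity), the regularized iteration converges
# to that minimum-norm minimizer even from a non-symmetric start.

lam = 0.5
x = np.array([1.0, 0.0])                   # a non-symmetric start
for n in range(1, 10**5 + 1):
    alpha_n = 1.0 / n                      # vanishing regularization
    grad = (x[0] + x[1] - 1.0) * np.ones(2) + alpha_n * x
    x = np.clip(x - lam * grad, 0.0, 1.0)  # projection onto the box
print(np.round(x, 2))
```

Without the Tikhonov term the iteration would stop at whichever minimizer the dynamics happen to reach; the regularization is what singles out the minimum-norm solution, mirroring the behavior established by Xu [1].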
3 Main results
Let us consider the following assumptions:
-
the mapping is a ρ-contraction;
-
the mapping is a ζ-strict pseudocontraction;
-
are two nonexpansive mappings;
-
the gradient ∇f is Lipschitz continuous with constant L > 0;
-
is a sequence in with ;
-
, , are sequences in with ;
-
, are sequences in with , ;
-
and , .
Theorem 3.1 Let be a bounded sequence generated from any given by (2.2). Assume that the following conditions hold:
-
(H1)
, ;
-
(H2)
, ;
-
(H3)
and ;
-
(H4)
, ;
-
(H5)
.
Then the following assertions hold:
-
(i)
;
-
(ii)
.
Proof First of all, we show that is ξ-averaged for each , where
Indeed, the Lipschitz continuity of ∇f implies that ∇f is (1/L)-ism [2], that is,
⟨∇f(x) − ∇f(y), x − y⟩ ≥ (1/L)‖∇f(x) − ∇f(y)‖² for all x, y ∈ C.
Observe that
Therefore, it follows that is -ism. Thus, by Proposition 2.2(b), is -ism. From Proposition 2.2(c), the complement is -averaged. Therefore, noting that is -averaged and utilizing Proposition 2.3(d), we obtain that for each , is ξ-averaged with
This shows that is nonexpansive. For , utilizing the fact that , we may assume that
Consequently, it follows that for each integer , is -averaged with
This implies that is nonexpansive for all .
The rest of the proof is divided into several steps.
Step 1. .
For simplicity, we put and for every . Then and for every .
Taking into account , without loss of generality, we may assume that for some . We write , , where . It follows that for all ,
Since for all , by Lemma 2.4, we have
Now, we estimate . Observe that for every ,
Similarly, for all , we have
From (2.2), we have
and therefore
which implies that
Also, from (2.2) we have
then simple calculations show that
and thus, from (3.3)-(3.4), we have
where , for some . This together with (3.1)-(3.3) implies that
where , for some .
Further, we observe that
and then by simple calculations, we have
By taking norm and using (3.5)-(3.6), we get
where , for some . Therefore,
where , for some . From (H1)-(H5), it follows that and
Thus, by applying Lemma 2.3 to (3.7), we conclude that
which implies that
Step 2. .
Indeed, let . Then we have
Similarly, we get
By Lemma 2.5 and (3.9), we have
Since for all , utilizing Lemma 2.4, we obtain
Since , we may assume that for some . Therefore, we deduce
Since , , and as , we conclude from the boundedness of , and that as . This together with implies that
Step 3. and .
Let . Then, by Lemmas 2.2 and 2.5, we have
Therefore, we obtain
Since , , , and , from the boundedness of , and , we obtain , and hence
Also, since
from and , it follows that
Furthermore, from the firm nonexpansiveness of , we obtain
and so,
Similarly, we have
Thus, we have
which implies that
Since , and , from the boundedness of , , and , it follows that
In addition, since for all , utilizing Lemma 2.4, we get from (3.12)
which implies that
Since , and , from the boundedness of , and , it follows that
Step 4. .
Let . Then there exists a subsequence of such that . Since
we have
Hence from , and , we get . Since and , we have . By Lemma 2.1(b) (demiclosedness principle), we obtain .
Meanwhile, observe that
Thus,
This together with yields . Since and , we have . By Lemma 2.1(b) (demiclosedness principle), we have .
Further, let us show . Indeed, from and , we have and . Define
where . Then V is maximal monotone and if and only if (see [17]). Let . Then we have
and hence
Therefore, we have
On the other hand, from
we have
and hence
Therefore, from
we have
Hence, we obtain
Since V is maximal monotone, we have , and hence , which leads to . Consequently, . This shows that .
Finally, let us show . Indeed, it follows from (2.2) that for every ,
and hence
Since for all , by Lemma 2.4, we have
which implies that
Since and as , from the boundedness of , and , we deduce that
So, from , we get
Taking into consideration that is monotone and continuous, utilizing Minty's lemma [7], we have
Therefore, ; that is, . □
Remark 3.1 Iterative algorithm (2.2) is different from the algorithms in [1, 11]. The two-step iterative scheme in [11] for two nonexpansive mappings and the gradient-projection iterative schemes in [1] for MP (1.1) are extended to develop the three-step iterative scheme (2.2) with regularization for MP (1.1), two nonexpansive mappings and a strictly pseudocontractive mapping.
Remark 3.2 The following sequences satisfy the hypotheses on the parameter in Theorem 3.1.
-
(a)
, and , where and ;
-
(b)
and for all .
Theorem 3.2 Let be the bounded sequence generated from any given by (2.2). Assume that hypotheses (H1)-(H5) of Theorem 3.1 hold and
-
(H6)
;
-
(H7)
There is a constant such that for each , where .
Then the sequences , and converge strongly to provided , where solves the following variational inequality:
Proof Let . From (2.2), we have
and therefore,
Again from (2.2), we obtain
Substituting (3.14) into (3.13), we get
Since for all , utilizing Lemma 2.4, we get from (2.2) and (3.15)
where .
Taking into consideration that is a contractive mapping, we know that has a unique fixed point . That is, there is a unique solution of the following variational inequality problem (VIP):
Since , it is clear that , and hence . Thus, from (3.16), we conclude that
Consider a subsequence of such that
Without loss of generality, we may further assume that . Then, in view of Theorem 3.1, . Since is a unique solution of VIP (3.17) and , we have
which implies that
Meanwhile, from and (H7), we infer that
From (2.2), we have
This together with and implies that
Hence,
Observe that
Therefore, we get
and hence
Thus, it follows that
and hence
Utilizing Lemma 2.3, from and (3.18)-(3.20), we conclude that the sequence converges strongly to . Taking into consideration that and , we obtain that and as . This completes the proof. □
Remark 3.3 The following parametric sequences satisfy the hypotheses of Theorem 3.2.
-
(a)
, and , where and or , ;
-
(b)
, , .
Remark 3.4 Theorems 3.1 and 3.2 improve, extend, supplement and develop [[11], Theorems 3.1 and 3.2] and [[1], Theorems 5.2 and 6.1] in the following aspects:
-
(a)
Three-step iterative algorithm (2.2) with regularization for MP (1.1), two nonexpansive mappings and a strictly pseudocontractive mapping is more flexible and more subtle than the algorithms in [1, 11].
-
(b)
The argument techniques in Theorems 3.1 and 3.2 are different from the ones in [[11], Theorems 3.1 and 3.2] and the ones in [[1], Theorems 5.2 and 6.1] because we use the properties of strict pseudocontractive mappings and maximal monotone mappings (see, for example, Lemmas 2.1, 2.4 and 2.6).
-
(c)
Compared with the proofs of Theorems 5.2 and 6.1 in [1], the proofs of Theorems 3.1 and 3.2 proceed via the argument of Step 3 in the proof of Theorem 3.1.
-
(d)
Theorems 3.1 and 3.2 remove the condition in [[11], Theorems 3.1 and 3.2].
References
Xu HK: Averaged mappings and the gradient-projection algorithm. J. Optim. Theory Appl. 2011, 150: 360–378. 10.1007/s10957-011-9837-z
Baillon JB, Haddad G: Quelques propriétés des opérateurs angle-bornés et n-cycliquement monotones. Isr. J. Math. 1977, 26: 137–150. 10.1007/BF03007664
Ceng LC, Ansari QH, Yao JC: Extragradient-projection method for solving constrained convex minimization problems. Numer. Algebra Control Optim. 2011, 1: 341–359.
Ceng LC, Ansari QH, Wen CF: Implicit relaxed and hybrid methods with regularization for minimization problems and asymptotically strict pseudocontractive mappings in the intermediate sense. Abstr. Appl. Anal. 2013, 2013: Article ID 854297
Ceng LC, Ansari QH, Wen CF: Multi-step implicit iterative methods with regularization for minimization problems and fixed point problems. J. Inequal. Appl. 2013, 2013: Article ID 240
Ansari QH, Lalitha CS, Mehta M: Generalized Convexity, Nonsmooth Variational Inequalities and Nonsmooth Optimization. CRC Press, Boca Raton; 2013.
Kinderlehrer D, Stampacchia G: An Introduction to Variational Inequalities and Their Applications. Academic Press, New York; 1980.
Ceng LC, Ansari QH, Wong NC, Yao JC: Implicit iterative methods for hierarchical variational inequalities. J. Appl. Math. 2012, 2012: Article ID 472935
Cianciaruso F, Colao V, Muglia L, Xu HK: On implicit methods for variational inequalities via hierarchical fixed point approach. Bull. Aust. Math. Soc. 2009, 80: 117–124. 10.1017/S0004972709000082
Yao Y, Liou YC, Kang SM: Approach to common elements of variational inequality problems and fixed point problems via a relaxed extragradient method. Comput. Math. Appl. 2010, 59: 3472–3480. 10.1016/j.camwa.2010.03.036
Yao Y, Liou YC, Marino G: Two-step iterative algorithms for hierarchical fixed point problems and variational inequality problems. J. Appl. Math. Comput. 2009, 31: 433–445. 10.1007/s12190-008-0222-5
Byrne C: A unified treatment of some iterative algorithms in signal processing and image reconstruction. Inverse Probl. 2004, 20: 103–120. 10.1088/0266-5611/20/1/006
Combettes PL: Solving monotone inclusions via compositions of nonexpansive averaged operators. Optimization 2004, 53: 475–504. 10.1080/02331930412331327157
Marino G, Xu HK: Weak and strong convergence theorems for strict pseudo-contractions in Hilbert spaces. J. Math. Anal. Appl. 2007, 329: 336–346. 10.1016/j.jmaa.2006.06.055
Xu HK: Iterative algorithms for nonlinear operators. J. Lond. Math. Soc. 2002, 66: 240–256. 10.1112/S0024610702003332
Reinermann J: Über Fixpunkte kontrahierender Abbildungen und schwach konvergente Toeplitz-Verfahren. Arch. Math. 1969, 20: 59–64. 10.1007/BF01898992
Rockafellar RT: On the maximality of sums of nonlinear monotone operators. Trans. Am. Math. Soc. 1970, 149: 75–88. 10.1090/S0002-9947-1970-0282272-5
Moudafi A, Mainge PE: Towards viscosity approximations of hierarchical fixed points problems. Fixed Point Theory Appl. 2006, 2006: Article ID 95453
Moudafi A, Mainge PE: Strong convergence of an iterative method for hierarchical fixed point problems. Pac. J. Optim. 2007, 3: 529–538.
Acknowledgements
In this research, the second and third authors were supported by King Fahd University of Petroleum & Minerals under project number IN101009. The first author was partially supported by the National Science Foundation of China (11071169), the Innovation Program of Shanghai Municipal Education Commission (09ZZ133) and the Leading Academic Discipline Project of Shanghai Normal University (DZL707). Part of the research of the third author was carried out during his visit to King Fahd University of Petroleum & Minerals, Dhahran, Saudi Arabia.
Competing interests
The authors declare that they have no competing interests.
Authorsβ contributions
All authors read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License ( https://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Ceng, LC., Al-Homidan, S. & Ansari, Q.H. Iterative algorithms with regularization for hierarchical variational inequality problems and convex minimization problems. Fixed Point Theory Appl 2013, 284 (2013). https://doi.org/10.1186/1687-1812-2013-284