 Research
 Open Access
Convergence theorems of subgradient extragradient algorithm for solving variational inequalities and a convex feasibility problem
 C. E. Chidume^{1} and
 M. O. Nnakwe^{1}
https://doi.org/10.1186/s13663-018-0641-4
© The Author(s) 2018
 Received: 29 December 2017
 Accepted: 18 May 2018
 Published: 18 June 2018
Abstract
Let C be a nonempty closed and convex subset of a uniformly smooth and 2-uniformly convex real Banach space E with dual space \(E^{*}\). In this paper, a Krasnoselskii-type subgradient extragradient iterative algorithm is constructed and used to approximate a common element of the set of solutions of variational inequality problems and the set of fixed points of a countable family of relatively nonexpansive maps. The theorems proved improve the results of Censor et al. (J. Optim. Theory Appl. 148:318–335, 2011).
Keywords
 Subgradient extragradient algorithm
 Variational inequality
 Relatively nonexpansive maps
MSC
 47H09
 47H10
 47J25
 47J05
 47J20
1 Introduction
Theorem 1.1
(Censor et al., [6])
Assume that f is monotone, Lipschitz and \(\mathrm{VI}(f,C)\neq\emptyset\), with \(\tau<\frac{1}{K}\). Then any sequences \(\{x^{k}\}_{k=0}^{\infty}\) and \(\{y^{k}\}_{k=0}^{\infty}\) generated by (1.4) weakly converge to the same solution \(u^{*}\in \mathrm{VI}(f,C)\) and, furthermore, \(u^{*}=\lim_{k\rightarrow\infty} P_{\mathrm{VI}(f,C)}x^{k}\).
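To make scheme (1.4) concrete, the following minimal sketch runs the subgradient extragradient iteration in the Hilbert space \(\mathbb{R}^{2}\): a projection step onto C, construction of the half-space \(T_{k}\), and a projection onto \(T_{k}\). The affine map f, the cone C, and the step size τ are illustrative choices, not taken from [6].

```python
# Sketch of the subgradient extragradient scheme (1.4) of Censor et al.,
# specialized to R^2 (a Hilbert space). The map f(x) = Ax + q below, with
# A = [[1, 1], [-1, 1]], is monotone and Lipschitz (K = sqrt(2)); its unique
# VI solution over the nonnegative orthant is u* = (0, 1). Illustrative only.

def proj_orthant(x):
    # metric projection onto C = {x : x_i >= 0}
    return [max(0.0, t) for t in x]

def proj_halfspace(w, a, b):
    # projection onto T = {w : <a, w> <= b}; if a = 0, T is the whole space
    nrm2 = sum(t * t for t in a)
    if nrm2 == 0.0:
        return list(w)
    excess = sum(ai * wi for ai, wi in zip(a, w)) - b
    if excess <= 0.0:
        return list(w)
    return [wi - excess * ai / nrm2 for wi, ai in zip(w, a)]

def f(x):
    return [x[0] + x[1] - 1.0, -x[0] + x[1] - 1.0]

tau = 0.5  # tau < 1/K = 1/sqrt(2), as the theorem requires
x = [5.0, -3.0]
for _ in range(2000):
    g = f(x)
    y = proj_orthant([x[0] - tau * g[0], x[1] - tau * g[1]])
    # half-space T_k = {w : <x - tau f(x) - y, w - y> <= 0} contains C
    a = [x[0] - tau * g[0] - y[0], x[1] - tau * g[1] - y[1]]
    b = sum(ai * yi for ai, yi in zip(a, y))
    gy = f(y)
    x = proj_halfspace([x[0] - tau * gy[0], x[1] - tau * gy[1]], a, b)
# x is now close to the solution u* = (0, 1)
```

Note that the second projection is onto the half-space \(T_{k}\) rather than onto C itself; this is what distinguishes the subgradient extragradient method from Korpelevich's original scheme when C has no closed-form projection.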
Theorem 1.2
(Censor et al., [6])
Assume that f is monotone, Lipschitz and \(\mathrm{VI}(f,C)\cap \operatorname{Fix}(S)\neq\emptyset\), with \(\tau<\frac {1}{K}\). Then any sequences \(\{x^{k}\}\) and \(\{y^{k}\}\) generated by (1.5) weakly converge to the same solution \(u^{*}\in \mathrm{VI}(f,C)\cap \operatorname{Fix}(S)\) and, furthermore, \(u^{*}=\lim_{k\rightarrow\infty} P_{\mathrm{VI}(f,C)\cap \operatorname{Fix}(S)}x^{k}\).
Developing algorithms for solving variational inequality problems has continued to attract the interest of numerous researchers in nonlinear operator theory. The reader may see the following important related papers (Gang et al. [11], Anh and Hieu [3], Anh and Hieu [4], Dong et al. [10] and the references contained in them).
Motivated by the result of Censor et al. [6], we propose in this paper a Krasnoselskii-type subgradient extragradient algorithm and prove a weak convergence theorem for obtaining a common element of solutions of variational inequality problems and common fixed points for a countable family of relatively nonexpansive maps in a uniformly smooth and 2-uniformly convex real Banach space. Our theorem is an improvement of the result of Censor et al. [6], and a host of other results (see Sect. 5 below).
2 Methods
The paper is organized as follows. Section 3 contains the preliminaries: definitions and lemmas, with corresponding references, that will be used in the sequel. Section 4 contains the main results of the paper. In Sect. 5, we compare our theorems with important recent results in the literature and, thereafter, conclude our findings.
3 Preliminaries
Let E be a real normed space with dual space \(E^{*}\). We write \(x_{k}\rightharpoonup x^{*}\) and \(x_{k}\rightarrow x^{*}\) to indicate that the sequence \(\{x_{k}\}\) converges weakly and strongly to \(x^{*}\), respectively.
 (1)
If E is a reflexive, strictly convex, and smooth real Banach space, then J is surjective, injective, and single-valued.
 (2)
If E is uniformly smooth, then J is uniformly continuous on bounded subsets of E.
 (3)
If \(E=H\), a real Hilbert space, then J is the identity map on H.
Remark 1
J is weakly sequentially continuous if, whenever a sequence \(\{x_{k}\}\subset E\) satisfies \(x_{k}\rightharpoonup x^{*}\) as \(k\rightarrow\infty\), it follows that \(Jx_{k}\rightharpoonup Jx^{*}\) as \(k\rightarrow\infty\). It is known that the normalized duality map on \(l_{p}\) spaces, \(1< p<\infty\), is weakly sequentially continuous.
 (\(P_{1}\)):

\(( \Vert x \Vert - \Vert y \Vert )^{2}\le\phi(x,y)\le ( \Vert x \Vert + \Vert y \Vert )^{2}, \forall x,y \in E\).
 (\(P_{2}\)):

\(\phi(x,z)=\phi(x,y)+\phi(y,z)+2\langle y-x,Jz-Jy\rangle \).
 (\(P_{3}\)):

\(\phi (x,J^{-1}(\beta Jy +(1-\beta)Jz ))\le \beta\phi (x,y)+(1-\beta)\phi(x,z)\).
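In a Hilbert space J is the identity and \(\phi(x,y)=\Vert x-y\Vert^{2}\), so properties \((P_{1})\)–\((P_{3})\) can be sanity-checked numerically. The vectors and the parameter β below are arbitrary illustrative choices, not part of the paper's proofs.

```python
# Numerical check of (P1)-(P3) in R^2, where phi(x, y) = ||x - y||^2 (J = id).

def phi(x, y):
    return sum((xi - yi) ** 2 for xi, yi in zip(x, y))

def norm(x):
    return sum(t * t for t in x) ** 0.5

def inner(x, y):
    return sum(a * b for a, b in zip(x, y))

x, y, z = [3.0, -1.0], [0.5, 2.0], [-2.0, 4.0]

# (P1): (||x|| - ||y||)^2 <= phi(x, y) <= (||x|| + ||y||)^2
assert (norm(x) - norm(y)) ** 2 <= phi(x, y) <= (norm(x) + norm(y)) ** 2

# (P2): phi(x, z) = phi(x, y) + phi(y, z) + 2 <y - x, Jz - Jy>
lhs = phi(x, z)
rhs = phi(x, y) + phi(y, z) + 2 * inner(
    [yi - xi for xi, yi in zip(x, y)],
    [zi - yi for yi, zi in zip(y, z)],
)
assert abs(lhs - rhs) < 1e-9

# (P3) with beta = 0.3: phi(x, beta*y + (1-beta)*z) is convex in the average
beta = 0.3
m = [beta * yi + (1 - beta) * zi for yi, zi in zip(y, z)]
assert phi(x, m) <= beta * phi(x, y) + (1 - beta) * phi(x, z) + 1e-12
```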
Definition 3.1
 (a)
\(x^{*}\) is called an asymptotic fixed point of T if there exists a sequence \(\{x_{k}\}\subset C\) such that \(x_{k}\rightharpoonup x^{*}\) and \(\Vert Tx_{k}x_{k} \Vert \rightarrow0\), as \(k\rightarrow\infty\). We shall denote the set of asymptotic fixed points of T by \(\widehat {F}(T)\).
 (b)
T is called relatively nonexpansive if \(F(T)=\widehat {F}(T)\ne\emptyset\) and \(\phi(p,Tx)\le\phi(p,x)\) for all \(x\in C, p\in F(T)\), where \(F(T)\) denotes the fixed point set of T.
Definition 3.2
(Rockafellar, [16])
The normal cone of C at \(v\in C\) denoted by \(N_{C}(v)\) is given by \(N_{C}(v):=\{w\in E^{*}:\langle yv,w\rangle\le 0, \forall y\in C\}\).
Definition 3.3
A map \(T:E\rightarrow2^{E^{*}}\) is called monotone if \(\langle \eta_{x}-\eta_{y},x-y\rangle\ge0, \forall x,y \in E\), \(\eta_{x}\in Tx, \eta_{y}\in Ty\). Furthermore, T is maximal monotone if it is monotone and the graph \(G(T):=\{(x,y)\in E\times E^{*}: y\in T(x)\}\) is not properly contained in the graph of any other monotone operator.
Definition 3.4
A convex feasibility problem is a problem of finding a point in the intersection of convex sets.
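A classical way to solve a convex feasibility problem is the method of alternating metric projections (POCS): project cyclically onto each set until the iterate lies in the intersection. The two sets below (a disk and a half-plane in \(\mathbb{R}^{2}\)) are illustrative choices, not taken from the paper.

```python
# Alternating projections for a convex feasibility problem in R^2:
# find a point in the intersection of the closed disk of radius 2 about the
# origin and the half-plane {(u, v) : u + v >= 2}. Both sets contain (1, 1),
# so the intersection is nonempty.

def proj_disk(x, c, r):
    # metric projection onto the closed disk of center c and radius r
    d = [xi - ci for xi, ci in zip(x, c)]
    n = sum(t * t for t in d) ** 0.5
    if n <= r:
        return list(x)
    return [ci + r * di / n for ci, di in zip(c, d)]

def proj_halfplane(x):
    # metric projection onto {(u, v) : u + v >= 2}: shift along normal (1, 1)
    s = x[0] + x[1] - 2.0
    if s >= 0.0:
        return list(x)
    return [x[0] - s / 2.0, x[1] - s / 2.0]

x = [10.0, -7.0]
for _ in range(200):
    x = proj_halfplane(proj_disk(x, [0.0, 0.0], 2.0))
# x now lies (up to rounding) in both sets
```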
Lemma 3.5
(Rockafellar, [16])
Remark 2
It is known that a monotone map T is maximal if given \((x,y)\in E\times E^{*}\) and if \(\langle x-u, y-v\rangle\ge0, \forall (u,v)\in G(T)\), then \(y\in Tx\).
Lemma 3.6
(Matsushita and Takahashi, [15])
 (1)
\(\phi (x,\Pi_{C}y) +\phi(\Pi_{C}y,y)\le\phi(x,y), \forall x\in C, y\in E\).
 (2)
\(z=\Pi_{C}x\iff\langle z-y,Jx-Jz\rangle\ge 0, \forall y\in C\).
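In a Hilbert space the generalized projection \(\Pi_{C}\) reduces to the metric projection \(P_{C}\), and characterization (2) reads \(z=P_{C}x\iff\langle z-y, x-z\rangle\ge 0, \forall y\in C\). A quick numerical check for the illustrative choice \(C=[0,1]^{2}\):

```python
# Verify the variational characterization of the metric projection onto the
# box C = [0, 1]^2 in R^2 (an illustrative set, not from the paper).

def proj_box(x):
    # componentwise clamp onto [0, 1]^2
    return [min(1.0, max(0.0, t)) for t in x]

x = [1.7, -0.4]
z = proj_box(x)                      # projection of x onto C
residual = [x[0] - z[0], x[1] - z[1]]

# sample points y of C on a grid and check <z - y, x - z> >= 0
worst = min(
    (z[0] - u) * residual[0] + (z[1] - v) * residual[1]
    for u in (0.0, 0.25, 0.5, 0.75, 1.0)
    for v in (0.0, 0.25, 0.5, 0.75, 1.0)
)
assert worst >= 0.0
```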
Lemma 3.7
(Kamimura and Takahashi, [12])
Let E be a uniformly convex and uniformly smooth real Banach space and \(\{x_{n}\}_{n=1}^{\infty}, \{y_{n}\} _{n=1}^{\infty}\) be sequences in E such that either \(\{x_{n}\} _{n=1}^{\infty}\) or \(\{y_{n}\}_{n=1}^{\infty}\) is bounded. If \(\lim_{n\rightarrow\infty}\phi(x_{n},y_{n})=0\), then \(\lim_{n\rightarrow \infty} \Vert x_{n}-y_{n} \Vert =0\).
Lemma 3.8
(Xu, [17])
Lemma 3.9
(Xu, [17])
Lemma 3.10
(Xu, [17])
Lemma 3.11
(Kohsaka and Takahashi, [13])
4 Main result
In the sequel, \(\alpha\in(0,1)\) is the constant appearing in Lemma 3.10.
4.1 The Krasnoselskii-type subgradient extragradient algorithm
Let E be a uniformly smooth and 2-uniformly convex real Banach space with dual space \(E^{*}\). Let C be a nonempty closed and convex subset of E. Let J be the normalized duality map on E.
Algorithm 1
 \({C_{1}}\) :

The map f is monotone on E.
 \({C_{2}}\) :

The map f is Lipschitz on E, with constant \(K>0\).
 \({C_{3}}\) :

\(\mathrm{VI}(f,C)\neq\emptyset\).
Lemma 4.1
If \(v_{k}=y_{k}\) in Algorithm 1, then \(v_{k}\in \mathrm{VI}(f,C)\).
Proof
The following lemma is crucial for the proof of our main theorem.
Lemma 4.2
Proof
Theorem 4.3
Let E be a uniformly smooth and 2-uniformly convex real Banach space with dual space \(E^{*}\). Let C be a nonempty closed and convex subset of E and \(f:E\rightarrow E^{*}\) be a map satisfying conditions \(C_{1}\) and \(C_{2}\) with \(\tau\in(0,\frac {\alpha }{K})\). Assume that condition \(C_{3}\) holds and J is weakly sequentially continuous on E. Then the sequence \(\{v_{k}\}_{k=1}^{\infty}\) generated iteratively by Algorithm 1 converges weakly to some \(v^{*}\in \mathrm{VI}(f,C)\).
Proof
4.2 The modified Krasnoselskii-type subgradient extragradient algorithm
Algorithm 2
 \({C_{4}}\) :

\(\mathcal{G}:=\mathrm{VI}(f,C)\cap F(S)\neq \emptyset\), where \(F(S)\) is the set of fixed points of S.
The following lemma is crucial for the proof of the next theorem.
Lemma 4.4
Let E be a uniformly smooth and 2-uniformly convex real Banach space with dual space \(E^{*}\). Let C be a nonempty closed and convex subset of E. Let \(S:E\rightarrow E\) be a relatively nonexpansive map and \(f:E\rightarrow E^{*}\) be a map satisfying conditions \(C_{1}\) and \(C_{2}\) with \(\tau\in(0,\frac{\alpha }{K})\), and let \(\beta\in(0,1)\). Assume that condition \(C_{4}\) holds and J is weakly sequentially continuous on E. Then the sequence \(\{v_{k}\} _{k=1}^{\infty}\) generated iteratively by Algorithm 2 converges weakly to some \(v^{*}\in\mathcal{G}\).
Proof
Denote \(t_{k}=\Pi_{T_{k}}J^{-1}(Jv_{k}-\tau f(y_{k})), \forall k\ge1\), \(Jz_{k}:=Jv_{k}-\tau f(y_{k})\), and \(\gamma =1-\frac {\tau K}{\alpha}\).
Next, we show that \(x^{*}\in \mathrm{VI}(f,C)\). Following the same line of argument as in the proof of Theorem 4.3, we have that \(x^{*}\in \mathrm{VI}(f,C)\), and this implies that \(\Omega_{\omega}(v_{k})\subset \mathcal{G} \).
Define \(x_{k}:=\Pi_{\mathcal{G}}v_{k}\). Then \(\{x_{k}\}\subset\mathcal{G}\). Now, following the same line of argument as in the proof of Theorem 4.3, we obtain that \(u^{*}=p^{*}\). Hence, \(v_{k}\rightharpoonup u^{*}=\lim_{k\rightarrow\infty}x_{k}\). The proof is complete. □
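The displayed steps of Algorithm 2 are omitted from this excerpt, so the following Hilbert-space sketch (\(J\) = identity, \(\Pi_{C}=P_{C}\)) is only an assumed reading of the notation above: an extragradient step producing \(t_{k}\), an application of S, and a fixed-β average with \(v_{k}\). The maps f and S and the set C are illustrative, not the paper's.

```python
# Assumed Hilbert-space reading of Algorithm 2 in R^2: the common element of
# VI(f, C) and F(S) for this illustrative data is u* = (0, 1). Here
# S = P_D with D = {x : x_1 >= x_0}, a nonexpansive map whose fixed point
# set D contains u*; f(x) = Ax + q is monotone and Lipschitz (K = sqrt(2)).

def proj_orthant(x):                 # C = {x : x_i >= 0}
    return [max(0.0, t) for t in x]

def proj_halfspace(w, a, b):         # T_k = {w : <a, w> <= b}
    nrm2 = sum(t * t for t in a)
    if nrm2 == 0.0:
        return list(w)
    excess = sum(ai * wi for ai, wi in zip(a, w)) - b
    if excess <= 0.0:
        return list(w)
    return [wi - excess * ai / nrm2 for wi, ai in zip(w, a)]

def S(x):
    # metric projection onto D = {x : x_1 >= x_0}
    if x[1] >= x[0]:
        return list(x)
    m = 0.5 * (x[0] + x[1])
    return [m, m]

def f(x):
    return [x[0] + x[1] - 1.0, -x[0] + x[1] - 1.0]

tau, beta = 0.4, 0.5                 # tau < 1/K; beta fixed in (0, 1)
v = [4.0, -2.0]
for _ in range(3000):
    g = f(v)
    y = proj_orthant([v[0] - tau * g[0], v[1] - tau * g[1]])
    a = [v[0] - tau * g[0] - y[0], v[1] - tau * g[1] - y[1]]
    b = sum(ai * yi for ai, yi in zip(a, y))
    gy = f(y)
    t = proj_halfspace([v[0] - tau * gy[0], v[1] - tau * gy[1]], a, b)
    st = S(t)
    # Krasnoselskii step: beta is fixed once, not recomputed per iteration
    v = [(1 - beta) * v[0] + beta * st[0],
         (1 - beta) * v[1] + beta * st[1]]
# v approaches the common element (0, 1) of VI(f, C) and F(S)
```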
4.3 A convergence theorem for a convex feasibility problem
 \({C_{5}}\) :

\(\mathcal{V}:= \bigcap_{i=1}^{\infty}F(T_{i})\cap \mathrm{VI}(f,C)\neq\emptyset\), where \(F(T_{i}):=\{x\in E:T_{i} x=x\}\), \(i\ge 1\).
Theorem 4.5
Proof
By Lemma 3.11, S is relatively nonexpansive and \(F(S)=\bigcap_{i=1}^{\infty}F(T_{i})\). Also, by Lemma 4.4, the result of Theorem 4.5 follows. □
Corollary 4.6
Proof
In a Hilbert space, J is the identity map and \(\phi (y,z)= \Vert yz \Vert ^{2}, \forall y,z\in H\). Thus, the conclusion follows from Theorem 4.5. □
5 Discussion

Theorem 4.3, which approximates a solution of a variational inequality problem, extends Theorem 5.1 of Censor et al. [6] from a Hilbert space to the more general uniformly smooth and 2-uniformly convex real Banach space with weakly sequentially continuous duality map.

Theorem 4.5, which approximates a common solution of a variational inequality problem and a common fixed point of a countable family of relatively nonexpansive maps, extends Theorem 7.1 of Censor et al. [6] from a Hilbert space to a uniformly smooth and 2-uniformly convex real Banach space with weakly sequentially continuous duality map, and from a single nonexpansive map to a countable family of relatively nonexpansive maps.

The control parameters in Algorithm 2 of Theorem 4.5 are two arbitrarily fixed constants \(\beta\in(0,1)\) and \(\tau \in(0,1)\) which are to be computed once and then used at each step of the iteration process, while the parameters in equation (1.5) studied by Censor et al. [6] are \({\alpha_{k}\in(0,1)}\) and \(\tau\in(0,1)\), and \({\alpha_{k}}\) is to be computed at each step of the iteration process. Consequently, the sequence of Algorithm 2 is of Krasnoselskii type and the sequence defined by equation (1.5) is of Mann type. It is well known that a Krasnoselskii-type sequence converges as fast as a geometric progression, which is slightly better than the convergence rate obtained from any Mann-type sequence.
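The speed difference between a fixed-β Krasnoselskii iteration and a Mann iteration with vanishing \(\alpha_{k}\) can be observed on a toy fixed-point problem. The map \(T(x)=\cos x\) and the parameter choices below are illustrative, unrelated to the paper's operators.

```python
import math

# Krasnoselskii iteration (beta fixed at 1/2) versus Mann iteration
# (alpha_k = 1/(k+2), vanishing) for the nonexpansive map T(x) = cos(x),
# whose unique fixed point is x* ~ 0.7390851332. Illustrative comparison only.

def T(x):
    return math.cos(x)

x_star = 0.7390851332151607

xk = xm = 5.0
for k in range(50):
    xk = 0.5 * xk + 0.5 * T(xk)                    # Krasnoselskii: fixed beta
    ak = 1.0 / (k + 2)
    xm = (1.0 - ak) * xm + ak * T(xm)              # Mann: vanishing alpha_k

# after 50 steps, the fixed-beta iterate is far closer to the fixed point
assert abs(xk - x_star) < abs(xm - x_star)
```

Near the fixed point the Krasnoselskii error contracts by a constant factor per step (geometric decay), while the Mann error decays only at the slower rate allowed by the vanishing \(\alpha_{k}\), which is the point made above.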
6 Conclusion
In this paper, we considered Krasnoselskii-type subgradient extragradient algorithms for approximating a common element of solutions of variational inequality problems and fixed points of a countable family of relatively nonexpansive maps in a uniformly smooth and 2-uniformly convex real Banach space. Weak convergence of the sequences generated by our algorithms is proved. Furthermore, the results obtained are applied in \(l_{p}\) spaces, \(1< p\le2\).
Notes
Declarations
Acknowledgements
Not applicable.
Availability of data and materials
Data sharing is not applicable to this article.
Funding
This work is supported by ACBF Research Grant Funds to AUST.
Authors’ contributions
All the authors contributed equally to the writing of this paper. They read and approved the final manuscript.
Competing interests
The authors declare that they have no conflict of interest.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Authors’ Affiliations
References
 Alber, Y.: Metric and generalized projection operators in Banach spaces: properties and applications. In: Kartsatos, A.G. (ed.) Theory and Applications of Nonlinear Operators of Accretive and Monotone Type, pp. 15–50. Dekker, New York (1996)
 Alber, Y., Ryazantseva, I.: Nonlinear Ill-Posed Problems of Monotone Type. Springer, London (2006)
 Anh, P.K., Hieu, D.V.: Parallel and sequential hybrid methods for a finite family of asymptotically quasi-ϕ-nonexpansive maps in a uniformly smooth and uniformly convex real Banach space. J. Appl. Math. Comput. (2014). https://doi.org/10.1007/s12190-014-0801-6
 Anh, P.K., Hieu, D.V.: Parallel hybrid methods for variational inequalities, equilibrium problems and common fixed point problems. Vietnam J. Math. (2014). https://doi.org/10.1007/s10013-015-0129-z
 Censor, Y., Gibali, A., Reich, S.: Two extensions of Korpelevich’s extragradient method for solving the variational inequality problem in Euclidean space. Technical report (2010)
 Censor, Y., Gibali, A., Reich, S.: The subgradient extragradient method for solving variational inequalities in Hilbert space. J. Optim. Theory Appl. 148, 318–335 (2011)
 Chidume, C.E.: Geometric Properties of Banach Spaces and Nonlinear Iterations. Lecture Notes in Mathematics, vol. 1965. Springer, London (2009)
 Cioranescu, I.: Geometry of Banach Spaces, Duality Mappings and Nonlinear Problems, vol. 62. Kluwer Academic, Norwell (1990)
 Dong, Q.L., Cho, Y.J., Zhong, L.L., Rassias, T.M.: Inertial projection and contraction algorithms for variational inequalities. J. Glob. Optim. (2017). https://doi.org/10.1007/s10898-017-0506-0
 Dong, Q.L., Hieu, D.V.: Modified subgradient extragradient method for variational inequality problems. Numer. Algorithms. https://doi.org/10.1007/s11075-017-0452-4
 Gang, C., Gibali, A., Olaniyi, S.I., Shehu, Y.: A new double-projection method for solving variational inequalities in Banach spaces. J. Optim. Theory Appl. (2018). https://doi.org/10.1007/s10957-018-1228-2
 Kamimura, S., Takahashi, W.: Strong convergence of a proximal-type algorithm in a Banach space. SIAM J. Optim. 13(3), 938–945 (2002)
 Kohsaka, F., Takahashi, W.: The set of common fixed points of an infinite family of relatively nonexpansive mappings in Banach and function spaces. In: Proceedings of the International Symposium on Banach and Function Spaces II, pp. 361–373. Yokohama Publishers, Yokohama (2008)
 Korpelevich, G.M.: The extragradient method for finding saddle points and other problems. Ekon. Mat. Metody 12, 747–756 (1976)
 Matsushita, S.Y., Takahashi, W.: A strong convergence theorem for relatively nonexpansive mappings in a Banach space. J. Approx. Theory 134, 257–266 (2005)
 Rockafellar, R.T.: On the maximality of sums of nonlinear monotone operators. Trans. Am. Math. Soc. 149, 75–88 (1970)
 Xu, H.K.: Inequalities in Banach spaces with applications. Nonlinear Anal., Theory Methods Appl. 16(12), 1127–1138 (1991)
 Yao, Y., Marino, G., Muglia, L.: A modified Korpelevich’s method convergent to the minimum-norm solution of a variational inequality. Optimization 63, 559–569 (2014)