Convergence theorems of subgradient extragradient algorithm for solving variational inequalities and a convex feasibility problem
Fixed Point Theory and Applications volume 2018, Article number: 16 (2018)
Abstract
Let C be a nonempty closed and convex subset of a uniformly smooth and 2-uniformly convex real Banach space E with dual space \(E^{*}\). In this paper, a Krasnoselskii-type subgradient extragradient iterative algorithm is constructed and used to approximate a common element of the set of solutions of a variational inequality problem and the set of common fixed points of a countable family of relatively nonexpansive maps. The theorems proved improve on the results of Censor et al. (J. Optim. Theory Appl. 148:318–335, 2011).
Introduction
Let E be a real normed space with dual space \(E^{*}\), and let C be a nonempty closed and convex subset of E. The variational inequality problem is to find an element \(v\in C\) such that

\(\langle y-v, f(v)\rangle\ge0, \quad \forall y\in C,\)  (1.1)

where \(f:E\rightarrow E^{*}\). The solution set of this variational inequality problem will be denoted by \(\mathrm{VI}(f, C)\). This problem has numerous applications in many areas of mathematics, such as partial differential equations, optimal control, optimization, mathematical programming, and other nonlinear problems (see, for example, [1] and the references contained therein). The map f is called K-Lipschitz and monotone if

\(\Vert f(x)-f(y) \Vert \le K \Vert x-y \Vert, \quad \forall x,y\in E,\)

and

\(\langle f(x)-f(y), x-y\rangle\ge0, \quad \forall x,y\in E,\)

respectively, where \(K>0\) is a Lipschitz constant, and it is called η-strongly monotone if there exists \(\eta>0\) such that

\(\langle f(x)-f(y), x-y\rangle\ge\eta \Vert x-y \Vert ^{2}, \quad \forall x,y\in E.\)
In the case that E is a real Hilbert space H, several iterative methods have been proposed and analyzed for solving the variational inequality problem (1.1). The simplest of them is the following projection method:

\(x^{k+1}=P_{C} (x^{k}-\tau f(x^{k}) ), \quad k\ge0,\)  (1.2)

where f is Lipschitz and η-strongly monotone with \(\tau\in (0,\frac{2\eta}{K^{2}} )\). Yao et al. [18] showed that the projection gradient method (1.2) may fail to converge if the strong monotonicity assumption is relaxed to plain monotonicity. To overcome this difficulty, Korpelevich [14] proposed the following extragradient method:

\(y^{k}=P_{C} (x^{k}-\tau f(x^{k}) ), \qquad x^{k+1}=P_{C} (x^{k}-\tau f(y^{k}) )\)  (1.3)
for each \(k\ge1\), which converges if f is monotone and Lipschitz. However, the weakness of this extragradient method is that two projections onto C must be computed in each iteration. It is known that if C is a general closed and convex set, this might require a prohibitive amount of computation time. To overcome this difficulty, Censor et al. [6] introduced the subgradient extragradient method

\(y^{k}=P_{C} (x^{k}-\tau f(x^{k}) ),\)
\(T_{k}:= \{w\in H:\langle x^{k}-\tau f(x^{k})-y^{k}, w-y^{k}\rangle\le0 \},\)
\(x^{k+1}=P_{T_{k}} (x^{k}-\tau f(y^{k}) ),\)  (1.4)
replacing one of the projections onto C in the extragradient method by a projection onto a specific, easily constructible subgradient half-space \(T_{k}\). This method has a computational advantage over the extragradient method of Korpelevich [14] (see, e.g., Censor et al. [5], Dong et al. [9] and the references contained in them). They proved the following theorem in a real Hilbert space.
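In the special case \(E=H=\mathbb{R}^{n}\) (so \(J=I\) and \(\Pi_{C}=P_{C}\)), method (1.4) is easy to prototype. The following is a minimal numerical sketch, not the authors' code: it assumes the monotone, 1-Lipschitz rotation map \(f(x)=Ax\) and the box \(C=[-1,1]^{2}\), for which \(\mathrm{VI}(f,C)=\{0\}\).

```python
import numpy as np

def proj_box(x, lo=-1.0, hi=1.0):
    # Euclidean projection onto the box C = [lo, hi]^n
    return np.clip(x, lo, hi)

def proj_halfspace(w, a, b):
    # Euclidean projection onto the half-space {w : <a, w> <= b}
    viol = a @ w - b
    return w - (viol / (a @ a)) * a if viol > 0 else w

def subgradient_extragradient(f, x0, tau, iters=500):
    # Method (1.4): one projection onto C, one onto the half-space T_k per step
    x = x0.astype(float)
    for _ in range(iters):
        y = proj_box(x - tau * f(x))
        a = x - tau * f(x) - y          # normal of T_k = {w : <a, w - y> <= 0}
        z = x - tau * f(y)
        x = proj_halfspace(z, a, a @ y) if a @ a > 1e-16 else z
    return x

A = np.array([[0.0, 1.0], [-1.0, 0.0]])  # rotation generator: monotone, K = 1
f = lambda x: A @ x                       # VI(f, C) = {0} for C = [-1, 1]^2
x_star = subgradient_extragradient(f, np.array([0.9, -0.7]), tau=0.5)
print(np.linalg.norm(x_star))             # norm of the final iterate: decays toward 0
```

Each iteration performs one projection onto C and one closed-form projection onto the half-space \(T_{k}\); the latter replaces the second projection onto C, which is the computational saving of the method.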
Theorem 1.1
(Censor et al., [6])
Assume that f is monotone, Lipschitz and \(\mathrm{VI}(f,C)\neq\emptyset\), with \(\tau<\frac{1}{K}\). Then any sequences \(\{x^{k}\}_{k=0}^{\infty}\) and \(\{y^{k}\}_{k=0}^{\infty}\) generated by (1.4) weakly converge to the same solution \(u^{*}\in \mathrm{VI}(f,C)\) and, furthermore, \(u^{*}=\lim_{k\rightarrow\infty} P_{\mathrm{VI}(f,C)}x^{k}\).
In addition, they introduced the following modified subgradient extragradient method, where S is a nonexpansive map:

\(y^{k}=P_{C} (x^{k}-\tau f(x^{k}) ),\)
\(T_{k}:= \{w\in H:\langle x^{k}-\tau f(x^{k})-y^{k}, w-y^{k}\rangle\le0 \},\)
\(x^{k+1}=\alpha_{k}x^{k}+(1-\alpha_{k})SP_{T_{k}} (x^{k}-\tau f(y^{k}) ), \quad \alpha_{k}\in(0,1),\)  (1.5)
and proved the following theorem in a real Hilbert space.
Theorem 1.2
(Censor et al., [6])
Assume that f is monotone, Lipschitz and \(\mathrm{VI}(f,C)\cap \operatorname{Fix}(S)\neq\emptyset\), with \(\tau<\frac {1}{K}\). Then any sequences \(\{x^{k}\}\) and \(\{y^{k}\}\) generated by (1.5) weakly converge to the same solution \(u^{*}\in \mathrm{VI}(f,C)\cap \operatorname{Fix}(S)\) and, furthermore, \(u^{*}=\lim_{k\rightarrow\infty} P_{\mathrm{VI}(f,C)\cap \operatorname{Fix}(S)}x^{k}\).
Developing algorithms for solving variational inequality problems has continued to attract the interest of numerous researchers in nonlinear operator theory. The reader may see the following important related papers (Gang et al. [11], Anh and Hieu [3], Anh and Hieu [4], Dong et al. [10] and the references contained in them).
Motivated by the result of Censor et al. [6], we propose in this paper a Krasnoselskii-type subgradient extragradient algorithm and prove a weak convergence theorem for obtaining a common element of the set of solutions of a variational inequality problem and the set of common fixed points of a countable family of relatively nonexpansive maps in a uniformly smooth and 2-uniformly convex real Banach space. Our theorem improves on the result of Censor et al. [6] and a host of other results (see Sect. 5 below).
Methods
The paper is organized as follows. Section 3 contains the preliminaries, including the definitions and lemmas, with corresponding references, that will be used in the sequel. Section 4 contains the main results of the paper. In Sect. 5, we compare our theorems with important recent results in the literature and, thereafter, state our conclusions.
Preliminaries
Let E be a real normed space with dual space \(E^{*}\). We shall write \(x_{k}\rightharpoonup x^{*}\) and \(x_{k}\rightarrow x^{*}\) to indicate that the sequence \(\{x_{k}\}\) converges weakly to \(x^{*}\) and converges strongly to \(x^{*}\), respectively.
A map \(J: E\rightarrow2^{E^{*}}\) defined by \(J(x):= \{x^{*}\in E^{*}: \langle x,x^{*}\rangle= \Vert x \Vert ^{2}= \Vert x^{*} \Vert ^{2} \}\) is called the normalized duality map on E. The following properties of the duality map will be needed in the sequel (see, e.g., Chidume [7], Cioranescu [8] and the references contained in them):

(1)
If E is a reflexive, strictly convex, and smooth real Banach space, then J is surjective, injective, and singlevalued.

(2)
If E is uniformly smooth, then J is uniformly continuous on bounded subsets of E.

(3)
If \(E=H\), a real Hilbert space, then J is the identity map on H.
Remark 1
J is weakly sequentially continuous if, for any sequence \(\{x_{k}\} \subset E\) with \(x_{k}\rightharpoonup x^{*}\) as \(k\rightarrow\infty\), we have \(Jx_{k}\rightharpoonup Jx^{*}\) as \(k\rightarrow\infty\). It is known that the normalized duality map on \(l_{p}\) spaces, \(1< p<\infty\), is weakly sequentially continuous.
Let E be a smooth real Banach space and \(\phi: E\times E\rightarrow\mathbb{R}\) be the map defined by \(\phi(x,y)= \Vert x \Vert ^{2}-2\langle x,Jy\rangle+ \Vert y \Vert ^{2}\) for all \(x,y \in E\). This map was introduced by Alber [1] and has been extensively studied by a host of other authors. It is easy to see from the definition of ϕ that, if \(E=H\), a real Hilbert space, then \(\phi(x,y)= \Vert x-y \Vert ^{2}\) for all \(x,y\in H\). Furthermore, for any \(x,y,z\in E\) and \(\beta\in(0,1)\), we have the following properties.
 (\(P_{1}\)):

\(( \Vert x \Vert - \Vert y \Vert )^{2}\le\phi(x,y)\le ( \Vert x \Vert + \Vert y \Vert )^{2}, \forall x,y \in E\).
 (\(P_{2}\)):

\(\phi(x,z)=\phi(x,y)+\phi(y,z)+2\langle y-x,Jz-Jy\rangle \).
 (\(P_{3}\)):

\(\phi (x,J^{-1} (\beta Jy +(1-\beta)Jz ) )\le \beta\phi (x,y)+(1-\beta)\phi(x,z)\).
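Since J is the identity on a real Hilbert space, ϕ collapses there to the squared distance, and properties \(P_{1}\)–\(P_{3}\) reduce to elementary identities that can be checked numerically. A small sketch of the Hilbert-space case only (not of the general Banach-space statements):

```python
import numpy as np

rng = np.random.default_rng(0)
x, y, z = rng.normal(size=(3, 4))   # three random vectors in R^4
beta = 0.3

def phi(u, v):
    # phi(u, v) = ||u||^2 - 2<u, Jv> + ||v||^2, with J = identity on a Hilbert space
    return u @ u - 2 * (u @ v) + v @ v

# In a Hilbert space phi collapses to the squared distance
assert np.isclose(phi(x, y), np.linalg.norm(x - y) ** 2)

# P1: (||x|| - ||y||)^2 <= phi(x, y) <= (||x|| + ||y||)^2
nx, ny = np.linalg.norm(x), np.linalg.norm(y)
assert (nx - ny) ** 2 <= phi(x, y) <= (nx + ny) ** 2

# P2: phi(x, z) = phi(x, y) + phi(y, z) + 2<y - x, Jz - Jy>
assert np.isclose(phi(x, z), phi(x, y) + phi(y, z) + 2 * (y - x) @ (z - y))

# P3: phi(x, J^{-1}(beta*Jy + (1-beta)*Jz)) <= beta*phi(x, y) + (1-beta)*phi(x, z)
assert phi(x, beta * y + (1 - beta) * z) <= beta * phi(x, y) + (1 - beta) * phi(x, z) + 1e-12
```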
Definition 3.1
Let C be a nonempty closed and convex subset of a real Banach space E and T be a map from C to E.

(a)
\(x^{*}\) is called an asymptotic fixed point of T if there exists a sequence \(\{x_{k}\}\subset C\) such that \(x_{k}\rightharpoonup x^{*}\) and \(\Vert Tx_{k}-x_{k} \Vert \rightarrow0\) as \(k\rightarrow\infty\). We shall denote the set of asymptotic fixed points of T by \(\widehat {F}(T)\).

(b)
T is called relatively nonexpansive if its fixed point set \(F(T)\) satisfies \(F(T)=\widehat {F}(T)\ne\emptyset\) and \(\phi(p,Tx)\le\phi(p,x)\) for all \(x\in C\), \(p\in F(T)\).
Definition 3.2
(Rockafellar, [16])
The normal cone of C at \(v\in C\), denoted by \(N_{C}(v)\), is given by \(N_{C}(v):=\{w\in E^{*}:\langle y-v,w\rangle\le 0, \forall y\in C\}\).
Definition 3.3
A map \(T:E\rightarrow2^{E^{*}}\) is called monotone if \(\langle \eta_{x}-\eta_{y},x-y\rangle\ge0, \forall x,y \in E\), \(\eta_{x}\in Tx\), \(\eta_{y}\in Ty\). Furthermore, T is maximal monotone if it is monotone and its graph \(G(T):=\{(x,y)\in E\times E^{*}: y\in T(x)\}\) is not properly contained in the graph of any other monotone operator.
Definition 3.4
A convex feasibility problem is a problem of finding a point in the intersection of convex sets.
Lemma 3.5
(Rockafellar, [16])
Let C be a nonempty closed and convex subset of a reflexive Banach space E. Let \(f:C\rightarrow E^{*}\) be a monotone and hemicontinuous map and \(T\subset E\times E^{*}\) be the map defined by

\(Tv:= \textstyle\begin{cases} f(v)+N_{C}(v), & v\in C,\\ \emptyset, & v\notin C. \end{cases}\)
Then T is maximal monotone and \(0\in Tv\) if and only if \(v\in \mathrm{VI}(f,C)\).
Remark 2
It is known that a monotone map T is maximal if, given \((x,y)\in E\times E^{*}\), the condition \(\langle x-u, y-v\rangle\ge0, \forall (u,v)\in G(T)\), implies \(y\in Tx\).
Lemma 3.6
(Matsushita and Takahashi, [15])
Let E be a smooth, strictly convex, and reflexive Banach space and C be a nonempty closed convex subset of E. Then the following hold:

(1)
\(\phi (x,\Pi_{C}y) +\phi(\Pi_{C}y,y)\le\phi(x,y), \forall x\in C, y\in E\).

(2)
\(z=\Pi_{C}x\iff\langle z-y,Jx-Jz\rangle\ge 0, \forall y\in C\).
Lemma 3.7
(Kamimura and Takahashi, [12])
Let E be a uniformly convex and uniformly smooth real Banach space and \(\{x_{n}\}_{n=1}^{\infty}, \{y_{n}\} _{n=1}^{\infty}\) be sequences in E such that either \(\{x_{n}\} _{n=1}^{\infty}\) or \(\{y_{n}\}_{n=1}^{\infty}\) is bounded. If \(\lim_{n\rightarrow\infty}\phi(x_{n},y_{n})=0\), then \(\lim_{n\rightarrow \infty} \Vert x_{n}-y_{n} \Vert =0\).
Lemma 3.8
(Xu, [17])
Let E be a uniformly convex real Banach space and \(r>0\). Then there exists a strictly increasing, continuous, and convex function \(g:[0,\infty)\rightarrow[0,\infty)\) with \(g(0)=0\) such that

\(\Vert \lambda x+(1-\lambda)y \Vert ^{2}\le\lambda \Vert x \Vert ^{2}+(1-\lambda) \Vert y \Vert ^{2}-\lambda(1-\lambda)g ( \Vert x-y \Vert ), \quad \forall x,y\in B_{r}(0),\)

where \(B_{r}(0):=\{v\in E: \Vert v \Vert \le r\}\) and \(\lambda\in[0,1]\).
Lemma 3.9
(Xu, [17])
Let E be a 2-uniformly convex real Banach space. Then there exists a constant \(c_{2}>0\) such that, for every \(x,y\in E\),

\(\langle x-y, Jx-Jy\rangle\ge c_{2} \Vert x-y \Vert ^{2}.\)
Lemma 3.10
(Xu, [17])
Let E be a 2-uniformly convex and smooth real Banach space. Then, for any \(x,y\in E\) and for some \(\alpha>0\),

\(\alpha \Vert x-y \Vert ^{2}\le\phi(x,y).\)

Without loss of generality, we may assume \(\alpha\in(0,1)\).
Lemma 3.11
(Kohsaka and Takahashi, [13])
Let C be a closed convex subset of a uniformly convex and uniformly smooth Banach space E. Let \(T_{i}: C\rightarrow E\), \(i=1,2,\ldots \) , be a countable family of relatively nonexpansive maps such that \(\bigcap_{i=1}^{\infty}F(T_{i})\neq\emptyset\). Suppose that \(\{\alpha_{i}\}_{i=1}^{\infty}\subset(0,1)\) and \(\{\beta_{i}\} _{i=1}^{\infty}\subset(0,1)\) are sequences such that \(\sum_{i=1}^{\infty }\alpha _{i}=1\), and \(U: C\rightarrow E\) is defined by

\(Ux:= J^{-1} (\sum_{i=1}^{\infty}\alpha_{i} (\beta_{i}Jx+(1-\beta_{i})JT_{i}x ) ), \quad x\in C;\)
then U is relatively nonexpansive and \(F(U)=\bigcap_{i=1}^{\infty}F(T_{i})\).
Main result
In the sequel, \(\alpha\in(0,1)\) is the constant appearing in Lemma 3.10.
The Krasnoselskii-type subgradient extragradient algorithm
Let E be a uniformly smooth and 2-uniformly convex real Banach space with dual space \(E^{*}\). Let C be a nonempty closed and convex subset of E, and let J be the normalized duality map on E.
Algorithm 1
Let \(\{v_{k}\}\) be a sequence generated iteratively by

\(v_{1}\in E,\)
\(y_{k}=\Pi_{C}J^{-1} (Jv_{k}-\tau f(v_{k}) ),\)
\(T_{k}:= \{w\in E:\langle w-y_{k}, Jv_{k}-\tau f(v_{k})-Jy_{k}\rangle\le0 \},\)
\(v_{k+1}=\Pi_{T_{k}}J^{-1} (Jv_{k}-\tau f(y_{k}) ), \quad k\ge1.\)

If \(v_{k}=y_{k}\), we stop. Otherwise, replace k by \(k+1\) and return to the first step of the algorithm.
We shall make the following assumptions.
 \({C_{1}}\) :

The map f is monotone on E.
 \({C_{2}}\) :

The map f is Lipschitz on E, with constant \(K>0\).
 \({C_{3}}\) :

\(\mathrm{VI}(f,C)\neq\emptyset\).
Lemma 4.1
If \(v_{k}=y_{k}\) in Algorithm 1, then \(v_{k}\in \mathrm{VI}(f,C)\).
Proof
If \(v_{k}=y_{k}\), then \(v_{k}= \Pi_{C}J^{-1} (Jv_{k}-\tau f(v_{k}) )\in C\). Furthermore, by the characterization of the generalized projection onto C (Lemma 3.6), we obtain that

\(\langle v_{k}-y, Jv_{k}- (Jv_{k}-\tau f(v_{k}) )\rangle\le0, \quad \forall y\in C,\)

that is, \(\langle y-v_{k}, f(v_{k})\rangle\ge0, \forall y\in C\).
Hence, \(v_{k}\in \mathrm{VI}(f,C)\). □
The following lemma is crucial for the proof of our main theorem.
Lemma 4.2
Let \(\{v_{k}\}_{k=1}^{\infty}\) be the sequence defined in Algorithm 1. Assume that conditions \(C_{1}, C_{2}\), and \(C_{3}\) hold with \(\tau\in(0,\frac{\alpha}{K})\). Then, for any \(v\in \mathrm{VI}(f,C)\), the following inequality holds:

\(\phi(v,v_{k+1})\le\phi(v,v_{k})- \Bigl(1-\frac{\tau K}{\alpha} \Bigr) \bigl[\phi(y_{k},v_{k})+\phi(v_{k+1},y_{k}) \bigr].\)
Proof
Let \(v\in \mathrm{VI}(f,C)\). Then we have that
Since \(v_{k+1}\in T_{k}\), we have that \(\langle v_{k+1}-y_{k},Jv_{k}-\tau f(v_{k})-Jy_{k} \rangle\le0, \forall k\ge1\). From the above inequality, we obtain that
Set \(Jz_{k}:= Jv_{k}-\tau f(y_{k})\). Then we compute as follows:
From inequality (4.3) and property \(P_{2}\), it follows that
From inequality (4.4), it follows that
Applying condition \(C_{2}\) and Lemma 3.10 in the above inequality, it follows that
This completes the proof. □
Theorem 4.3
Let E be a uniformly smooth and 2-uniformly convex real Banach space with dual space \(E^{*}\). Let C be a nonempty closed and convex subset of E and \(f:E\rightarrow E^{*}\) be a map satisfying conditions \(C_{1}\) and \(C_{2}\) with \(\tau\in(0,\frac {\alpha }{K})\). Assume that condition \(C_{3}\) holds and J is weakly sequentially continuous on E. Then the sequence \(\{v_{k}\}_{k=1}^{\infty}\) generated iteratively by Algorithm 1 converges weakly to some \(v^{*}\in \mathrm{VI}(f,C)\).
Proof
Since \(\mathrm{VI}(f,C)\neq\emptyset\), let \(v\in \mathrm{VI}(f,C)\). Define \(\gamma :=1-\frac{\tau K}{\alpha}\); then \(\gamma\in(0,1)\). By Lemma 4.2, \(\lim_{k\rightarrow\infty}\phi(v,v_{k})\) exists, \(\{\phi(y_{k},v_{k})\}\) is bounded, and
Taking limits on both sides of the above inequality, we have that \(\lim_{k\rightarrow\infty}\phi(y_{k},v_{k})=0\). By Lemma 3.7, \(\lim_{k\rightarrow\infty} \Vert y_{k}-v_{k} \Vert =0\).
Next, we show that \(\Omega_{\omega}(v_{k})\subset \mathrm{VI}(f,C)\), where \(\Omega _{\omega}(v_{k})\) is the set of weak subsequential limits of \(\{v_{k}\}\). Let \(x^{*}\in\Omega_{\omega}(v_{k})\) and let \(\{v_{k_{j}}\}_{j=1}^{\infty}\) be a subsequence of \(\{v_{k}\}_{k=1}^{\infty}\) such that \(v_{k_{j}}\rightharpoonup x^{*}\) as \(j\rightarrow\infty\).
Let \(T:E\rightarrow 2^{E^{*}}\) be the map defined by

\(Tv:= \textstyle\begin{cases} f(v)+N_{C}(v), & v\in C,\\ \emptyset, & v\notin C, \end{cases}\)

where \(N_{C}(v)\) is the normal cone to C at \(v\in C\). Then T is maximal monotone and \({T^{-1}(0)=\mathrm{VI}(f,C)}\) (Rockafellar [16]). Let \((v,w)\in G(T)\), where \(G(T)\) is the graph of T. Then \(w\in Tv=f(v) +N_{C}(v)\). Hence, we get that \(w-f(v)\in N_{C}(v)\). This implies that \(\langle v-t,w-f(v)\rangle\ge0, \forall t\in C\). In particular,

\(\langle v-y_{k},w-f(v)\rangle\ge0, \quad \forall k\ge1.\)
Furthermore, \(y_{k}=\Pi_{C}J^{-1} (Jv_{k}-\tau f(v_{k}) ), \forall k\ge1\). By the characterization of the generalized projection map, we obtain that
This implies that
Using inequalities (4.8) and (4.10), the Cauchy–Schwarz inequality, and condition \(C_{2}\), we have, for some \(M_{0}> 0\), that
Taking limits on both sides of inequality (4.11) and using the fact that J is uniformly continuous on bounded subsets of E, we obtain that \(\langle v-x^{*}, w\rangle\ge0, \forall (v,w)\in G(T)\).
Since T is a maximal monotone operator, it follows that \(x^{*}\in T^{1}(0)=\mathrm{VI}(f,C)\), which implies that \(\Omega_{\omega}(v_{k})\subset \mathrm{VI}(f,C)\).
Now, we show that \(v_{k}\rightharpoonup x^{*}\) as \(k\rightarrow\infty\). Define \(x_{k}:=\Pi_{\mathrm{VI}(f,C)}v_{k}\). Then \(\{x_{k}\}\subset \mathrm{VI}(f,C)\). Furthermore, by Lemmas 4.2 and 3.6, we have that
which implies that \(\{\phi(x_{k},v_{k})\}\) converges. From inequality (4.13) and for any \(m>k\), we have that
Furthermore, \(\lim_{k,m\rightarrow\infty}\phi(x_{k},x_{m})=0\). Hence, by Lemma 3.7, we obtain that \(\lim_{k,m\rightarrow \infty } \Vert x_{k}-x_{m} \Vert =0\), which implies that \(\{x_{k}\}\) is a Cauchy sequence in \(\mathrm{VI}(f,C)\). Therefore, there exists \(u^{*}\in \mathrm{VI}(f,C)\) such that \(\lim_{k\rightarrow\infty}x_{k}=u^{*}\).
Now, using the definition of \(x_{k}=\Pi_{\mathrm{VI}(f,C)}v_{k}, \forall k\ge 0\), it follows from Lemma 3.6 that for any \(p\in \mathrm{VI}(f,C)\), we have that
Let \(\{v_{k_{i}}\}\) be any subsequence of \(\{v_{k}\}\). We may assume without loss of generality that \(\{v_{k_{i}}\}\) converges weakly to some \(p^{*}\in \mathrm{VI}(f,C)\). By inequality (4.15), weak sequential continuity of J, and the fact that \(\lim_{k\rightarrow\infty} x_{k}=u^{*}\), we obtain that
However, from the monotonicity of J, we obtain that
Combining inequalities (4.16) and (4.17), we have that
By Lemma 3.9, we obtain that
which implies that \(u^{*}=p^{*}\). Hence, \(v_{k}\rightharpoonup u^{*}=\lim_{k\rightarrow\infty} x_{k}\). This completes the proof. □
The modified Krasnoselskii-type subgradient extragradient algorithm
Algorithm 2
Let \(\{v_{k}\}_{k=1}^{\infty}\) be a sequence generated iteratively by

\(v_{1}\in E,\)
\(y_{k}=\Pi_{C}J^{-1} (Jv_{k}-\tau f(v_{k}) ),\)
\(T_{k}:= \{w\in E:\langle w-y_{k}, Jv_{k}-\tau f(v_{k})-Jy_{k}\rangle\le0 \},\)
\(v_{k+1}=J^{-1} (\beta Jv_{k}+(1-\beta)JS\Pi_{T_{k}}J^{-1} (Jv_{k}-\tau f(y_{k}) ) ), \quad k\ge1.\)
We shall make the following assumption.
 \({C_{4}}\) :

\(\mathcal{G}:=\mathrm{VI}(f,C)\cap F(S)\neq \emptyset\), where \(F(S)\) is the set of fixed points of S.
The following lemma is crucial for the proof of the next theorem.
Lemma 4.4
Let E be a uniformly smooth and 2-uniformly convex real Banach space with dual space \(E^{*}\). Let C be a nonempty closed and convex subset of E. Let \(S:E\rightarrow E\) be a relatively nonexpansive map and \(f:E\rightarrow E^{*}\) be a map satisfying conditions \(C_{1}\) and \(C_{2}\) with \(\tau\in(0,\frac{\alpha }{K})\), and let \(\beta\in(0,1)\). Assume that condition \(C_{4}\) holds and J is weakly sequentially continuous on E. Then the sequence \(\{v_{k}\} _{k=1}^{\infty}\) generated iteratively by Algorithm 2 converges weakly to some \(v^{*}\in\mathcal{G}\).
Proof
Denote \(t_{k}:=\Pi_{T_{k}}J^{-1}(Jv_{k}-\tau f(y_{k})), \forall k\ge1\), \(Jz_{k}:=Jv_{k}-\tau f(y_{k})\), and \(\gamma :=1-\frac {\tau K}{\alpha}\).
Since \(\mathcal{G}\neq \emptyset \), let \(u\in\mathcal{G}\). Then we have that
By \(C_{1}\), \(\langle u-y_{k}, f(y_{k})-f(u)\rangle\le0, \forall k\ge1\), and, since \(u\in \mathrm{VI}(f,C)\) and \(y_{k}\in C\), \(\langle u-y_{k},f(u)\rangle\le0, \forall k\ge1\). Consequently, \(\langle u-y_{k},f(y_{k})\rangle\le0, \forall k\ge1\). Thus, from the last line of the above inequality and by inequality (4.4), we obtain that
By condition \(C_{2}\) and Lemma 3.10, we have that
Applying Lemma 3.8, inequality (4.21), and relative nonexpansivity of S, we obtain that
This implies that \(\lim_{k\rightarrow\infty}\phi (u,v_{k})\) exists. Consequently, \(\{v_{k}\}_{k=1}^{\infty}\) is bounded. From inequality (4.21), \(\{t_{k}\}_{k=1}^{\infty}\) is bounded. Also, from inequality (4.22), we obtain that
From these inequalities, we obtain that
By Lemma 3.7, it follows that \(\lim_{k\rightarrow\infty} \Vert y_{k}-v_{k} \Vert =0\) and \(\lim_{k\rightarrow\infty} \Vert t_{k}-y_{k} \Vert =0\). Consequently, we obtain \(\lim_{k\rightarrow\infty} \Vert v_{k}-t_{k} \Vert =0\).
Next, we show that \(\Omega_{\omega}(v_{k})\subset\mathcal{G}=F(S)\cap \mathrm{VI}(f,C)\), where \(\Omega_{\omega}(v_{k})\) is the set of weak subsequential limits of \(\{v_{k}\}\). Let \(x^{*}\in\Omega_{\omega}(v_{k})\) and let \(\{v_{k_{j}}\} _{j=1}^{\infty}\) be a subsequence of \(\{v_{k}\}_{k=1}^{\infty}\) such that \(v_{k_{j}}\rightharpoonup x^{*}\) as \(j\rightarrow\infty\).
By definition of S, \(\{St_{k}\}_{k=1}^{\infty}\) is bounded. From inequalities (4.22) and (4.23), we have that
Applying the property of g, we obtain that
By the uniform continuity of \(J^{-1}\) on bounded subsets of \(E^{*}\), we get that
so that
which implies that \(Sx^{*}=x^{*}\). Hence, \(x^{*}\in F(S)\).
Next, we show that \(x^{*}\in \mathrm{VI}(f,C)\). Following the same line of argument as in the proof of Theorem 4.3, we have that \(x^{*}\in \mathrm{VI}(f,C)\), and this implies that \(\Omega_{\omega}(v_{k})\subset \mathcal{G} \).
Define \(x_{k}:=\Pi_{\mathcal{G}}v_{k}\). Then \(\{x_{k}\}\subset\mathcal{G}\). Now, following the same line of argument as in the proof of Theorem 4.3, we obtain that \(u^{*}=p^{*}\). Hence, \(v_{k}\rightharpoonup u^{*}=\lim_{k\rightarrow\infty}x_{k}\). This completes the proof. □
A convergence theorem for a convex feasibility problem
In what follows, we shall make the following assumption.
 \({C_{5}}\) :

\(\mathcal{V}:= \bigcap_{i=1}^{\infty}F(T_{i})\cap \mathrm{VI}(f,C)\neq\emptyset\), where \(F(T_{i}):=\{x\in E:T_{i} x=x\}\), \(i\ge 1\).
Theorem 4.5
Let E be a uniformly smooth and 2-uniformly convex real Banach space with dual space \(E^{*}\). Let C be a nonempty closed and convex subset of E. Let \({T_{i}:E\rightarrow E,}\) \(i=1,2,\ldots \) , be a countable family of relatively nonexpansive maps and \({f:E\rightarrow E^{*}}\) be a map satisfying conditions \(C_{1}\) and \(C_{2}\) with \(\tau\in(0,\frac {\alpha }{K})\), and let \({\beta\in(0,1)}\). Assume that condition \(C_{5}\) holds and J is weakly sequentially continuous on E. Then the sequence \(\{ v_{k}\}_{k=1}^{\infty}\) generated iteratively by Algorithm 2 converges weakly to some \(v^{*}\in\mathcal{V}\), where

\(Sx:= J^{-1} (\sum_{i=1}^{\infty}\delta_{i} (\gamma_{i}Jx+(1-\gamma_{i})JT_{i}x ) ),\)

with \(\{\delta_{i}\}_{i=1}^{\infty}, \{\gamma_{i}\}_{i=1}^{\infty}\subset(0,1)\) and \(\sum_{i=1}^{\infty}\delta_{i}=1\).
Proof
By Lemma 3.11, S is relatively nonexpansive and \(F(S)=\bigcap_{i=1}^{\infty}F(T_{i})\). Also, by Lemma 4.4, the result of Theorem 4.5 follows. □
Corollary 4.6
Let H be a real Hilbert space, and let C be a nonempty closed and convex subset of H. Let \(T_{i}:H\rightarrow H\), \(i=1,2,\ldots \) , be a countable family of nonexpansive maps and \(f:H\rightarrow H\) be a monotone and K-Lipschitz map. Let the sequence \(\{v_{k}\}_{k=1}^{\infty}\) be generated iteratively by

\(v_{1}\in H,\)
\(y_{k}=P_{C} (v_{k}-\tau f(v_{k}) ),\)
\(T_{k}:= \{w\in H:\langle w-y_{k}, v_{k}-\tau f(v_{k})-y_{k}\rangle\le0 \},\)
\(v_{k+1}=\beta v_{k}+(1-\beta)SP_{T_{k}} (v_{k}-\tau f(y_{k}) ), \quad k\ge1.\)

Assume that \(C_{1}, C_{2}\), and \(C_{5}\) hold with \(\tau\in (0,\frac{1}{K})\), and let \(\beta\in(0,1)\). Then \(\{v_{k}\}_{k=1}^{\infty}\) converges weakly to some \(v^{*}\in\mathcal{V}:= \bigcap_{i=1}^{\infty}F(T_{i})\cap \mathrm{VI}(f,C)\), where \(Sx= \sum_{i=1}^{\infty}\delta_{i} (\gamma_{i}x+(1-\gamma_{i})T_{i}x )\), \(\sum_{i=1}^{\infty}\delta_{i}=1\), \(\{\delta_{i}\}_{i=1}^{\infty}\subset(0,1)\), and \(\{\gamma_{i}\} _{i=1}^{\infty}\subset(0,1)\).
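The Hilbert-space iteration of Corollary 4.6 can be sketched numerically. The following is a minimal illustration, not the authors' code: it assumes, for simplicity, a single nonexpansive map \(T_{1}x=\frac{1}{2}x\) (so \(\delta_{1}=1\)), the monotone, 1-Lipschitz rotation map \(f(x)=Ax\), and the box \(C=[-1,1]^{2}\), for which the common solution is \(v^{*}=0\).

```python
import numpy as np

def proj_box(x, lo=-1.0, hi=1.0):
    return np.clip(x, lo, hi)          # projection onto C = [lo, hi]^n

def proj_halfspace(w, a, b):
    viol = a @ w - b                   # projection onto {w : <a, w> <= b}
    return w - (viol / (a @ a)) * a if viol > 0 else w

A = np.array([[0.0, 1.0], [-1.0, 0.0]])
f = lambda x: A @ x                    # monotone, K = 1, VI(f, C) = {0}
T1 = lambda x: 0.5 * x                 # nonexpansive, F(T1) = {0}
gamma1 = 0.4
S = lambda x: gamma1 * x + (1 - gamma1) * T1(x)   # delta_1 = 1 in the averaging

tau, beta = 0.5, 0.5
v = np.array([0.8, -0.6])
for _ in range(300):
    y = proj_box(v - tau * f(v))
    a = v - tau * f(v) - y             # normal of the half-space T_k
    t = v - tau * f(y)
    if a @ a > 1e-16:
        t = proj_halfspace(t, a, a @ y)
    v = beta * v + (1 - beta) * S(t)   # Krasnoselskii-type averaging step
print(np.linalg.norm(v))               # decays toward the common solution v* = 0
```

Note that β and τ are fixed once before the loop, in contrast to the Mann-type scheme (1.5), where \(\alpha_{k}\) changes at every step.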
Proof
In a Hilbert space, J is the identity map and \(\phi (y,z)= \Vert yz \Vert ^{2}, \forall y,z\in H\). Thus, the conclusion follows from Theorem 4.5. □
Discussion
All the theorems of this paper are applicable in \(l_{p}\) spaces, \(1< p\le2\), since these spaces are uniformly smooth and 2-uniformly convex, and on these spaces the normalized duality map is weakly sequentially continuous. The analytical representation of the duality map on \(l_{p}\), where \({p^{-1} + q^{-1} =1}\) (see, e.g., Theorem 4.3 of Alber and Ryazantseva [2], p. 36), is

\(Jx= \Vert x \Vert _{l_{p}}^{2-p} ( \vert x_{1} \vert ^{p-2}x_{1}, \vert x_{2} \vert ^{p-2}x_{2},\ldots )\in l_{q}, \quad x=(x_{1},x_{2},\ldots)\in l_{p}.\)
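This representation can be verified numerically on finitely supported sequences. A small sketch checking the defining identities \(\langle x,Jx\rangle=\Vert x\Vert_{p}^{2}\) and \(\Vert Jx\Vert_{q}=\Vert x\Vert_{p}\), assuming the componentwise formula \((Jx)_{i}=\Vert x\Vert_{p}^{2-p}\vert x_{i}\vert^{p-2}x_{i}\):

```python
import numpy as np

p = 1.5
q = p / (p - 1)            # conjugate exponent: 1/p + 1/q = 1

def J_lp(x, p):
    # (Jx)_i = ||x||_p^{2-p} |x_i|^{p-2} x_i  (componentwise; zeros map to zero)
    norm = np.linalg.norm(x, ord=p)
    with np.errstate(divide="ignore", invalid="ignore"):
        y = np.where(x != 0.0, np.abs(x) ** (p - 2) * x, 0.0)
    return norm ** (2 - p) * y

x = np.array([0.5, -1.2, 0.0, 2.0])
jx = J_lp(x, p)

# Defining identities of the normalized duality map: <x, Jx> = ||x||^2 = ||Jx||^2
assert np.isclose(x @ jx, np.linalg.norm(x, ord=p) ** 2)
assert np.isclose(np.linalg.norm(jx, ord=q), np.linalg.norm(x, ord=p))
```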

Theorem 4.3, which approximates a solution of a variational inequality problem, extends Theorem 5.1 of Censor et al. [6] from a Hilbert space to the more general uniformly smooth and 2-uniformly convex real Banach space with weakly sequentially continuous duality map.

Theorem 4.5, which approximates a common solution of a variational inequality problem and a common fixed point of a countable family of relatively nonexpansive maps, extends Theorem 7.1 of Censor et al. [6] from a Hilbert space to a uniformly smooth and 2-uniformly convex real Banach space with weakly sequentially continuous duality map, and from a single nonexpansive map to a countable family of relatively nonexpansive maps.

The control parameters in Algorithm 2 of Theorem 4.5 are two arbitrarily fixed constants \(\beta\in(0,1)\) and \(\tau \in(0,1)\), which are computed once and then used at each step of the iteration process, while the parameters in equation (1.5) studied by Censor et al. [6] are \({\alpha_{k}\in(0,1)}\) and \(\tau\in(0,1)\), and \({\alpha_{k}}\) must be computed at each step of the iteration process. Consequently, the sequence of Algorithm 2 is of Krasnoselskii type, while the sequence defined by equation (1.5) is of Mann type. It is well known that a Krasnoselskii-type sequence converges as fast as a geometric progression, which is slightly better than the convergence rate obtained from any Mann-type sequence.
Conclusion
In this paper, we considered Krasnoselskii-type subgradient extragradient algorithms for approximating a common element of the set of solutions of a variational inequality problem and the set of fixed points of a countable family of relatively nonexpansive maps in a uniformly smooth and 2-uniformly convex real Banach space. Weak convergence of the sequence generated by our algorithm is proved. Furthermore, the results obtained are applicable in \(l_{p}\) spaces, \(1< p\le2\).
References
 1.
Alber, Y.: Metric and generalized projection operators in Banach spaces: properties and applications. In: Kartsatos, A.G. (ed.) Theory and Applications of Nonlinear Operators of Accretive and Monotone Type, pp. 15–50. Dekker, New York (1996)
 2.
Alber, Y., Ryazantseva, I.: Nonlinear Ill-Posed Problems of Monotone Type. Springer, London (2006)
 3.
Anh, P.K., Hieu, D.V.: Parallel and sequential hybrid methods for a finite family of asymptotically quasi-ϕ-nonexpansive maps in a uniformly smooth and uniformly convex real Banach space. J. Appl. Math. Comput. (2014). https://doi.org/10.1007/s12190-014-0801-6
 4.
Anh, P.K., Hieu, D.V.: Parallel hybrid methods for variational inequalities, equilibrium problems and common fixed point problems. Vietnam J. Math. (2014). https://doi.org/10.1007/s10013-015-0129-z
 5.
Censor, Y., Gibali, A., Reich, S.: Two extensions of Korpelevich’s extragradient method for solving the variational inequality problem in Euclidean space. Technical report (2010)
 6.
Censor, Y., Gibali, A., Reich, S.: The subgradient extragradient method for solving variational inequalities in Hilbert space. J. Optim. Theory Appl. 148, 318–335 (2011)
 7.
Chidume, C.E.: Geometric Properties of Banach Spaces and Nonlinear Iterations. Lecture Notes in Mathematics, vol. 1965. Springer, London (2009)
 8.
Cioranescu, I.: Geometry of Banach Spaces, Duality Mappings and Nonlinear Problems, vol. 62. Kluwer Academic, Norwell (1990)
 9.
Dong, Q.L., Cho, Y.J., Zhong, L.L., Rassias, T.M.: Inertial projection and contraction algorithms for variational inequalities. J. Glob. Optim. (2017). https://doi.org/10.1007/s10898-017-0506-0
 10.
Dong, Q.L., Hieu, D.V.: Modified subgradient extragradient method for variational inequality problems. Numer. Algor. https://doi.org/10.1007/s11075-017-0452-4
 11.
Gang, C., Gibali, A., Olaniyi, S.I., Shehu, Y.: A new double-projection method for solving variational inequalities in Banach spaces. J. Optim. Theory Appl. (2018). https://doi.org/10.1007/s10957-018-1228-2
 12.
Kamimura, S., Takahashi, W.: Strong convergence of a proximaltype algorithm in a Banach space. SIAM J. Optim. 13(3), 938–945 (2002)
 13.
Kohsaka, F., Takahashi, W.: The set of common fixed points of an infinite family of relatively nonexpansive mappings in Banach and function spaces. In: Proceedings of the International Symposium on Banach and Function Spaces II, pp. 361–373. Yokohama Publishers, Yokohama (2008)
 14.
Korpelevich, G.M.: The extragradient method for finding saddle points and other problems. Ekon. Mat. Metody 12, 747–756 (1976)
 15.
Matsushita, S.Y., Takahashi, W.: A strong convergence theorem for relatively nonexpansive mappings in a Banach space. J. Approx. Theory 134, 257–266 (2005)
 16.
Rockafellar, R.T.: On the maximality of sums of nonlinear monotone operators. Trans. Am. Math. Soc. 149, 75–88 (1970)
 17.
Xu, H.K.: Inequalities in Banach spaces with applications. Nonlinear Anal., Theory Methods Appl. 16(12), 1127–1138 (1991)
 18.
Yao, Y., Marino, G., Muglia, L.: A modified Korpelevich’s method convergent to the minimum-norm solution of a variational inequality. Optimization 63, 559–569 (2014)
Acknowledgements
Not applicable.
Availability of data and materials
Data sharing is not applicable to this article.
Funding
This work is supported by ACBF Research Grant Funds to AUST.
Author information
Contributions
All the authors contributed evenly in the writing of this paper. They read and approved the final manuscript.
Corresponding author
Correspondence to C. E. Chidume.
Ethics declarations
Competing interests
The authors declare that they have no conflict of interest.
Additional information
Dedicated to Professor H. K. Xu for his contributions in nonlinear operator theory.
Abbreviations
Not applicable.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Chidume, C.E., Nnakwe, M.O.: Convergence theorems of subgradient extragradient algorithm for solving variational inequalities and a convex feasibility problem. Fixed Point Theory Appl. 2018, 16 (2018). https://doi.org/10.1186/s13663-018-0641-4
MSC
 47H09
 47H10
 47J25
 47J05
 47J20
Keywords
 Subgradient extragradient algorithm
 Variational inequality
 Relatively nonexpansive maps