Compositions and convex combinations of asymptotically regular firmly nonexpansive mappings are also asymptotically regular
Fixed Point Theory and Applications volume 2012, Article number: 53 (2012)
Abstract
Because of Minty's classical correspondence between firmly nonexpansive mappings and maximally monotone operators, the notion of a firmly nonexpansive mapping has proven to be of basic importance in fixed point theory, monotone operator theory, and convex optimization. In this note, we show that if finitely many firmly nonexpansive mappings defined on a real Hilbert space are given and each of these mappings is asymptotically regular, which is equivalent to saying that they have or "almost have" fixed points, then the same is true for their composition. This significantly generalizes the result by Bauschke from 2003 for the case of projectors (nearest point mappings). The proof resides in a Hilbert product space and relies upon the Brezis–Haraux range approximation result. By working in a suitably scaled Hilbert product space, we also establish the asymptotic regularity of convex combinations.
2010 Mathematics Subject Classification: Primary 47H05, 47H09; Secondary 47H10, 90C25.
1 Introduction and standing assumptions
Throughout this article, X is a real Hilbert space with inner product 〈·, ·〉 and induced norm ‖·‖. We assume that m ∈ {2, 3, 4, ...} and I := {1, 2, ..., m}.
Recall that an operator T: X → X is firmly nonexpansive (see, e.g., [1–3] for further information) if (∀x ∈ X)(∀y ∈ X) ‖Tx − Ty‖² ≤ 〈x − y, Tx − Ty〉, and that a set-valued operator A: X ⇉ X is maximally monotone if it is monotone, i.e., for all (x, x*) and (y, y*) in the graph of A we have 〈x − y, x* − y*〉 ≥ 0, and if the graph of A cannot be properly enlarged without destroying monotonicity. (We shall write dom A = {x ∈ X | Ax ≠ Ø} for the domain of A, ran A = A(X) = ∪_{x∈X} Ax for the range of A, and gr A for the graph of A.) These notions are equivalent (see [4, 5]) in the sense that if A is maximally monotone, then its resolvent J_{A} := (Id + A)^{−1} is firmly nonexpansive, and if T is firmly nonexpansive, then T^{−1} − Id is maximally monotone. (Here and elsewhere, Id denotes the identity operator on X.) The Minty parametrization (see [4] and also [[1], Remark 23.22(ii)]) states that if A is maximally monotone, then
In optimization, one main problem is to find zeros of maximally monotone operators; these zeros may correspond to critical points or to solutions of optimization problems. In terms of resolvents, the corresponding problem is that of finding fixed points. For background material in fixed point theory and monotone operator theory, we refer the reader to [1–3, 6–16].
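As a concrete, numerical illustration of the firm nonexpansiveness inequality above (this sketch is ours, not part of the original argument), one can test ‖Tx − Ty‖² ≤ 〈x − y, Tx − Ty〉 for the projector onto the closed unit ball of ℝ², a standard example of a firmly nonexpansive mapping; the sampling range and tolerance are arbitrary illustrative choices:

```python
import math
import random

def inner(u, v):
    return u[0] * v[0] + u[1] * v[1]

def proj_unit_ball(x):
    """Projector onto the closed unit ball of R^2 (a firmly nonexpansive map)."""
    n = math.hypot(x[0], x[1])
    return x if n <= 1.0 else (x[0] / n, x[1] / n)

random.seed(0)
for _ in range(1000):
    x = (random.uniform(-5, 5), random.uniform(-5, 5))
    y = (random.uniform(-5, 5), random.uniform(-5, 5))
    Tx, Ty = proj_unit_ball(x), proj_unit_ball(y)
    d = (Tx[0] - Ty[0], Tx[1] - Ty[1])
    # firm nonexpansiveness: ||Tx - Ty||^2 <= <x - y, Tx - Ty>
    assert inner(d, d) <= inner((x[0] - y[0], x[1] - y[1]), d) + 1e-12
```

The same check applied to a resolvent J_A = (Id + A)^{−1} of any maximally monotone A would succeed as well, in line with the Minty correspondence.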
The aim of this note is to provide approximate fixed point results for compositions and convex combinations of finitely many firmly nonexpansive operators.
The first main result (Theorem 4.6) substantially extends a result by Bauschke [17] on the compositions of projectors to the composition of firmly nonexpansive mappings. The second main result (Theorem 5.5) extends a result by Bauschke, Moffat and Wang [18] on the convex combination of firmly nonexpansive operators from Euclidean to Hilbert space.
The remainder of this section provides the standing assumptions used throughout the article.
Even though the main results are formulated in the given Hilbert space X, it will turn out that the key space to work in is the product space
This product space contains an embedding of the original space X via the diagonal subspace
We also assume that we are given m firmly nonexpansive operators T_{1},..., T_{ m }; equivalently, m resolvents of maximally monotone operators A_{1},..., A_{ m } :
We now define various pertinent operators acting on X^{m} . We start with the Cartesian product operators
and
Denoting the identity on X^{m} by Id, we observe that
Of central importance will be the cyclic right-shift operator
and for convenience we set
We also fix strictly positive convex coefficients (or weights) (λ_{ i } )_{i∈I}, i.e.,
Let us make X^{m} into the Hilbert product space
The orthogonal complement of Δ with respect to this standard inner product is known (see, e.g., [[1], Proposition 25.4(i)]) to be
Finally, given a nonempty closed convex subset C of X, the projector (nearest point mapping) onto C is denoted by P_{ C } . It is well known to be firmly nonexpansive.
2 Properties of the operator M
In this section, we collect several useful properties of the operator M, including its Moore–Penrose inverse (see [19] and, e.g., [[1], Section 3.2] for further information). To that end, the following result, which is probably part of the folklore, will turn out to be useful.
Proposition 2.1 Let Y be a real Hilbert space and let B be a continuous linear operator from X to Y with adjoint B* and such that ran B is closed. Then the Moore–Penrose inverse of B satisfies
Proof. Take y ∈ Y. Define the corresponding set of least squares solutions (see, e.g., [[1], Proposition 3.25]) by C := B^{−1}(P_{ran B}y). Since ran B is closed, so is ran B* (see, e.g., [[1], Corollary 15.34]); hence,^{a} U := (ker B)^{⊥} = cl(ran B*) = ran B*. Thus, C = B^{†}y + ker B = B^{†}y + U^{⊥}. Therefore, since ran B^{†} = ran B* (see, e.g., [[1], Proposition 3.28(v)]), P_{U}(C) = P_{U}B^{†}y = B^{†}y, as claimed.
Before we present various useful properties of M, let us recall the notion of a rectangular (also known as star or 3*-monotone, see [20]) operator. A monotone operator B: X ⇉ X is rectangular if (∀(x, y*) ∈ dom B × ran B) sup_{(z, z*) ∈ gr B} 〈x − z, z* − y*〉 < +∞.
Theorem 2.2 Define ^{b}
Then the following hold.

(i)
M is continuous, linear, and maximally monotone with dom M = X.

(ii)
M is rectangular.

(iii)
ker M = ker M* = Δ.

(iv)
ran M = ran M* = Δ^{⊥} is closed.

(v)
ran L = Δ^{⊥}.

(vi)
M ∘ L = Id|_{Δ^{⊥}}.

(vii)
M^{−1}: X ⇉ X: y ↦ Ly + Δ, if y ∈ Δ^{⊥}; Ø, otherwise.

(viii)
M^{†} = P_{Δ^{⊥}} ∘ L ∘ P_{Δ^{⊥}} = L ∘ P_{Δ^{⊥}}.

(ix)
M^{†} = ∑_{k=1}^{m} ((m − (2k − 1))/(2m)) R^{k−1}.
Proof. (i): Clearly, dom M = X and (∀x ∈ X) ‖Rx‖ = ‖x‖. Thus, R is nonexpansive and therefore M = Id − R is maximally monotone (see, e.g., [[1], Example 20.27]).
(ii): See [[1], Example 24.14] and [[17], Step 3 in the proof of Theorem 3.1] for two different proofs of the rectangularity of M.
(iii): The definitions of M and R and the fact that R* is the cyclic left-shift operator readily imply that ker M = ker M* = Δ.
(iv), (vi), and (vii): Let y = (y_{1}, ..., y_{m}) ∈ X. Assume first that y ∈ ran M. Then there exists x = (x_{1}, ..., x_{m}) such that y_{1} = x_{1} − x_{m}, y_{2} = x_{2} − x_{1}, ..., and y_{m} = x_{m} − x_{m−1}. It follows that ∑_{i∈I} y_{i} = 0, i.e., y ∈ Δ^{⊥} by [[1], Proposition 25.4(i)]. Thus,
Conversely, assume now that y ∈ Δ^{┴}. Now set
It will be notationally convenient to wrap indices around, i.e., y_{m+1} = y_{1}, y_{0} = y_{m}, and likewise. We then get
Therefore,
Thus x ∈ Δ^{┴} and
Furthermore,
Hence Mx = x − Rx = y and thus y ∈ ran M. Moreover, in view of (iii),
We thus have shown
Combining (17) and (24), we obtain ran M = Δ^{⊥}. We thus have verified (vi) and (vii). Since ran M is closed, so is ran M* (by, e.g., [[1], Corollary 15.34]). Thus (iv) holds.
(viii)&(v): We have seen in Proposition 2.1 that
Now let z ∈ X. Then, by (iv), y := P_{ran M}z = P_{Δ^{⊥}}z ∈ Δ^{⊥}. By (vii), M^{−1}y = Ly + Δ. So M^{†}z = P_{ran M*}M^{−1}P_{ran M}z = P_{ran M*}M^{−1}y = P_{Δ^{⊥}}(Ly + Δ) = P_{Δ^{⊥}}Ly = Ly = (L ∘ P_{Δ^{⊥}})z because ran L ⊆ Δ^{⊥} by (21). Hence (viii) holds. Furthermore, by (iv) and, e.g., [[1], Proposition 3.28(v)], ran L = ran(L ∘ P_{Δ^{⊥}}) = ran M^{†} = ran M* = Δ^{⊥}, and so (v) holds.
(ix): Note that P_{Δ^{⊥}} = Id − P_{Δ} and that P_{Δ} = m^{−1}∑_{j∈I} R^{j}. Hence
Thus, by (viii) and (16),
Rearranging this expression in terms of powers of R and simplifying leads to
Remark 2.3 Suppose that L̃: Δ^{⊥} → X satisfies M ∘ L̃ = Id|_{Δ^{⊥}}. Then
One may show that M^{†} = P_{Δ^{⊥}} ∘ L̃ ∘ P_{Δ^{⊥}} and that P_{Δ^{⊥}} ∘ L̃ = L (see (16)). Concrete choices for L̃ and L are
however, the range of the latter operator is not equal to Δ^{⊥} whenever X ≠ {0}.
Remark 2.4 Denoting the symmetric part of M by M_{+} := (1/2)M + (1/2)M* and defining the quadratic form associated with M by q_{M}: x ↦ (1/2)〈x, Mx〉, we note that [[17], Proposition 2.3] implies that^{c} ran M_{+} = dom q_{M}* = Δ^{⊥}.
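As a numerical sanity check on Theorem 2.2(ix) (an illustration of ours, not part of the paper), the following Python sketch builds M = Id − R for X = ℝ and m = 5 (an arbitrary choice of dimension), forms M^{†} from the closed-form sum, and verifies the Moore–Penrose identities:

```python
m = 5

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(m)]
            for i in range(m)]

def approx_eq(A, B, tol=1e-12):
    return all(abs(A[i][j] - B[i][j]) < tol for i in range(m) for j in range(m))

Id = [[1.0 if i == j else 0.0 for j in range(m)] for i in range(m)]
# cyclic right-shift: (Rx)_1 = x_m and (Rx)_i = x_{i-1} for i >= 2
R = [[1.0 if j == (i - 1) % m else 0.0 for j in range(m)] for i in range(m)]
M = [[Id[i][j] - R[i][j] for j in range(m)] for i in range(m)]

# Theorem 2.2(ix): M^dagger = sum_{k=1}^m ((m - (2k-1))/(2m)) R^{k-1}
Mdag = [[0.0] * m for _ in range(m)]
Rpow = Id
for k in range(1, m + 1):
    c = (m - (2 * k - 1)) / (2 * m)
    Mdag = [[Mdag[i][j] + c * Rpow[i][j] for j in range(m)] for i in range(m)]
    Rpow = matmul(Rpow, R)

# Moore-Penrose identities for the pseudoinverse
assert approx_eq(matmul(matmul(M, Mdag), M), M)
assert approx_eq(matmul(matmul(Mdag, M), Mdag), Mdag)
MMd = matmul(M, Mdag)
assert approx_eq(MMd, [[MMd[j][i] for j in range(m)] for i in range(m)])
```

Since M and M^{†} are both polynomials in the circulant R, the check reduces to an eigenvalue computation with m-th roots of unity, which is exactly why the closed form holds.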
Fact 2.5 (Brezis–Haraux) (See [20] and also, e.g., [[1], Theorem 24.20].) Suppose A and B are monotone operators on X such that A + B is maximally monotone, dom A ⊆ dom B, and B is rectangular. Then int ran(A + B) = int(ran A + ran B) and cl ran(A + B) = cl(ran A + ran B).
Applying the BrezisHaraux result to our given operators A and M, we obtain the following.
Corollary 2.6 The operator A + M is maximally monotone and cl ran(A + M) = cl(Δ^{⊥} + ran A).
Proof. Since each A_{ i } is maximally monotone and recalling Theorem 2.2(i), we see that A and M are maximally monotone. On the other hand, dom M = X. Thus, by the well known sum theorem for maximally monotone operators (see, e.g., [[1], Corollary 24.4(i)]), A + M is maximally monotone. Furthermore, by Theorem 2.2(ii) and (iv), M is rectangular and ran M = Δ^{┴}. The result therefore follows from Fact 2.5.
3 Composition
We now use Corollary 2.6 to study the composition. When m = 2, then Theorem 3.1(v) also follows from [[21], p. 124].
Theorem 3.1 Suppose that (∀i ∈ I) 0 ∈ cl ran(Id − T_{i}). Then the following hold.

(i)
0 ∈ cl ran(A + M).

(ii)
(∀ε > 0) (∃(b, x) ∈ X × X) ‖b‖ ≤ ε and x = T(b + Rx).

(iii)
(∀ε > 0) (∃(c, x) ∈ X × X) ‖c‖ ≤ ε and x = c + T(Rx).

(iv)
(∀ε > 0) (∃x ∈ X) (∀i ∈ I) ‖T_{i−1}···T_{1}x_{m} − T_{i}T_{i−1}···T_{1}x_{m} − x_{i−1} + x_{i}‖ ≤ (2i − 1)ε, where x_{0} = x_{m}.

(v)
(∀ε > 0) (∃x ∈ X) ‖x − T_{m}T_{m−1}···T_{1}x‖ ≤ m²ε.
Proof. (i): The assumptions and (3) imply that (∀i ∈ I) 0 ∈ cl ran A_{i}. Hence, 0 ∈ cl ran A. Obviously, 0 ∈ Δ^{⊥}. It follows that 0 ∈ cl(Δ^{⊥} + ran A). Thus, by Corollary 2.6, 0 ∈ cl ran(A + M).
(ii): Fix ε > 0. In view of (i), there exist x ∈ X and b ∈ X such that ‖b‖ ≤ ε and b ∈ Ax + Mx. Hence b + Rx ∈ (Id + A)x and thus x = J_{A}(b + Rx) = T(b + Rx).
(iii): Let ε > 0. By (ii), there exists (b, x) ∈ X × X such that ‖b‖ ≤ ε and x = T(b + Rx). Set c := x − T(Rx) = T(b + Rx) − T(Rx). Then, since T is nonexpansive, ‖c‖ = ‖T(b + Rx) − T(Rx)‖ ≤ ‖b‖ ≤ ε.
(iv): Take ε > 0. Then, by (iii), there exist x ∈ X and c ∈ X such that ‖c‖ ≤ ε and x = c + T(Rx). Let i ∈ I. Then x_{i} = c_{i} + T_{i}x_{i−1}. Since ‖c_{i}‖ ≤ ‖c‖ ≤ ε and T_{i} is nonexpansive, we have
We thus obtain inductively
Hence,
The conclusion now follows from adding (33) and (34), and recalling the triangle inequality
(v): Let ε > 0. In view of (iv), there exists x ∈ X such that
where x_{0} = x_{m}. Now set (∀i ∈ I) e_{i} := T_{i−1}···T_{1}x_{m} − T_{i}T_{i−1}···T_{1}x_{m} − x_{i−1} + x_{i}. Then (∀i ∈ I) ‖e_{i}‖ ≤ (2i − 1)ε. Set x := x_{m}. Then
This, (35), and the triangle inequality imply that
This completes the proof.
Corollary 3.2 Suppose that (∀i ∈ I) 0 ∈ cl ran(Id − T_{i}). Then 0 ∈ cl ran(Id − T_{m}T_{m−1}···T_{1}).
Proof. This follows from Theorem 3.1(v).
Remark 3.3 The converse implication in Corollary 3.2 fails in general: indeed, consider the case when X ≠ {0}, m = 2, and v ∈ X \ {0}. Now set T_{1}: X → X: x ↦ x + v and T_{2}: X → X: x ↦ x − v. Then 0 ∉ cl ran(Id − T_{1}) = {−v} and 0 ∉ cl ran(Id − T_{2}) = {v}; however, T_{2}T_{1} = Id and cl ran(Id − T_{2}T_{1}) = {0}.
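The translation counterexample can be played out numerically (a sketch of ours, with the illustrative choices X = ℝ and v = 2); note that translations are firmly nonexpansive, since ‖Tx − Ty‖² = 〈x − y, Tx − Ty〉 holds with equality:

```python
v = 2.0  # stands in for a nonzero vector v in X = R (illustrative choice)

T1 = lambda x: x + v   # Id - T1 is constantly -v, so 0 is not in cl ran(Id - T1)
T2 = lambda x: x - v   # Id - T2 is constantly +v

# neither mapping "almost has" fixed points: every point moves by exactly v,
x = 0.0
assert abs(x - T1(x)) == v and abs(x - T2(x)) == v
# yet the composition is the identity, hence trivially asymptotically regular:
for x in (0.0, 1.5, -3.25):
    assert T2(T1(x)) == x
```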
Remark 3.4 Corollary 3.2 is optimal in the sense that even if (∀i ∈ I) we have 0 ∈ ran(Id − T_{i}), we cannot deduce that 0 ∈ ran(Id − T_{m}T_{m−1}···T_{1}): indeed, suppose that X = ℝ² and m = 2. Set C_{1} := epi exp and C_{2} := ℝ × {0}. Suppose further that T_{1} = P_{C₁} and T_{2} = P_{C₂}. Then (∀i ∈ I) 0 ∈ ran(Id − T_{i}); however, 0 ∈ cl ran(Id − T_{2}T_{1}) \ ran(Id − T_{2}T_{1}).
4 Asymptotic regularity
The following notions (taken from Bruck and Reich's seminal article [22]) will be very useful to obtain stronger results.
Definition 4.1 ((strong) nonexpansiveness and asymptotic regularity) Let S: X → X. Then:

(i)
S is nonexpansive if (∀x ∈ X)(∀y ∈ X) ‖Sx − Sy‖ ≤ ‖x − y‖.

(ii)
S is strongly nonexpansive if S is nonexpansive and whenever (x_{n})_{n∈ℕ} and (y_{n})_{n∈ℕ} are sequences in X such that (x_{n} − y_{n})_{n∈ℕ} is bounded and ‖x_{n} − y_{n}‖ − ‖Sx_{n} − Sy_{n}‖ → 0, it follows that (x_{n} − y_{n}) − (Sx_{n} − Sy_{n}) → 0.

(iii)
S is asymptotically regular if (∀x ∈ X) S^{n}x − S^{n+1}x → 0.
The following result illustrates that strongly nonexpansive mappings generalize the notion of a firmly nonexpansive mapping. In addition, the class of strongly nonexpansive mappings is closed under compositions. (In contrast, the composition of two (necessarily firmly nonexpansive) projectors may fail to be firmly nonexpansive.)
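The parenthetical claim, namely that a composition of two firmly nonexpansive projectors can fail to be firmly nonexpansive, can be seen on a concrete instance (our illustrative choice, not taken from the cited works): project onto the x-axis and then onto the diagonal of ℝ², and test the pair x = (1, 0), y = (0, 1):

```python
def P1(p):
    """Projector onto the x-axis in R^2 (firmly nonexpansive)."""
    return (p[0], 0.0)

def P2(p):
    """Projector onto the diagonal {(t, t)} in R^2 (firmly nonexpansive)."""
    t = (p[0] + p[1]) / 2.0
    return (t, t)

def inner(u, v):
    return u[0] * v[0] + u[1] * v[1]

S = lambda p: P2(P1(p))   # composition of the two projectors

x, y = (1.0, 0.0), (0.0, 1.0)
Sx, Sy = S(x), S(y)
d = (Sx[0] - Sy[0], Sx[1] - Sy[1])
lhs = inner(d, d)                           # ||Sx - Sy||^2 = 0.5
rhs = inner((x[0] - y[0], x[1] - y[1]), d)  # <x - y, Sx - Sy> = 0.0
assert lhs > rhs  # firm nonexpansiveness fails for the composition S
```

By Fact 4.2, S is nonetheless strongly nonexpansive, which is the property that survives composition.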
Fact 4.2 (Bruck and Reich) The following hold.

(i)
Every firmly nonexpansive mapping is strongly nonexpansive.

(ii)
The composition of finitely many strongly nonexpansive mappings is also strongly nonexpansive.
Proof. (i): See [[22], Proposition 2.1]. (ii): See [[22], Proposition 1.1].
The sequences of iterates and of differences of iterates have striking convergence properties as we shall see now. In passing, we note that Fact 4.3(i) also appears in [[21], Theorem 3.7(b)] even in certain Banach spaces.
Fact 4.3 (Bruck and Reich) Let S: X → X be strongly nonexpansive and let x ∈ X. Then the following hold.

(i)
The sequence (S^{n}x − S^{n+1}x)_{n∈ℕ} converges strongly to the unique element of least norm in cl ran(Id − S).

(ii)
If Fix S = Ø, then ‖S^{n}x‖ → +∞.

(iii)
If Fix S ≠ Ø, then (S^{n}x)_{n∈ℕ} converges weakly to a fixed point of S.
Proof (i): See [[22], Corollary 1.5]. (ii): See [[22], Corollary 1.4]. (iii): See [[22], Corollary 1.3].
Suppose S: X → X is asymptotically regular. Then, for every x ∈ X, 0 ← S^{n}x − S^{n+1}x = (Id − S)S^{n}x ∈ ran(Id − S) and hence 0 ∈ cl ran(Id − S). The opposite implication fails in general (consider S = −Id), but it is true for strongly nonexpansive mappings. Under the assumption that S is firmly nonexpansive, the following result also follows from [[23], Corollary 2].
Corollary 4.4 Let S: X → X be strongly nonexpansive. Then S is asymptotically regular if and only if 0 ∈ cl ran(Id − S).
Proof. "⇒": Clear. "⇐": Fact 4.3(i).
Corollary 4.5 Set S := T_{m}T_{m−1}···T_{1}. Then S is asymptotically regular if and only if 0 ∈ cl ran(Id − S).
Proof. Since each T_{i} is firmly nonexpansive, it is also strongly nonexpansive by Fact 4.2(i). By Fact 4.2(ii), S is strongly nonexpansive. Now apply Corollary 4.4. Alternatively, 0 ∈ cl ran(Id − S) by Corollary 3.2, and again Corollary 4.4 applies.
We are now ready for our first main result. When m = 2, then the conclusion also follows from [[21], p. 124].
Theorem 4.6 Suppose that each T_{i} is asymptotically regular. Then T_{m}T_{m−1}···T_{1} is asymptotically regular as well.
Proof. Theorem 3.1(v) implies that 0 ∈ cl ran(Id − T_{m}T_{m−1}···T_{1}). The conclusion thus follows from Corollary 4.5.
As an application of Theorem 4.6, we obtain the main result of [17].
Example 4.7 Let C_{1}, ..., C_{m} be nonempty closed convex subsets of X. Then the composition of the corresponding projectors, P_{C_m}P_{C_{m−1}}···P_{C₁}, is asymptotically regular.
Proof. Suppose that (∀i ∈ I) T_{i} = P_{C_i}. For every i ∈ I, the projector P_{C_i} is firmly nonexpansive, hence strongly nonexpansive, and Fix P_{C_i} = C_{i} ≠ Ø; thus P_{C_i} is asymptotically regular by Corollary 4.4. Now apply Theorem 4.6.
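Example 4.7 can be observed numerically (a sketch of ours, not from [17]); we compose the projectors onto two lines through the origin of ℝ², with the angle between them an arbitrary illustrative choice, and watch the differences of iterates decrease to 0:

```python
import math

def proj_line(p, u):
    """Projector onto the line R*u, with u a unit vector (firmly nonexpansive)."""
    t = p[0] * u[0] + p[1] * u[1]
    return (t * u[0], t * u[1])

u1 = (1.0, 0.0)
theta = 1.0                                  # illustrative angle between the lines
u2 = (math.cos(theta), math.sin(theta))
S = lambda p: proj_line(proj_line(p, u1), u2)

p = (5.0, 7.0)
diffs = []
for _ in range(60):
    q = S(p)
    diffs.append(math.hypot(p[0] - q[0], p[1] - q[1]))
    p = q

# ||S^n x - S^{n+1} x|| is nonincreasing (S is nonexpansive) and tends to 0
assert all(b <= a + 1e-15 for a, b in zip(diffs, diffs[1:]))
assert diffs[-1] < 1e-6
```

Here Fix S = {0}, so the iterates themselves also converge, in line with Fact 4.3(iii); asymptotic regularity, however, would persist even for sets with empty intersection.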
5 Convex combination
In this section, we use our fixed weights (λ_{i})_{i∈I} (see (12)) to turn X^{m} into a Hilbert product space different from the space X considered in the previous sections. Specifically, we set
so that ‖x‖² = ∑_{i∈I} λ_{i}‖x_{i}‖². We also set
Fact 5.1 (See [[1], Proposition 28.13].) In the Hilbert product space Y we have P_{Δ} = Q.
Corollary 5.2 In the Hilbert product space Y the operator Q is firmly nonexpansive and strongly nonexpansive. Furthermore, Fix Q = Δ ≠ Ø, 0 ∈ ran(Id − Q), and Q is asymptotically regular.
Proof. By Fact 5.1, the operator Q is equal to the projector P_{Δ} and hence firmly nonexpansive. Now apply Fact 4.2(i) to deduce that Q is strongly nonexpansive. It is clear that Fix Q = Δ and that 0 ∈ ran(Id − Q). Finally, recall Corollary 4.4 to see that Q is asymptotically regular.
Proposition 5.3 In the Hilbert product space Y the operator T is firmly nonexpansive.
Proof. Since each T_{i} is firmly nonexpansive, we have (∀x = (x_{i})_{i∈I} ∈ Y)(∀y = (y_{i})_{i∈I} ∈ Y) ‖T_{i}x_{i} − T_{i}y_{i}‖² ≤ 〈x_{i} − y_{i}, T_{i}x_{i} − T_{i}y_{i}〉; hence ‖Tx − Ty‖² = ∑_{i∈I} λ_{i}‖T_{i}x_{i} − T_{i}y_{i}‖² ≤ ∑_{i∈I} λ_{i}〈x_{i} − y_{i}, T_{i}x_{i} − T_{i}y_{i}〉 = 〈x − y, Tx − Ty〉.
Theorem 5.4 Suppose that (∀i ∈ I) 0 ∈ cl ran(Id − T_{i}). Then the following hold in the Hilbert product space Y.

(i)
0 ∈ cl ran(Id − T).

(ii)
T is asymptotically regular.

(iii)
Q ○ T is asymptotically regular.
Proof. (i): This follows because (∀x = (x_{i})_{i∈I} ∈ Y) ‖x − Tx‖² = ∑_{i∈I} λ_{i}‖x_{i} − T_{i}x_{i}‖².
(ii): Combine Fact 4.2(i) with Corollary 4.4.
(iii): On the one hand, Q is firmly nonexpansive and asymptotically regular by Corollary 5.2. On the other hand, T is firmly nonexpansive and asymptotically regular by Proposition 5.3 and Theorem 5.4(ii). Altogether, the result follows from Theorem 4.6.
We are now ready for our second main result, which concerns convex combinations of firmly nonexpansive mappings. For further results in this direction, namely convex combinations of strongly nonexpansive mappings in Banach spaces, we refer the reader also to [24].
Theorem 5.5 Suppose that each T_{ i } is asymptotically regular. Then ∑_{i∈I}λ _{ i }T_{ i } is asymptotically regular as well.
Proof. Set S := ∑_{i∈I} λ_{i}T_{i}. Fix x_{0} ∈ X and set (∀n ∈ ℕ) x_{n+1} := Sx_{n}. Set x_{0} := (x_{0})_{i∈I} ∈ X^{m} and (∀n ∈ ℕ) x_{n+1} := (Q ∘ T)x_{n}. Then (∀n ∈ ℕ) x_{n} = (x_{n})_{i∈I}. Now Q ∘ T is asymptotically regular by Theorem 5.4(iii); hence, x_{n} − x_{n+1} = (x_{n} − x_{n+1})_{i∈I} → 0. Thus x_{n} − x_{n+1} → 0 and therefore S is asymptotically regular.
Remark 5.6 Theorem 5.5 extends [[18], Theorem 4.11] from Euclidean to Hilbert space. One may also prove Theorem 5.5 along the lines of the article [18]; however, that route takes longer.
Remark 5.7 Similarly to Remark 3.4, one cannot deduce that if each T_{ i } has fixed points, then ∑_{i∈I}λ _{ i }T_{ i } has fixed points as well: indeed, consider the setting described in Remark 3.4 for an example.
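Theorem 5.5 can also be observed numerically (our illustration, not part of the paper): take the projectors onto two disjoint closed balls of ℝ², each asymptotically regular since it has fixed points, and iterate their convex combination; the balls, weights, and starting point below are arbitrary illustrative choices:

```python
import math

def proj_ball(p, center, r=1.0):
    """Projector onto the closed ball B(center, r) (firmly nonexpansive)."""
    dx, dy = p[0] - center[0], p[1] - center[1]
    n = math.hypot(dx, dy)
    if n <= r:
        return p
    return (center[0] + r * dx / n, center[1] + r * dy / n)

lam = (0.3, 0.7)                     # strictly positive convex coefficients
c1, c2 = (0.0, 0.0), (4.0, 0.0)      # centers of two disjoint unit balls

def S(p):
    """Convex combination of the two (asymptotically regular) projectors."""
    q1, q2 = proj_ball(p, c1), proj_ball(p, c2)
    return (lam[0] * q1[0] + lam[1] * q2[0],
            lam[0] * q1[1] + lam[1] * q2[1])

p = (10.0, -3.0)
for _ in range(200):
    p = S(p)

# ||S^n x - S^{n+1} x|| -> 0: the convex combination is asymptotically regular
q = S(p)
assert math.hypot(p[0] - q[0], p[1] - q[1]) < 1e-8
```

In this particular configuration S even has a fixed point on the segment between the balls, so the iterates converge; asymptotic regularity itself does not require that.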
We conclude this article by showing that we truly had to work in Y and not in X; indeed, viewed in X, the operator Q is generally not even nonexpansive.
Theorem 5.8 Suppose that X ≠ {0}. Then the following are equivalent in the Hilbert product space X.

(i)
(∀i ∈ I) λ _{ i } = 1/m.

(ii)
Q coincides with the projector P _{ Δ }.

(iii)
Q is firmly nonexpansive.

(iv)
Q is nonexpansive.
Proof. "(i)⇒(ii)": [[1], Proposition 25.4(iii)]. "(ii)⇒(iii)": Clear. "(iii)⇒(iv)": Clear. "(iv)⇒(i)": Take e ∈ X such that e = 1. Set x : = (λ _{ i }e) _{ i∈I } and $y:={\sum}_{i\in I}{\lambda}_{i}^{2}e$. Then Qx = (y)_{i∈I}. We compute ${\u2225\mathbf{Q}x\u2225}^{2}=m{\u2225y\u2225}^{2}=m{\left({\sum}_{i\in I}{\lambda}_{i}^{2}\right)}^{2}$ and ${\u2225\mathbf{x}\u2225}^{2}={\sum}_{i\in I}{\lambda}_{i}^{2}$. Since Q is nonexpansive, we must have that Qx^{2} ≤ x^{2}, which is equivalent to
and to
On the other hand, applying the Cauchy–Schwarz inequality to the vectors (λ_{i})_{i∈I} and (1)_{i∈I} in ℝ^{m} yields
In view of (42), the Cauchy–Schwarz inequality (43) is actually an equality, which implies that (λ_{i})_{i∈I} is a multiple of (1)_{i∈I}. We deduce that (∀i ∈ I) λ_{i} = 1/m.
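The witness from the proof of "(iv)⇒(i)" can be evaluated directly (a sketch of ours, with the illustrative choices m = 2, X = ℝ, and unequal weights (0.9, 0.1)):

```python
lam = (0.9, 0.1)   # unequal strictly positive weights, m = 2, X = R

def Q(x):
    """Each coordinate of Qx is the weighted average sum_j lam_j x_j."""
    a = lam[0] * x[0] + lam[1] * x[1]
    return (a, a)

def norm_sq(z):
    """Standard (unweighted) squared norm of the product space X^2."""
    return z[0] ** 2 + z[1] ** 2

# the proof's witness: x = (lam_1 e, lam_2 e) with ||e|| = 1, here e = 1
x = (lam[0], lam[1])
Qx = Q(x)
assert norm_sq(Qx) > norm_sq(x)  # ||Qx|| > ||x||, so Q is not nonexpansive in X^2
```

Since Q0 = 0, the inequality ‖Qx‖ > ‖x‖ already rules out nonexpansiveness of Q in the unweighted product space, whereas in the weighted space Y this same Q is the projector P_{Δ}.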
Endnotes
^{a}ker B = B^{−1}(0) = {x ∈ X | Bx = 0} denotes the kernel (or nullspace) of B.
^{b}Here and elsewhere we write S^{n} for the n-fold composition of an operator S.
^{c}Recall that the Fenchel conjugate of a function f defined on X is given by f*: x* ↦ sup_{x∈X}(〈x, x*〉 − f(x)).
References
 1.
Bauschke HH, Combettes PL: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, New York; 2011.
 2.
Goebel K, Kirk WA: Topics in Metric Fixed Point Theory. Cambridge University Press, Cambridge; 1990.
 3.
Goebel K, Reich S: Uniform Convexity, Hyperbolic Geometry, and Nonexpansive Mappings. Marcel Dekker, New York; 1984.
 4.
Minty GJ: Monotone (nonlinear) operators in Hilbert spaces. Duke Math J 1962, 29: 341–346. 10.1215/S0012-7094-62-02933-2
 5.
Eckstein J, Bertsekas DP: On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators. Math Program (Ser A) 1992, 55: 293–318. 10.1007/BF01581204
 6.
Borwein JM, Vanderwerff JD: Convex Functions. Cambridge University Press, Cambridge; 2010.
 7.
Brézis H: Opérateurs Maximaux Monotones et Semi-Groupes de Contractions dans les Espaces de Hilbert. North-Holland/Elsevier, New York; 1973.
 8.
Burachik RS, Iusem AN: Set-Valued Mappings and Enlargements of Monotone Operators. Springer-Verlag, New York; 2008.
 9.
Rockafellar RT: Convex Analysis. Princeton University Press, Princeton; 1970.
 10.
Rockafellar RT, Wets RJ-B: Variational Analysis, corrected 3rd printing. Springer-Verlag, Berlin; 2009.
 11.
Simons S: Minimax and Monotonicity. Springer-Verlag, Berlin; 1998.
 12.
Simons S: From Hahn-Banach to Monotonicity. Springer-Verlag, New York; 2008.
 13.
Zălinescu C: Convex Analysis in General Vector Spaces. World Scientific Publishing, River Edge, NJ; 2002.
 14.
Zeidler E: Nonlinear Functional Analysis and Its Applications II/A: Linear Monotone Operators. Springer-Verlag, New York; 1990.
 15.
Zeidler E: Nonlinear Functional Analysis and Its Applications II/B: Nonlinear Monotone Operators. Springer-Verlag, New York; 1990.
 16.
Zeidler E: Nonlinear Functional Analysis and Its Applications I: Fixed Point Theorems. Springer-Verlag, New York; 1993.
 17.
Bauschke HH: The composition of finitely many projections onto closed convex sets in Hilbert space is asymptotically regular. Proc Am Math Soc 2003, 131: 141–146. 10.1090/S0002-9939-02-06528-0
 18.
Bauschke HH, Moffat SM, Wang X: Near equality, near convexity, sums of maximally monotone operators, and averages of firmly nonexpansive mappings. Mathematical Programming, in press. http://arxiv.org/pdf/1105.0029v1
 19.
Groetsch CW: Generalized Inverses of Linear Operators. Marcel Dekker, New York; 1977.
 20.
Brézis H, Haraux A: Image d'une somme d'opérateurs monotones et applications. Israel J Math 1976, 23: 165–186. 10.1007/BF02756796
 21.
Reich S: On the asymptotic behavior of nonlinear semigroups and the range of accretive operators. J Math Anal Appl 1981, 79: 113–126. 10.1016/0022-247X(81)90013-5
 22.
Bruck RE, Reich S: Nonexpansive projections and resolvents of accretive operators in Banach spaces. Houston J Math 1977, 3: 459–470.
 23.
Reich S, Shafrir I: The asymptotic behavior of firmly nonexpansive mappings. Proc Am Math Soc 1987, 101: 246–250. 10.1090/S0002-9939-1987-0902536-7
 24.
Reich S: A limit theorem for projections. Linear and Multilinear Algebra 1983, 13: 281–290. 10.1080/03081088308817526
Acknowledgements
The authors thank Simeon Reich, the editor, and the referees for constructive and pertinent comments. Part of this research was initiated during a research visit of VMM at the Kelowna campus of UBC in Fall 2009. HHB was partially supported by the Natural Sciences and Engineering Research Council of Canada and by the Canada Research Chair Program. VMM was partially supported by DGES, Grant BFM20091096C0201 and Junta de Andalucia, Grant FQM127. SMM was partially supported by the Natural Sciences and Engineering Research Council of Canada. XW was partially supported by the Natural Sciences and Engineering Research Council of Canada.
Additional information
Competing interests
The authors declare that they have no competing interests.
Authors' contributions
All authors contributed equally to this research. All authors read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
About this article
Cite this article
Bauschke, H.H., MartínMárquez, V., Moffat, S.M. et al. Compositions and convex combinations of asymptotically regular firmly nonexpansive mappings are also asymptotically regular. Fixed Point Theory Appl 2012, 53 (2012). https://doi.org/10.1186/16871812201253
Keywords
 asymptotic regularity
 firmly nonexpansive mapping
 Hilbert space
 maximally monotone operator
 nonexpansive mapping
 resolvent
 strongly nonexpansive mapping