A Generalized Hybrid Steepest-Descent Method for Variational Inequalities in Banach Spaces

Abstract

The hybrid steepest-descent method introduced by Yamada (2001) is an algorithmic solution to the variational inequality problem over the fixed point set of a nonlinear mapping, and it applies to a broad range of convexly constrained nonlinear inverse problems in real Hilbert spaces. Lehdili and Moudafi (1996) introduced a new prox-Tikhonov regularization method for the proximal point algorithm that generates a strongly convergent sequence, and they established its convergence by the technique of variational distance in Hilbert spaces. In this paper, motivated by Yamada's hybrid steepest-descent algorithm and by Lehdili and Moudafi's algorithm, we propose a generalized hybrid steepest-descent algorithm for computing solutions of the variational inequality problem over the common fixed point set of a sequence of nonexpansive-type mappings in the framework of Banach spaces. Strong convergence of the proposed algorithm to a solution is guaranteed under suitable assumptions. Our strong convergence theorems extend and improve certain corresponding results in the recent literature.
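To make the Hilbert-space scheme concrete, the following is a minimal toy sketch (not the generalized Banach-space algorithm of this paper) of Yamada's hybrid steepest-descent iteration x_{n+1} = T(x_n) − λ_n μ F(T(x_n)), with λ_n → 0 and Σ λ_n = ∞. All concrete choices here are illustrative assumptions: T is taken to be the nonexpansive projection onto the box [0, 1]², so Fix(T) is the box itself, and F(x) = x − b is the gradient of ½‖x − b‖², which is strongly monotone and Lipschitz with η = L = 1, so any μ in (0, 2) is admissible. The variational-inequality solution over the box is then the clipped point x* = (1.0, 0.5).

```python
def T(x):
    """Projection onto the box [0, 1]^2: nonexpansive, with Fix(T) = the box."""
    return [min(max(xi, 0.0), 1.0) for xi in x]

def F(x, b=(2.0, 0.5)):
    """F(x) = x - b, the gradient of 0.5 * ||x - b||^2 (illustrative choice)."""
    return [xi - bi for xi, bi in zip(x, b)]

def hybrid_steepest_descent(x0, mu=1.0, iters=2000):
    """Yamada-type iteration x_{n+1} = T(x_n) - lam_n * mu * F(T(x_n))."""
    x = list(x0)
    for n in range(iters):
        lam = 1.0 / (n + 1)  # lam_n -> 0 and sum(lam_n) diverges
        y = T(x)
        x = [yi - lam * mu * gi for yi, gi in zip(y, F(y))]
    return x

x = hybrid_steepest_descent([0.0, 0.0])
# x approaches the VI solution (1.0, 0.5) as the step sizes vanish
```

Because λ_n → 0 while Σ λ_n = ∞, the "descent" perturbation vanishes asymptotically and the iterates are driven into Fix(T), converging to the unique solution of the variational inequality over that set.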

Publisher note

To access the full article, please see PDF.

Author information

Corresponding author

Correspondence to N. C. Wong.

About this article

Cite this article

Sahu, D.R., Wong, N.C. & Yao, J.C. A Generalized Hybrid Steepest-Descent Method for Variational Inequalities in Banach Spaces. Fixed Point Theory Appl 2011, 754702 (2011). https://doi.org/10.1155/2011/754702

Keywords

  • Hilbert Space
  • Banach Space
  • Variational Inequality
  • Strong Convergence
  • Regularization Method