Consider the variational inequality: $$\text{Find } \hat x \in K \text{ such that } \beta(\hat x, x - \hat x) \geq \lambda(x - \hat x) \text{ for all } x \in K,$$ and its discretization: $$\text{Find } x_h \in K_h \text{ such that } \beta(x_h, x - x_h) \geq \lambda(x - x_h) \text{ for all } x \in K_h.$$ Here, in a real reflexive separable Banach space $X$, $\beta$ is a continuous bilinear form on $X \times X$ that is nonnegative on the diagonal, $\lambda \in X^*$ is a continuous linear form, and $K \subseteq X$, $K_h \subseteq X_h$ are closed convex nonvoid sets, where the family $\{X_h\}_{h>0}$ of subspaces of $X$ describes a discretization scheme. Then, under Glowinski's realistic assumptions on the approximation of $K$ by $\{K_h\}_{h>0}$ (not requiring that $K_h \subseteq K$), we prove norm convergence, $\lim_{h \to 0} \|x_h - \hat x\| = 0$, provided the solution $\hat x$ is unique and $\beta$ satisfies a Gårding inequality: there exist a compact operator $T_1 : X \to X^*$ and a positive constant $\alpha$ such that $\beta(x, x) + \langle T_1 x, x \rangle \geq \alpha \|x\|^2$ for all $x \in X$.
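The discretized problem can be illustrated by a minimal finite-dimensional sketch (hypothetical, not taken from the paper): take $X = \mathbb{R}^2$, $\beta(x, y) = x^{\mathsf T} A y$ with $A$ symmetric positive definite (so the Gårding inequality holds trivially with $T_1 = 0$), $\lambda(x) = b^{\mathsf T} x$, and $K = K_h$ the nonnegative orthant. A projected-gradient iteration then converges to the unique solution of the variational inequality, which in this setting is characterized by the complementarity conditions $\hat x \geq 0$, $A\hat x - b \geq 0$, $\hat x \cdot (A\hat x - b) = 0$.

```python
import numpy as np

# Hypothetical finite-dimensional example: beta(x, y) = x^T A y with A
# symmetric positive definite, lam(x) = b^T x, K = {x : x >= 0}.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

# Projected-gradient iteration: x <- P_K(x - t * (A x - b)), where P_K is
# the (componentwise) projection onto the nonnegative orthant.
x = np.zeros(2)
t = 0.3  # step size below 2 / lambda_max(A) = 2/3, so the map contracts
for _ in range(500):
    x = np.maximum(x - t * (A @ x - b), 0.0)

# A fixed point solves the VI: (A x - b)^T (y - x) >= 0 for all y in K,
# equivalently: x >= 0, residual r = A x - b >= 0, and x . r = 0.
r = A @ x - b
print(x)  # approximate solution x_hat
print(r)  # residual A x_hat - b, nonnegative at the solution
```

Here the unconstrained minimizer $A^{-1}b = (1, -1)$ violates the constraint, so the VI solution sits on the boundary of $K$ at $(0.5, 0)$, where the residual is nonnegative and complementarity holds.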