Recently Y. Meyer derived a characterization of the minimizer of the Rudin-Osher-Fatemi functional in a functional analytical framework. In statistics, the discrete version of this functional is used to analyze one-dimensional data and belongs to the class of nonparametric regression models. In this work we generalize the functional analytical results of Meyer and apply them to a class of regression models, such as quantile, robust, and logistic regression, for the analysis of multi-dimensional data. The characterization of Y. Meyer and our generalization are based on G-norm properties of the data and the minimizer. A geometric point of view of regression minimization is provided.

The Rudin-Osher-Fatemi (ROF) model consists in minimizing the functional
$$\frac{1}{2}\int (u - f)^2 \, dx + \alpha \int |Du|,$$
where $\int |Du|$ denotes the total variation semi-norm of $u$ and $\alpha > 0$. The minimizer is called the bounded variation regularized solution. The taut-string algorithm consists in finding a string of minimal length in a tube (with radius $\alpha$) around the primitive of $f$. The differentiated string is the taut-string reconstruction and corresponds to the minimizer of the ROF model. Generalizing these ideas to higher dimensions is complicated by the fact that there is no obvious analog of primitives in higher space dimensions. We overcome this difficulty by solving Laplace's equation with right-hand side $f$ (i.e., integrating twice) and differentiating. The tube with radius $\alpha$ around the derivative of the potential specifies all functions $u$ which satisfy $\|u - f\|_G \leq \alpha$ (see also (21)). In this paper we show that the bounded variation regularized solutions (in any number of space dimensions) are contained in a tube of radius $\alpha$. For several other regression models in statistics, such as robust, quantile, and logistic regression (reformulated in a Banach space setting for analyzing multi-dimensional data) the
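As a concrete one-dimensional illustration of the tube property, the discrete ROF minimizer can be computed numerically, here with Chambolle's dual projection algorithm (a standard solver, not the taut-string construction itself; the function name `tv_denoise_1d` and all parameters are ours). By the optimality conditions of the discrete problem, the partial sums of $u - f$ (the discrete primitive) stay within the tube of radius $\alpha$:

```python
import numpy as np

def tv_denoise_1d(f, alpha, n_iter=2000, tau=0.25):
    """Minimize 0.5*||u - f||^2 + alpha * sum|u_{i+1} - u_i|
    via Chambolle's dual projection iteration (1D version).
    tau <= 1/4 guarantees convergence in one dimension."""
    p = np.zeros(len(f) - 1)  # dual variable, one value per "edge"
    for _ in range(n_iter):
        # discrete divergence of p, padded so that div: edges -> nodes
        div_p = np.concatenate([[p[0]], np.diff(p), [-p[-1]]])
        g = np.diff(div_p - f / alpha)  # forward gradient on edges
        p = (p + tau * g) / (1.0 + tau * np.abs(g))  # keeps |p| <= 1
    div_p = np.concatenate([[p[0]], np.diff(p), [-p[-1]]])
    return f - alpha * div_p

# demo on a noisy step signal (illustrative data, chosen by us)
rng = np.random.default_rng(0)
f = np.concatenate([np.zeros(25), np.ones(25)]) + 0.1 * rng.standard_normal(50)
alpha = 0.5
u = tv_denoise_1d(f, alpha)
# tube property: the primitive of u - f never leaves [-alpha, alpha]
tube = np.max(np.abs(np.cumsum(u - f)))
```

Because the iteration keeps $|p_i| \leq 1$ and the partial sums of $u - f$ equal $-\alpha p_i$, the quantity `tube` is bounded by `alpha` at every iterate, mirroring the continuous statement $\|u - f\|_G \leq \alpha$; the total sum of $u - f$ vanishes, so the mean of the data is preserved.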