The paper considers a class of discrete-time cellular neural networks (DT-CNNs) obtained by applying Euler's discretization scheme to standard CNNs. Let $A$ be the DT-CNN interconnection matrix, which is defined by the feedback cloning template. The paper shows that a DT-CNN is convergent, i.e. each solution tends to an equilibrium point, when $A$ is symmetric and, in the case where $A - E_n$ is not positive semidefinite, the step size of Euler's discretization scheme does not exceed a given bound ($E_n$ is the $n \times n$ unit matrix). It is shown that two relevant properties hold as a consequence of the local and space-invariant interconnecting structure of a DT-CNN, namely: (1) the bound on the step size can be easily estimated via the elements of the DT-CNN feedback cloning template only; (2) the bound is independent of the DT-CNN dimension. These two properties make DT-CNNs very effective for computer simulations and for practical applications to high-dimensional processing tasks. The results are proved via a Lyapunov approach and LaSalle's Invariance Principle, in combination with some fundamental inequalities satisfied by the projection operator onto a convex set. The results are compared with previous ones in the literature on the convergence of DT-CNNs, as well as with those obtained for related neural network models such as the Brain-State-in-a-Box model. Finally, the convergence results are illustrated via the application to some relevant 2D and 1D DT-CNNs for image processing tasks.
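To make the scheme concrete, the following is a minimal Python sketch of the Euler-discretized DT-CNN iteration together with a template-only, dimension-independent step-size estimate. The function names, the example template, the zero boundary condition, and the Gershgorin-type estimate `2 / (1 - a00 + sum of |off-center entries|)` are illustrative assumptions consistent with properties (1) and (2) above, not the exact bound stated in the paper.

```python
# Hedged sketch of an Euler-discretized DT-CNN; the step-size estimate below
# is a Gershgorin-type assumption, not the paper's exact bound.
import numpy as np
from scipy.ndimage import correlate

def sat(x):
    """Standard CNN output nonlinearity: projection of x onto the hypercube [-1, 1]^n."""
    return np.clip(x, -1.0, 1.0)

def step_size_bound(A_template):
    """Dimension-independent estimate of the admissible Euler step size.

    In the space-invariant interconnection matrix A, each row carries the
    center template element a00 on the diagonal and the remaining template
    entries off the diagonal.  By Gershgorin's theorem,
    lambda_min(A) >= a00 - sum of |off-center entries|, so A - E_n is
    (estimated) positive semidefinite whenever that lower bound is >= 1,
    and any step size is then admissible.  Otherwise we return the hedged
    estimate tau < 2 / (1 - lambda_min_estimate).
    """
    ci, cj = A_template.shape[0] // 2, A_template.shape[1] // 2
    center = A_template[ci, cj]
    off = np.abs(A_template).sum() - abs(center)
    lam_min_est = center - off          # lower bound on lambda_min(A)
    if lam_min_est >= 1.0:              # A - E_n estimated positive semidefinite
        return np.inf
    return 2.0 / (1.0 - lam_min_est)

def dtcnn_run(x0, u, A_template, B_template, bias, tau, n_steps=200):
    """Iterate x(k+1) = x(k) + tau * (-x(k) + A*y(k) + B*u + I), with y = sat(x).

    Templates are applied by 2D cross-correlation with zero boundary
    conditions (an assumption of this sketch).
    """
    const = correlate(u, B_template, mode="constant") + bias  # input term, fixed
    x = x0.copy()
    for _ in range(n_steps):
        y = sat(x)
        x = x + tau * (-x + correlate(y, A_template, mode="constant") + const)
    return sat(x)

if __name__ == "__main__":
    # Illustrative symmetric 3x3 feedback template (hypothetical values).
    A = np.array([[0.0, 1.0, 0.0],
                  [1.0, 2.0, 1.0],
                  [0.0, 1.0, 0.0]])
    B = np.zeros((3, 3)); B[1, 1] = 1.0
    tau_max = step_size_bound(A)        # = 2 / (1 - (2 - 4)) = 2/3 here
    rng = np.random.default_rng(0)
    u = np.sign(rng.standard_normal((64, 64)))   # toy binary input image
    y = dtcnn_run(np.zeros_like(u), u, A, B, bias=0.0, tau=0.9 * tau_max)
```

Because the estimate depends only on the 3 x 3 template entries, `step_size_bound` returns the same value whether the network processes a 64 x 64 or a 4096 x 4096 image, which illustrates why a template-only, dimension-independent bound is attractive for high-dimensional processing tasks.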