We present a formalism that allows us to generalize several stability theorems (see Chua & Roska, 1990; Chua & Wu, 1992; Gilli, 1993; Forti, 2002) to Multi-Layer Cellular Neural/Nonlinear Networks (MLCNNs) that were previously established for Single-Layer Cellular Neural/Nonlinear Networks (CNNs). The theorems were selected with special regard to their usefulness in engineering applications. Hence, in contrast to many works on the stability of recurrent neural networks, the criteria of the new theorems are easy to verify directly on the template values. We prove six new theorems on 2-Layer CNNs (2LCNNs) covering the symmetric, τ-symmetric, nonsymmetric, τ-nonsymmetric, and sign-symmetric cases. Furthermore, we prove a theorem, related to the sign-symmetric case, for MLCNNs with arbitrary template size and an arbitrary number of layers, and state a conjecture for the one-dimensional, two-layer, nonreciprocal case.