The aim of this paper is twofold. First, we show that a certain concatenation of a proximity operator with an affine operator is again a proximity operator on a suitable Hilbert space. Second, we use our findings to establish so-called proximal neural networks (PNNs) and stable tight frame proximal neural networks. Let $\mathcal{H}$ and $\mathcal{K}$ be real Hilbert spaces, $b \in \mathcal{K}$, and $T \in \mathcal{B}(\mathcal{H},\mathcal{K})$ a linear operator with closed range and Moore–Penrose inverse $T^\dagger$. Based on the well-known characterization of proximity operators by Moreau, we prove that for any proximity operator $\mathrm{Prox}\colon \mathcal{K} \to \mathcal{K}$ the operator $T^\dagger \, \mathrm{Prox}(T \cdot + b)$ is a proximity operator on $\mathcal{H}$ equipped with a suitable norm. In particular, it follows for the frequently applied soft shrinkage operator $\mathrm{Prox} = S_\lambda \colon \ell_2 \to \ell_2$ and any frame analysis operator $T\colon \mathcal{H} \to \ell_2$ that the frame shrinkage operator $T^\dagger \, S_\lambda \, T$ is a proximity operator on a suitable Hilbert space. The concatenation of proximity operators on $\mathbb{R}^d$ equipped with different norms establishes a PNN. If the network arises from tight frame analysis or synthesis operators, then it forms an averaged operator. In particular, it has Lipschitz constant 1 and belongs to the class of so-called Lipschitz networks, which were recently applied to defend against adversarial attacks. Moreover, due to their averaging property, PNNs can be used within so-called Plug-and-Play algorithms with convergence guarantees. In the case of Parseval frames, we call the networks Parseval proximal neural networks (PPNNs). Then the involved linear operators lie in a Stiefel manifold, and corresponding minimization methods can be applied for the training of such networks. Finally, some proof-of-concept examples demonstrate the performance of PPNNs.
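To make the construction concrete, the following is a minimal NumPy sketch (not taken from the paper; the function names `soft_shrink` and `ppnn_layer` and all sizes are illustrative) of a single PPNN layer $x \mapsto T^{\mathrm{T}} S_\lambda(Tx + b)$. It assumes a Parseval frame analysis operator $T$ with orthonormal columns, i.e. $T^{\mathrm{T}} T = I$, so that $T^\dagger = T^{\mathrm{T}}$ and the layer is averaged, hence nonexpansive.

```python
import numpy as np

def soft_shrink(x, lam):
    """Componentwise soft shrinkage S_lambda, the proximity operator
    of lam * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ppnn_layer(x, T, b, lam):
    """One (illustrative) PPNN layer x -> T^T S_lambda(T x + b).
    For a Parseval frame analysis operator T (T^T T = I) the
    Moore-Penrose inverse is T^dagger = T^T, so this map is a
    proximity operator and, in particular, has Lipschitz constant 1."""
    return T.T @ soft_shrink(T @ x + b, lam)

# Toy setup (sizes arbitrary): a random matrix with orthonormal
# columns serves as a Parseval frame analysis operator of R^d.
rng = np.random.default_rng(0)
d, n = 8, 16
T, _ = np.linalg.qr(rng.standard_normal((n, d)))  # T^T T = I_d
b = rng.standard_normal(n)

# Empirical check of nonexpansiveness: ||f(x) - f(y)|| <= ||x - y||.
x, y = rng.standard_normal(d), rng.standard_normal(d)
fx, fy = ppnn_layer(x, T, b, 0.1), ppnn_layer(y, T, b, 0.1)
print(np.linalg.norm(fx - fy) <= np.linalg.norm(x - y))  # True
```

Since the constraint $T^{\mathrm{T}} T = I$ is exactly membership in a Stiefel manifold, training that keeps $T$ on this manifold (as the abstract indicates) preserves the Lipschitz-1 guarantee throughout optimization.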