Domain decomposition methods are successful and highly scalable parallel iterative solution methods for discretized partial differential equations. Nevertheless, for many classes of problems, for example, elliptic partial differential equations with arbitrary coefficient distributions, adaptive coarse spaces are necessary to obtain robustness, that is, to guarantee reliable and fast convergence. Adaptive coarse spaces are usually computed by solving many localized eigenvalue problems related to the edges or faces of the domain decomposition, which makes the setup of the domain decomposition preconditioner or system operator computationally expensive. In earlier work, to which the authors have contributed, a deep learning based classification model was used to identify the critical edges or faces where eigenvalue problems have to be solved. In the present paper, we suggest directly learning the adaptive constraints with a deep feedforward neural network regression model, thus completely skipping the computationally most expensive part of the setup, namely the solution of the local eigenvalue problems. We consider a specific adaptive FETI-DP (Finite Element Tearing and Interconnecting – Dual-Primal) approach and concentrate on stationary diffusion problems in two dimensions with arbitrary coefficient functions with large jumps. As input for the neural network, we use an image representation of the coefficient function that resolves the structure of the coefficient distribution but is not necessarily identical to the discretization of the partial differential equation. Therefore, our approach is independent of the finite element mesh and can, in principle, be easily extended to other adaptive coarse spaces, problems, and domain decomposition methods. We demonstrate the robustness of our method for different problems and the generalization properties of our trained neural networks by considering coefficient distributions not contained in the training set.
We also combine the learned constraints with computationally cheap frugal constraints to further improve our approach.
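To make the regression idea concrete, the following is a minimal sketch of the kind of mapping described above: a small feedforward network taking a flattened image of the coefficient function around one edge and returning the entries of one adaptive edge constraint vector. The image resolution, layer widths, number of output degrees of freedom, and all function names here are illustrative assumptions, not the architecture or interface used in the paper, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

IMG_SIZE = 32 * 32   # assumed resolution of the coefficient image per edge
N_CONSTRAINT = 31    # assumed number of degrees of freedom on the edge


def init_layer(n_in, n_out):
    """He-style random initialization for one dense layer (untrained sketch)."""
    return rng.normal(0.0, np.sqrt(2.0 / n_in), (n_in, n_out)), np.zeros(n_out)


W1, b1 = init_layer(IMG_SIZE, 128)
W2, b2 = init_layer(128, 128)
W3, b3 = init_layer(128, N_CONSTRAINT)


def predict_constraint(coeff_image):
    """Map a (32, 32) coefficient image to one constraint vector.

    The constraint would replace the one otherwise obtained from a
    local eigenvalue problem in the adaptive FETI-DP setup.
    """
    x = coeff_image.reshape(-1)
    h1 = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden layer
    h2 = np.maximum(h1 @ W2 + b2, 0.0)  # ReLU hidden layer
    return h2 @ W3 + b3                 # linear regression output


# Example input: a binary image marking a high-coefficient inclusion,
# e.g., a jump between coefficient values 1 and 1e6 encoded as 0/1.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
c = predict_constraint(img)
print(c.shape)  # one constraint vector with N_CONSTRAINT entries
```

In a full pipeline, one such prediction would be made per edge of the domain decomposition, and the predicted vectors would be enforced as coarse constraints in the FETI-DP system.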