The paper introduces a novel method to efficiently tune the hyperparameters of the convolutional layers of a deep convolutional neural network (CNN), together with an approach to rank the importance of the involved hyperparameters. Evolutionary algorithms (EA) offer a flexible solution to this twofold problem, while the expensive simulation of the deep learner for each generated configuration is avoided through surrogate modelling. Three models are used and evaluated as surrogates: random forests (RF), support vector machines (SVM) and Kriging. Sample convolutional configurations are generated by Latin hypercube sampling and paired with accuracy outcomes computed from real CNN runs. For the hyperparameter estimation task, the fitness of an EA individual is derived from the surrogate's estimate of the CNN accuracy for the corresponding hyperparameter values. With respect to the ranking and variable selection task, RF performs implicit variable selection, the SVM can be straightforwardly supported by a second EA, and Kriging provides a ranking based on its θ values. The estimated accuracy of the found hyperparameter values is compared with the true validation accuracy, and the values are then used for prediction on the test cases. The variable rankings produced by the three surrogate models are compared, and the influence of the variables is further revealed by response surface methodology. The proposed EA–surrogate approaches are experimentally tested on a real-world scenario of histopathological image interpretation for colorectal cancer diagnosis.
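The pipeline described above — Latin hypercube sampling of configurations, accuracy outcomes from expensive real runs, a surrogate fitted on those pairs, and an EA whose fitness is the surrogate's accuracy estimate — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the CNN training run is replaced by a hypothetical toy accuracy function (`expensive_cnn_accuracy`), the hyperparameters are three normalised dummy variables, and the RF surrogate stands in for the three surrogate options; its `feature_importances_` attribute illustrates the implicit variable ranking that RF provides.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

def latin_hypercube(n, d, rng):
    # One sample per stratum [i/n, (i+1)/n) in each dimension,
    # with independently shuffled strata per dimension.
    strata = np.array([rng.permutation(n) for _ in range(d)]).T
    return (strata + rng.random((n, d))) / n

# Hypothetical stand-in for an expensive CNN training run: maps three
# normalised hyperparameters to a validation accuracy. In the paper this
# step is a real CNN run on the histopathological images.
def expensive_cnn_accuracy(x):
    return 0.9 - 0.3 * (x[0] - 0.3) ** 2 - 0.2 * (x[1] - 0.7) ** 2 - 0.05 * x[2]

# 1) Generate sample configurations by Latin hypercube sampling
#    and attach accuracy outcomes from the "real" runs.
X = latin_hypercube(30, 3, rng)
y = np.array([expensive_cnn_accuracy(x) for x in X])

# 2) Fit the surrogate (here RF) on the (configuration, accuracy) pairs.
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# 3) Simple (mu + lambda) EA whose fitness is the surrogate's
#    CNN-accuracy estimate, so no further expensive runs are needed.
pop = rng.random((20, 3))
for _ in range(40):
    children = np.clip(pop + rng.normal(0.0, 0.1, pop.shape), 0.0, 1.0)
    both = np.vstack([pop, children])
    fitness = surrogate.predict(both)
    pop = both[np.argsort(fitness)[::-1][:20]]  # keep the 20 fittest

best = pop[0]
print("best configuration (estimated):", best.round(3))
print("surrogate accuracy estimate:", surrogate.predict([best])[0])
print("true accuracy of that configuration:", expensive_cnn_accuracy(best))
# RF's implicit variable ranking, analogous to Kriging's theta-based ranking:
print("RF variable importances:", surrogate.feature_importances_.round(3))
```

Swapping the RF for an SVM regressor or a Kriging (Gaussian process) model changes only step 2; the comparison of the surrogate's estimate against the true accuracy in the final lines mirrors the validation step described in the abstract.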