Ensemble learning is a powerful machine learning strategy that combines multiple models (e.g., classifiers) to achieve predictions beyond what any single model can. Traditional ensemble methods typically use only a single layer of models, which limits how thoroughly the information in the classifiers' predictions can be exploited. In contrast, the rise of deep learning has introduced multi-layer architectures that learn complex functions by transforming data into multiple levels of representation. This characteristic suggests that multi-layer ensembles may outperform single-layer ensembles. However, in subsequent layers not all inputs to a classifier are beneficial, and undesirable inputs can degrade performance. In this paper, we introduce COME, a novel multi-layer ensemble of classifiers in which each classifier at a given layer is connected to multiple classifiers in the previous layer. Each connection indicates that the output of a previous-layer classifier is used as an input for training the current layer's classifier. Because each classifier can connect to a different subset of classifiers in the previous layer, the inputs at every layer can be selected optimally. We propose a binary encoding scheme to represent the topology of the proposed multi-layer ensemble, i.e., the connections between layers, and use Differential Evolution, a popular evolutionary computation method, as the optimisation algorithm to search for the optimal set of connections. Experimental results on 30 datasets from the UCI Machine Learning Repository and OpenML demonstrate that our proposed ensemble outperforms many state-of-the-art ensemble learning algorithms.
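To make the connection-encoding idea concrete, the following is a minimal sketch, not the paper's implementation: it assumes a single previous layer whose class-probability outputs are stacked as columns of `prev_outputs`, encodes each candidate connection as a gene in [0, 1] thresholded at 0.5, and uses SciPy's Differential Evolution to minimise the cross-validated error of one current-layer classifier trained on the selected inputs. The specific models, the fitness definition, and the use of training data for both layers are illustrative simplifications.

```python
# Hedged sketch of binary connection encoding optimised with Differential
# Evolution. All model choices and the fitness function are assumptions,
# not the COME algorithm as specified in the paper.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Previous layer: class-probability outputs of three diverse base classifiers.
base_models = [
    LogisticRegression(max_iter=1000),
    DecisionTreeClassifier(random_state=0),
    GaussianNB(),
]
prev_outputs = np.column_stack(
    [m.fit(X, y).predict_proba(X)[:, 1] for m in base_models]
)

def fitness(genes):
    """Decode a real-valued genome into a binary connection mask and
    return the cross-validated error of the current-layer classifier."""
    mask = genes > 0.5                  # binary encoding of connections
    if not mask.any():                  # no connections: worst possible fitness
        return 1.0
    inputs = prev_outputs[:, mask]      # selected previous-layer outputs
    score = cross_val_score(
        LogisticRegression(max_iter=1000), inputs, y, cv=3
    ).mean()
    return 1.0 - score                  # DE minimises, so optimise the error

# One gene per potential connection from the previous layer.
bounds = [(0.0, 1.0)] * prev_outputs.shape[1]
result = differential_evolution(fitness, bounds, maxiter=20, seed=0)
print("selected connections:", result.x > 0.5)
print("cross-validated error:", result.fun)
```

In a full multi-layer ensemble, the genome would concatenate one such mask per classifier per layer, so the search space grows with both the number of layers and the number of classifiers per layer.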