A hybrid neural network architecture is investigated for classification purposes. The proposed hybrid is based on the multilayer perceptron (MLP) network; in addition to the usual hidden layers, the first hidden layer is chosen to be an adaptive centroid layer. Each unit in this new layer incorporates a centroid vector located in the space spanned by the input variables, and its output is the Euclidean distance between the centroid vector and the inputs. The centroid layer resembles the hidden layer of radial basis function (RBF) networks, so the proposed design can be regarded as a hybrid of the MLP and RBF networks. The presented benchmark experiments demonstrate that the proposed hybrid can provide significant advantages over standard MLPs in terms of fast and efficient learning and a compact network structure.
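
The described architecture can be sketched as follows in NumPy; this is a minimal illustration under assumed details (class names, layer sizes, sigmoid activations, and random initialization are illustrative, not taken from the paper):

```python
import numpy as np

class CentroidLayer:
    """First hidden layer: each unit holds a centroid vector in the
    input space and outputs the Euclidean distance to the input."""
    def __init__(self, n_inputs, n_units, seed=0):
        rng = np.random.default_rng(seed)
        # One centroid per unit, living in the input space.
        self.centroids = rng.standard_normal((n_units, n_inputs))

    def forward(self, x):
        # Distance from input x to every centroid (not a dot product).
        return np.linalg.norm(self.centroids - x, axis=1)

class HybridMLP:
    """Centroid layer followed by an ordinary MLP hidden/output stack."""
    def __init__(self, n_inputs, n_centroids, n_hidden, n_classes, seed=1):
        self.centroid = CentroidLayer(n_inputs, n_centroids)
        rng = np.random.default_rng(seed)
        self.W1 = rng.standard_normal((n_hidden, n_centroids)) * 0.1
        self.W2 = rng.standard_normal((n_classes, n_hidden)) * 0.1

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, x):
        d = self.centroid.forward(x)      # RBF-like distance features
        h = self._sigmoid(self.W1 @ d)    # standard MLP hidden layer
        return self._sigmoid(self.W2 @ h) # class scores in (0, 1)
```

In a full implementation, the centroid vectors would be trained jointly with the weights (hence "adaptive"), for example by backpropagating the classification error through the distance computation.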