Recurrence microstates are obtained from the cross recurrence of two sequences of values embedded in a time series, generalizing the concept of recurrence of a given state in phase space. The probability of occurrence of each microstate constitutes a recurrence quantifier, and the set of probabilities of all microstates can detect even small changes in the data pattern, making it an ideal tool for generating features for machine learning algorithms. Owing to this sensitivity, the set of microstate probabilities can be used to feed a deep neural network, namely a microstate multi-layer perceptron (MMLP), to classify parameters of chaotic systems. We show that the accuracy of the MMLP increases with the number of microstates, indicating that increasing the size and number of microstates adds new and independent information to the analysis. We also explore potential applications of the proposed method when adapted to different contexts.
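As an illustrative sketch only, and not the authors' exact procedure, the snippet below estimates the occurrence probabilities of small recurrence microstates from the ordinary recurrence matrix of a scalar series; the function name microstate_probabilities, the threshold eps, the microstate size n, and the number of random samples are assumptions made for the example, whereas the paper builds microstates from the cross recurrence of two embedded sequences before feeding the probability vector to the MMLP.

```python
import numpy as np

def microstate_probabilities(x, eps=0.1, n=2, n_samples=10_000, seed=None):
    """Estimate occurrence probabilities of n-by-n recurrence microstates
    sampled at random positions of the recurrence matrix of series x."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    N = len(x)

    # Recurrence matrix: R[i, j] = 1 if |x[i] - x[j]| < eps, else 0.
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

    # Sample n-by-n submatrices and encode each one as an integer,
    # reading its n*n binary entries as a base-2 number.
    counts = np.zeros(2 ** (n * n), dtype=int)
    weights = 2 ** np.arange(n * n)
    for _ in range(n_samples):
        i, j = rng.integers(0, N - n, size=2)
        block = R[i:i + n, j:j + n].ravel()
        counts[block @ weights] += 1

    # Probability of each of the 2**(n*n) possible microstates:
    # this vector is the kind of feature set the text describes.
    return counts / n_samples

# Usage example: a logistic-map series (r = 3.9) as a toy chaotic signal.
x = np.empty(2000)
x[0] = 0.4
for t in range(1999):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])
p = microstate_probabilities(x, eps=0.1, n=2, seed=0)
print(p)  # 16-component probability vector, sums to 1
```

In such a sketch, larger n enlarges the feature vector (2 to the power n*n components), which is consistent with the observation that bigger and more numerous microstates supply additional, independent information to the classifier.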