Object recognition tests are widely used in neuroscience to assess memory function in rodents. Despite the experimental simplicity of the task, interpreting which behavioural features count as object exploration can be complicated. Object exploration is therefore often scored manually, which is time-consuming and variable across researchers. Current software based on tracking points often lacks the precision to capture complex ethological behaviour, and switching or losing tracking points can bias outcome measures. To overcome these limitations we developed “EXPLORE”, a simple, ready-to-use and open-source pipeline. EXPLORE consists of a convolutional neural network, trained in a supervised manner, that extracts features from images and classifies the behaviour of rodents near a presented object. EXPLORE achieves human-level accuracy in identifying and scoring exploration behaviour and outperforms commercial software with higher precision, greater versatility and lower time investment, particularly in complex situations. By labelling their own training data set, users decide for themselves which types of interaction with an object are included or excluded, ensuring a precise analysis of exploration behaviour. A set of graphical user interfaces (GUIs) provides beginning-to-end analysis of object recognition tests, enabling fast and reproducible data analysis without the need for expertise in programming or deep learning.
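The core idea described above — extracting image features with convolutions and thresholding a pooled score to classify each frame as "exploring" or "not exploring" — can be illustrated with a minimal NumPy sketch. This is a toy stand-in, not the EXPLORE implementation: the kernel, frame sizes, threshold, and the function names `conv2d` and `classify_frame` are all hypothetical, and a real supervised CNN would learn many such kernels from labelled frames rather than use a fixed one.

```python
import numpy as np

def conv2d(frame, kernel):
    """Valid-mode 2D convolution: the feature-extraction step of a CNN layer."""
    kh, kw = kernel.shape
    h, w = frame.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(frame[i:i + kh, j:j + kw] * kernel)
    return out

def classify_frame(frame, kernel, threshold=1.0):
    """Toy per-frame classifier: convolve, ReLU, global-average-pool, threshold.

    A trained network would replace the hand-picked kernel and threshold
    with learned parameters; the data flow is the same.
    """
    features = np.maximum(conv2d(frame, kernel), 0.0)  # ReLU non-linearity
    score = features.mean()                            # global average pooling
    return bool(score > threshold)

# Hypothetical frames: a bright blob (animal near the object) vs. an empty arena.
kernel = np.ones((3, 3)) / 9.0           # simple averaging filter as a stand-in feature
busy = np.zeros((8, 8)); busy[2:6, 2:6] = 10.0
empty = np.zeros((8, 8))
print(classify_frame(busy, kernel))      # True
print(classify_frame(empty, kernel))     # False
```

In practice, EXPLORE applies such a classifier frame by frame to video of the test arena, so summing the positively classified frames yields the total exploration time per object.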