Why would we want to develop artificial human-like arithmetical intelligence, when computers already outperform humans in arithmetical calculations? Aside from the fact that arithmetic consists of much more than mere calculations, one suggested reason is that AI research can help us explain the development of human arithmetical cognition. Here I argue that this question must already be studied in the context of basic, non-symbolic, numerical cognition. Analyzing recent machine learning research on artificial neural networks, I show how AI studies could potentially shed light on the development of human numerical abilities, from the proto-arithmetical abilities of subitizing and estimating to counting procedures. Although the current results are far from conclusive and much more work is needed, I argue that AI research should be included in the interdisciplinary toolbox when we try to explain the development and character of numerical cognition and arithmetical intelligence. This also makes it relevant to the epistemology of mathematics.