A general mathematical description of how the brain sequentially encodes knowledge remains elusive. We propose a linear solution for serial learning tasks, based on the concept of mixed selectivity in high-dimensional neural state spaces. In our framework, neural representations of items in a sequence are projected along a “geometric” mental line learned through classical conditioning. The model successfully solves serial position tasks and explains behaviors observed in humans and animals during transitive inference tasks, despite noisy sensory input and stochastic neural activity. This approach extends to recurrent neural networks performing motor decision tasks, where the same geometric mental line correlates with motor plans and modulates network activity according to the symbolic distance between items. Serial ordering is thus predicted to emerge as a monotonic mapping between sensory input and behavioral output, highlighting a possible pivotal role for motor-related associative cortices in transitive inference tasks.
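The core idea above — projecting high-dimensional, mixed-selective item representations onto a learned one-dimensional mental line, which then supports transitive inference and a symbolic distance effect — can be sketched in a few lines of code. This is an illustrative toy, not the authors' implementation: the random item vectors, the rank-graded reward schedule, and the delta-rule (Rescorla–Wagner-style) update are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, dim = 7, 200

# Assumed mixed-selectivity code: each item is a random high-dimensional vector
X = rng.normal(size=(n_items, dim)) / np.sqrt(dim)

# Conditioning targets (assumption): reward magnitude decreases with rank
targets = np.linspace(1.0, -1.0, n_items)

# Delta-rule learning of a linear readout w (the "mental line")
w = np.zeros(dim)
lr = 0.1
for _ in range(500):
    for x, t in zip(X, targets):
        w += lr * (t - w @ x) * x  # prediction-error update

# Projection of each item onto the learned mental line
proj = X @ w

# Transitive inference: all pairs are ordered correctly along the line
correct = all(proj[i] > proj[j]
              for i in range(n_items) for j in range(i + 1, n_items))
print("ranks ordered correctly:", correct)

# Symbolic distance effect: separation on the line grows with rank distance
def gap(d):
    return float(np.mean([proj[i] - proj[i + d] for i in range(n_items - d)]))

print([round(gap(d), 3) for d in range(1, n_items)])
```

In this sketch the monotonic mapping from sensory input to behavioral output falls out of the geometry: choosing the item with the larger projection decides any pair, including pairs never seen during learning, and pairs farther apart in rank are separated by a larger margin on the line.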