Abstract Game development often requires a multidisciplinary team, demands substantial time and budget, and typically yields a limited amount of game content (e.g., levels). Procedural Content Generation (PCG) can remedy some of these problems by automating the creation of content such as levels and graphics, both during development and at play time. However, little research has examined how PCG influences players, especially in Digital Math Games (DMG). This article addresses this gap by investigating players' interactions with a DMG that uses PCG, testing the hypothesis that this intervention can provide experiences as good as those from human-designed content. To accomplish this goal, an A/B test was performed in which the only difference between versions was that one (static, N = 242) had human-designed levels, whereas the other (dynamic, N = 265) provided procedurally generated levels. To validate the approach, a two-sample experiment was designed in which each sample played a single version and then self-reported their experiences through questionnaires. We contribute by showing how participants' interactions with a DMG are reported in terms of (1) fun, (2) willingness to play the game again, and (3) curiosity, in addition to how they (4) describe their experiences. Our findings show that the samples' experiences did not significantly differ on these four metrics, but did differ in in-game performance. We discuss factors that might have influenced players' experiences, in terms of participants' performances and demographic attributes, and how our findings contribute to human-computer interaction.