Random noise in information-processing systems is widely seen as detrimental to function. Yet despite the large trial-to-trial variability of neural activity, humans show a remarkable ability to adapt to uncertainty during goal-directed behavior. The origin of this cognitive ability, constitutive of general intelligence, remains elusive. Here, we show that moderate levels of computation noise in artificial neural networks promote zero-shot generalization for decision-making under uncertainty. Unlike networks featuring noise-free computations, but like human participants tested on similar decision problems (ranging from probabilistic reasoning to reversal learning), noisy networks exhibit behavioral hallmarks of optimal inference in uncertain conditions entirely unseen during training. Computation noise enables this cognitive ability jointly through "structural" regularization of network weights during training and "functional" regularization by shaping the stochastic dynamics of network activity after training. Together, these findings indicate that human cognition may ride on neural variability to support adaptive decisions under uncertainty without extensive experience or engineered sophistication.
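To make the notion of "computation noise" concrete, the sketch below shows one common way such noise can be injected into a recurrent network: independent Gaussian noise added to each hidden-state update, at training time and at test time alike. The vanilla tanh recurrence, the noise level `sigma`, and all variable names are illustrative assumptions for exposition, not the specific architecture or parameters used in this study.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_rnn_step(h, x, W_h, W_x, b, sigma=0.1, rng=rng):
    """One recurrent update with additive Gaussian computation noise.

    h     : current hidden state, shape (n_hidden,)
    x     : current input, shape (n_input,)
    sigma : standard deviation of the computation noise (illustrative value);
            sigma = 0 recovers a noise-free network.
    """
    pre_activation = W_h @ h + W_x @ x + b
    # Noise is injected into the computation itself (not only the input),
    # so identical inputs produce variable trajectories across trials.
    noise = sigma * rng.standard_normal(h.shape)
    return np.tanh(pre_activation + noise)

# Usage: run two trials on identical inputs and observe the
# trial-to-trial variability of the hidden dynamics.
n_hidden, n_input, n_steps = 8, 2, 5
W_h = 0.5 * rng.standard_normal((n_hidden, n_hidden))
W_x = rng.standard_normal((n_hidden, n_input))
b = np.zeros(n_hidden)
inputs = rng.standard_normal((n_steps, n_input))

for trial in range(2):
    h = np.zeros(n_hidden)
    for x in inputs:
        h = noisy_rnn_step(h, x, W_h, W_x, b, sigma=0.1)
    print(f"trial {trial}: final hidden state (first 3 units) = {h[:3]}")
```

Under this reading, keeping the noise on during training plays the "structural" role described above (a regularizer acting on the learned weights), while keeping it on after training plays the "functional" role (stochastic activity dynamics at decision time); the sketch is only meant to illustrate that distinction, not to reproduce the trained networks analyzed in the paper.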