The brain is constrained not only by the energy needed to fuel computation, but also by the energy needed to form memories. Experiments have shown that learning simple conditioning tasks, which might require only a few synaptic updates, already carries a significant metabolic cost. Yet learning a task like MNIST to 95% accuracy appears to require at least 10^8 synaptic updates. The brain has therefore likely evolved to learn using as little energy as possible. We explored the energy required for learning in feedforward neural networks. Based on a parsimonious energy model, we propose two plasticity-restricting algorithms that save energy: 1) only modify synapses with large updates, and 2) restrict plasticity to subsets of synapses that form a path through the network. In biology, networks are often much larger than the task requires, yet vanilla backprop prescribes updating all synapses. In this case in particular, large savings can be achieved at the cost of only slightly longer learning times. Thus, competitively restricting plasticity helps to save the metabolic energy associated with synaptic plasticity. These results might lead to a better understanding of biological plasticity and a better match between artificial and biological learning. Moreover, the algorithms might benefit hardware, because electronic memory storage is also energetically costly.
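The first strategy, updating only the synapses with the largest proposed changes, can be illustrated with a minimal sketch. The function name `sparse_update` and the parameters `lr` and `frac` are illustrative assumptions, not the paper's actual implementation; the sketch simply masks a gradient step so that only the top fraction of updates (by magnitude) is applied.

```python
import numpy as np

def sparse_update(W, grad, lr=0.1, frac=0.1):
    """Apply a gradient step only to the fraction `frac` of synapses
    with the largest update magnitudes; all other weights stay frozen,
    saving the (metabolic or memory-write) cost of small updates.
    Illustrative sketch, not the authors' exact algorithm."""
    k = max(1, int(frac * grad.size))
    # Threshold at the k-th largest |gradient| value
    thresh = np.partition(np.abs(grad).ravel(), -k)[-k]
    mask = np.abs(grad) >= thresh  # synapses allowed to change
    return W - lr * grad * mask, int(mask.sum())

# Toy usage: a random layer and gradient
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))
grad = rng.normal(size=(4, 4))
W_new, n_updated = sparse_update(W, grad, frac=0.25)
print(f"updated {n_updated} of {W.size} synapses")
```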