Hebbian-type learning rules, which underlie learning and refinement of neuronal connectivity, postulate input specificity of synaptic changes. However, theoretical analyses have long appreciated that additional mechanisms, not restricted to activated synapses, are needed to counteract the positive feedback imposed by Hebbian-type rules on synaptic weight changes and to achieve stable operation of learning systems. The biological basis of such mechanisms has remained elusive. Here we show that, in layer 2/3 pyramidal neurons from slices of rat visual cortex, synaptic changes induced at individual synapses by spike timing-dependent plasticity do not strictly follow the input specificity rule. Spike timing-dependent plasticity is accompanied by changes in unpaired synapses: heterosynaptic plasticity. The direction of heterosynaptic changes is weight-dependent, with balanced potentiation and depression, so that the total synaptic input to a cell remains preserved despite potentiation or depression of individual synapses. Importantly, this form of heterosynaptic plasticity is induced at unpaired synapses by the same pattern of postsynaptic activity that induces homosynaptic changes at paired synapses. In computer simulations, we show that experimentally observed heterosynaptic plasticity can indeed serve the theoretically predicted role of robustly preventing runaway dynamics of synaptic weights and activity. Moreover, it endows model neurons and networks with essential computational features: enhancement of synaptic competition, facilitation of the development of specific intrinsic connectivity, and the ability to relearn. We conclude that heterosynaptic plasticity is an inherent property of plastic synapses, crucial for the normal operation of learning systems.

We show that spike timing-dependent plasticity in L2/L3 pyramids from rat visual cortex is accompanied by plastic changes in unpaired synapses.
These heterosynaptic changes are weight-dependent and balanced: individual synapses expressed significant LTP or LTD, but the average over all synapses did not change. Thus, the rule of input specificity breaks down at individual synapses but holds for responses averaged over many inputs. In model neurons and networks, this experimentally characterized form of heterosynaptic plasticity prevents runaway dynamics of synaptic weights and activity, enhances synaptic competition, facilitates the development of specific intrinsic connectivity, and enables relearning. This new form of heterosynaptic plasticity represents the cellular basis of a theoretically postulated mechanism that operates in addition to Hebbian-type rules and is necessary for the stable operation of learning systems.
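The key property described above, homosynaptic change at the paired synapse accompanied by weight-dependent heterosynaptic changes that leave the total synaptic input preserved, can be illustrated with a minimal numerical sketch. This is not the authors' simulation code; the learning rate `eta_het`, the initial weight range, and the mean-drift form of the weight dependence (strong unpaired synapses depress, weak ones potentiate) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pairing_with_heterosynaptic(w, paired, dw_homo, eta_het=0.5):
    """One plasticity episode: a homosynaptic change dw_homo at the
    paired synapse, plus weight-dependent heterosynaptic changes at
    all unpaired synapses that keep the total input constant."""
    w = np.asarray(w, dtype=float).copy()
    total_before = w.sum()

    # Homosynaptic (Hebbian) change at the paired synapse
    w[paired] += dw_homo

    unpaired = np.ones(len(w), dtype=bool)
    unpaired[paired] = False

    # Weight-dependent heterosynaptic rule: each unpaired synapse
    # drifts toward the mean of the unpaired weights, so strong
    # synapses depress and weak ones potentiate (balanced changes)
    mean_u = w[unpaired].mean()
    w[unpaired] += eta_het * (mean_u - w[unpaired])

    # Distribute the residual so the total synaptic input is preserved
    w[unpaired] += (total_before - w.sum()) / unpaired.sum()
    return np.clip(w, 0.0, None)

w0 = rng.uniform(0.5, 1.5, size=20)   # 20 synapses, arbitrary units
w1 = pairing_with_heterosynaptic(w0, paired=3, dw_homo=0.6)
```

After one episode, the paired synapse is potentiated and individual unpaired weights have changed, yet `w1.sum()` equals `w0.sum()`: input specificity fails at single synapses while the summed input is conserved, which is the stabilizing property exploited in the simulations.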