Synaptic plasticity plays a critical role in the expressive power of brain neural networks. Among diverse plasticity rules, synaptic scaling is indispensable for maintaining homeostasis and regulating synaptic strength. In current modeling of brain-inspired spiking neural networks (SNNs), backpropagation through time is widely adopted because it achieves high performance with a small number of time steps; nevertheless, the synaptic scaling mechanism has received little attention. In this work, we propose an experience-dependent adaptive synaptic scaling mechanism (AS-SNN) for spiking neural networks. The learning process has two stages. First, in the forward path, adaptive short-term potentiation or depression is triggered for each synapse according to the afferent stimulus intensity accumulated from presynaptic historical neural activity. Second, in the backward path, long-term consolidation is executed through gradient signals regulated by the corresponding scaling factor. This mechanism shapes the pattern selectivity of synapses and the information transfer they mediate. We theoretically prove that the proposed adaptive synaptic scaling function is a contraction mapping and converges to the expected fixed point, and we obtain state-of-the-art results on three tasks covering perturbation resistance, continual learning, and graph learning. Specifically, on the perturbation resistance and continual learning tasks, our approach improves accuracy on the N-MNIST benchmark over the baseline by 44% and 25%, respectively. The expected firing-rate recovery and sparse coding are observed in graph learning. Extensive ablation studies and cost evaluations evidence the effectiveness and efficiency of our nonparametric adaptive scaling method, demonstrating the great potential of SNNs in continual and robust learning.
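The two-stage mechanism can be summarized in a minimal sketch. The PyTorch code below is an illustrative rendering under stated assumptions, not the paper's exact formulation: the exponential presynaptic trace, the decay constant tau, the target rate rho, and the bounded scaling form exp(rho - trace) are all hypothetical choices introduced for the example. It shows only the two ideas named in the abstract: per-synapse scaling of the forward signal, and regulation of the weight gradient by the same scaling factor in the backward path.

```python
# Minimal sketch of an adaptive synaptic scaling mechanism (assumed forms).
# The trace, tau, rho, and the scaling function are illustrative assumptions.
import torch

class ScaledSynapse(torch.autograd.Function):
    """Forward: scale each synapse by its factor s (short-term potentiation
    or depression).  Backward: regulate the weight gradient by the same s
    (long-term consolidation)."""

    @staticmethod
    def forward(ctx, x, w, s):
        ctx.save_for_backward(x, w, s)
        return x @ (w * s)                    # per-synapse scaled weights

    @staticmethod
    def backward(ctx, grad_out):
        x, w, s = ctx.saved_tensors
        grad_w = (x.t() @ grad_out) * s       # gradient regulated by scaling factor
        grad_x = grad_out @ (w * s).t()
        return grad_x, grad_w, None           # no gradient flows into s itself


def update_scale(trace, pre_spikes, tau=0.9, rho=0.1):
    # Accumulate presynaptic activity into an exponential trace and map it
    # to a bounded scaling factor (assumed form: depress persistently busy
    # inputs, potentiate quiet ones).
    trace = tau * trace + (1.0 - tau) * pre_spikes.mean(dim=0)
    s = torch.exp(rho - trace).clamp(0.5, 2.0)
    return trace, s


# Toy usage: 8 presynaptic neurons, 4 postsynaptic, 16 time steps of spikes.
w = torch.randn(8, 4, requires_grad=True)
trace = torch.zeros(8)
spikes = (torch.rand(16, 8) < 0.2).float()
trace, s = update_scale(trace, spikes)
out = ScaledSynapse.apply(spikes, w, s.unsqueeze(1))  # broadcast s over outputs
out.sum().backward()
print(w.grad.shape)  # torch.Size([8, 4])
```

Keeping the scaling factor outside the autograd graph reflects the nonparametric character of the mechanism: the factor modulates both the transmitted signal and the weight gradient but is never itself optimized. The clamp in this sketch only crudely bounds the factor for stability; in the actual formulation, convergence follows from the contraction-mapping property proved in the paper.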