Biological constraints often impose restrictions on plasticity rules, such as locality and reward-based rather than supervised learning. Two learning rules that comply with these restrictions are weight perturbation (WP) and node perturbation (NP). NP is often used in learning studies, in particular as a benchmark; it is considered superior to WP and more likely to be neurobiologically realized, because the number of weights, and therefore their perturbation dimension, typically massively exceeds the number of nodes. Here, we show that this conclusion no longer holds when we take into account two properties that are relevant for learning in biological and artificial neural networks. First, tasks extend in time and/or are trained in batches; this increases the perturbation dimension of NP but not of WP. Second, tasks are comparably low dimensional, so that many weight configurations provide solutions. We analytically delineate regimes in which these properties let WP perform as well as or better than NP. Furthermore, we find that the weight changes along task-irrelevant directions differ qualitatively between WP and NP, and that only for WP does gathering batches of subtasks within a trial decrease the number of required trials. This may allow one to experimentally distinguish which of the two rules underlies a learning process. Our insights suggest new learning rules that combine, for specific task types, the advantages of WP and NP; for example, temporally correlated perturbations improve NP if the inputs are similarly correlated. Using numerical simulations, we generalize the results to networks with various architectures solving biologically relevant and standard network learning tasks.
Our findings, together with WP’s practicability, suggest WP as a useful benchmark and plausible model for learning in the brain.

Received 9 December 2021; Revised 9 September 2022; Accepted 23 January 2023
DOI: https://doi.org/10.1103/PhysRevX.13.021006
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.

Physics Subject Headings (PhySH)
Research Areas: Evolving networks; Fluctuations & noise; Neuronal networks; Neuroplasticity
Physical Systems: Artificial neural networks; Brain network; Learning; Networks
Statistical Physics; Biological Physics
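The structural difference between the two rules described in the abstract can be sketched in a few lines of NumPy: WP draws one perturbation per weight for a whole trial, so its perturbation dimension is fixed by the number of weights, whereas NP draws fresh node noise at every time step, so its perturbation dimension grows with the trial length T. The toy task, network size, perturbation scale `sigma`, and learning rate `eta` below are illustrative choices for a minimal linear example, not the paper's actual setup or constants.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy task (not the paper's setup): a linear readout y = W x
# trained to reproduce a random teacher W_star over a trial of T time steps.
n_in, n_out, T = 10, 2, 5
W_star = rng.normal(size=(n_out, n_in))
X = rng.normal(size=(n_in, T))          # inputs for the whole trial
Y_star = W_star @ X                     # target outputs

def loss(W):
    return 0.5 * np.mean((W @ X - Y_star) ** 2)

sigma, eta = 1e-3, 0.01  # hypothetical perturbation scale and learning rate

def wp_step(W):
    # Weight perturbation: one noise value per weight, held fixed over the
    # whole trial, so the perturbation dimension is n_out * n_in for any T.
    xi = rng.normal(size=W.shape)
    dE = loss(W + sigma * xi) - loss(W)
    return W - eta * (dE / sigma) * xi

def np_step(W):
    # Node perturbation: independent noise on each output node at each time
    # step, so the perturbation dimension is n_out * T and grows with T.
    Xi = rng.normal(size=(n_out, T))
    dE = 0.5 * np.mean((W @ X + sigma * Xi - Y_star) ** 2) - loss(W)
    return W - eta * (dE / sigma) * (Xi @ X.T)  # noise-input correlation

W_wp = np.zeros((n_out, n_in))
W_np = np.zeros((n_out, n_in))
for _ in range(20000):
    W_wp = wp_step(W_wp)
    W_np = np_step(W_np)

print(loss(W_wp), loss(W_np))  # both end up well below the initial loss
```

Both updates are stochastic gradient estimates: the reward change `dE` caused by the perturbation is correlated with the perturbation itself (for NP, additionally with the inputs, an eligibility-trace-like factor). Averaged over noise draws, each step points along the negative loss gradient; the rules differ in the dimension of the noise over which this average must be taken.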