We explored the dependence of experimental bedrock erosion rate on shear stress, bed load sediment flux, alluvial bed cover, and evolving channel morphology. We isolated these variables experimentally by systematically varying gravel sediment flux Qs and water discharge Qw in a laboratory flume, gradually abrading weak concrete "bedrock." All else held constant, we found that (1) erosion rate was insensitive to flume-averaged shear stress, (2) erosion rate increased linearly with sediment flux, (3) erosion rate decreased linearly with the extent of alluvial bed cover, and (4) the spatial distribution of bed cover was sensitive to local bed topography, but the extent of cover increased with Qs/Qt (where Qt is flume-averaged transport capacity) once critical values of bed roughness and sediment flux were exceeded. Starting from a planar geometry, erosion increased bed roughness due to feedbacks between preferential sediment transport through interconnected topographic lows, focused erosion along these zones of preferential bed load transport, and local shear stresses that depended on the evolving bed morphology. Finally, continued growth of bed roughness was inhibited by imposed variability in discharge and sediment flux, due to changes in spatial patterns of alluvial deposition and impact wear. Erosion was preferentially focused at lower bed elevations when the bed was cover-free, but was focused at higher bed elevations when static alluvial cover filled topographic lows. Natural variations in discharge and sediment flux may thus stabilize and limit the growth of roughness in bedrock channels due to the effects of partial bed cover.
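The linear "tools" effect (erosion increasing with sediment flux) and linear "cover" effect (erosion decreasing with the covered bed fraction) described above can be combined in a minimal numerical sketch. Everything here is an illustrative assumption, not the authors' fitted model: the erodibility constant k, the critical flux qs_crit, and the assumed linear growth of cover with Qs/Qt above that threshold are hypothetical parameter choices.

```python
# Minimal sketch of combined tools and cover effects on bedrock
# erosion rate. All parameter values and functional forms are
# illustrative assumptions, not results from the experiments.

def cover_fraction(qs, qt, qs_crit=0.1):
    """Fraction of the bed under static alluvial cover.

    Assumed zero below a critical sediment flux qs_crit, then
    growing linearly with qs/qt and capped at full cover (= 1),
    consistent with cover increasing with Qs/Qt past a threshold.
    """
    if qt <= 0.0 or qs <= qs_crit:
        return 0.0
    return min(1.0, (qs - qs_crit) / qt)

def erosion_rate(qs, qt, k=1.0, qs_crit=0.1):
    """Erosion rate with a linear tools effect (rate grows with
    sediment flux qs) damped by a linear cover effect (rate falls
    with the covered fraction of the bed)."""
    return k * qs * (1.0 - cover_fraction(qs, qt, qs_crit))

# Erosion first rises with qs (tools effect), then is progressively
# damped as alluvial cover expands toward full capacity (qs -> qt).
rates = [erosion_rate(qs, qt=1.0) for qs in (0.05, 0.3, 0.6, 0.9)]
```

In this sketch, erosion vanishes once cover is complete, mirroring the observation that static alluvial cover in topographic lows shifts wear onto higher bed elevations and limits further roughness growth.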