Abstract

Prey often induce antipredator behaviors when balancing food acquisition against safety. The starvation–predation hypothesis (SPH) posits that, during food shortages, the risk of starvation requires prey to forego antipredator behavior to increase feeding rates. Such shifts in antipredator behavior may further increase the risk of predation and therefore kill rates by predators. We tested the SPH and its consequences for kill rates in a single large prey, single large predator system. In the Argentine Andes, we evaluated whether risk avoidance by vicuñas (Vicugna vicugna) decreased during periods of food scarcity. From three years of GPS relocations collected simultaneously from vicuñas and pumas (Puma concolor), resource selection functions revealed that vicuñas increased their exposure to pumas during nongrowing seasons by reducing their avoidance of canyons and increasing their selection for meadows, both of which offer more food of higher quality than relatively safe plains. However, despite vicuñas becoming more risk-prone during nongrowing seasons, kill rates by pumas did not differ between growing and nongrowing seasons. Contrary to evidence from mesocosm experiments, relaxation of antipredator behavior by prey did not translate into increased kill rates by predators. Our results enhance understanding of the interplay between food limitation and predator–prey interactions within ecosystems and may improve ecologists' ability to predict when and where behaviorally mediated trophic cascades are more likely to occur.
