Abstract

Predictions generated from optimality models inescapably rest on a number of assumptions, and the predictive value of such models is often determined by how well the behavior of an organism fits the assumptions underlying the model. I analyzed optimal diet choice by relaxing two sets of assumptions made in previous optimality models.

(1) Foraging-bout length (the uninterrupted time devoted solely to foraging), generally treated as infinitely long, was shown to affect optimal diet choice. For many foragers, bout length may be considerably shortened by the presence of predators, or by physical or social features of the forager's environment. A model was derived that incorporates a short bout length into the diet-choice decision. The model predicts that animals should become more catholic in their diet choice as the amount of uninterrupted foraging time decreases (illustrated in the first sketch below). This prediction appears to be supported by three studies from the literature. Jaeger et al. (1981) showed that salamanders incorporated more lower-ranked prey (small flies) when they were on the territory of a conspecific, or on no territory, than when they were on their own territory; foraging time was uninterrupted when the salamanders were feeding selectively, but continually interrupted by submissive and marking behavior when no diet choice was exhibited. Freed (1981) showed that wrens foraging for nestlings spent less time per foraging bout when a predator was in the nesting area than when no predator was in sight, and the reduction in bout time correlated with a reduction in the size of prey fed to the young. Finally, the foraging time of some intertidal snails was shown to be confined by the length of the low-tide cycle (Menge 1974): as the end of the low tide drew near, the snails became less selective. Thus, as the time remaining for foraging decreased, the predator exhibited a lower degree of prey selection.

(2) Variance in prey encounter interval was shown to affect the utility of classical optimal diet models in predicting the optimal diet. Charnov's (1976) model was shown to overestimate the net rate of energy intake when the mean encounter rate varies about some fixed level, and its predictions are therefore incorrect over some ranges of prey encounter rate (illustrated in the second sketch below). I show that as variance in prey encounter rate increases, the time over which the forager estimates prey encounter rate has a strong effect on its ability to maximize the net rate of energy intake. Foragers that hunt patchily distributed prey should estimate prey density over a shorter time window than foragers that hunt evenly dispersed prey. Thus, animals capable of reducing the time required to estimate prey density (for example, predators that hunt by sight in areas of high prey density) should alter their diet in response to local variation in prey density. For this type of forager, fluctuations in the number of prey types in the diet will increase as variance in prey encounter rate increases; as a result, the degree of partial prey preference exhibited by the forager should increase with variance in prey encounter rate.
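To make point (1) concrete, the following is a minimal Monte Carlo sketch, not the model actually derived in the paper. It assumes two prey types encountered as independent Poisson processes and a forager that either specializes on the high-ranked type or takes both. All parameter values, and the convention that energy is credited at the moment of capture, are hypothetical choices made only for illustration. With these numbers the classical infinite-horizon rule favors the specialist diet, yet in short bouts the broader diet yields more energy, matching the prediction that diet breadth should increase as uninterrupted foraging time shrinks.

```python
import random

# Hypothetical two-prey-type setting (values chosen only for illustration):
E1, H1, LAM1 = 100.0, 1.0, 0.1   # high-ranked prey: energy, handling time, encounter rate
E2, H2, LAM2 = 8.0, 1.0, 3.0     # low-ranked prey

def bout_gain(T, take_type2, rng):
    """Energy gained in one foraging bout of length T.

    Encounters follow independent Poisson processes; energy is credited
    at capture (an assumed convention), and handling time then elapses
    before searching resumes.
    """
    t, gain = 0.0, 0.0
    while t < T:
        t1 = rng.expovariate(LAM1)        # waiting time to next type-1 encounter
        t2 = rng.expovariate(LAM2)        # waiting time to next type-2 encounter
        if t1 < t2:                       # type-1 item encountered first
            t += t1
            if t < T:
                gain += E1
                t += H1                   # pay handling time
        else:                             # type-2 item encountered first
            t += t2
            if take_type2 and t < T:
                gain += E2
                t += H2
            # a rejected type-2 item costs only the search time already spent
    return gain

def mean_gain(T, take_type2, n=20_000, seed=0):
    rng = random.Random(seed)
    return sum(bout_gain(T, take_type2, rng) for _ in range(n)) / n

# Classical infinite-horizon rule: exclude type 2 when its profitability
# E2/H2 falls below the rate achievable from type 1 alone, LAM1*E1/(1 + LAM1*H1).
print("long-run rule says specialize:", E2 / H2 < LAM1 * E1 / (1 + LAM1 * H1))
for T in (1.0, 2.0, 10.0, 50.0):
    print(f"T={T:5.1f}  specialist={mean_gain(T, False):7.2f}  "
          f"generalist={mean_gain(T, True):7.2f}")
```

In this sketch the generalist's advantage at small T comes from the risk of ending a short bout empty-handed while waiting for the rare high-ranked prey; as T grows, that boundary risk is amortized away and the specialist's higher long-run rate dominates.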

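Point (2) can be seen as a concavity argument. The single-prey form of Charnov's (1976) rate equation is R(lam) = lam*E / (1 + lam*E's handling term), i.e. lam*E / (1 + lam*H), which is concave in the encounter rate lam; by Jensen's inequality, the rate evaluated at the mean encounter rate therefore exceeds the mean of the rates under a fluctuating encounter rate. The numeric check below uses hypothetical parameter values and an assumed two-point fluctuation in lam; it illustrates the inequality rather than reproducing the paper's analysis.

```python
# Numeric check: evaluating Charnov's single-prey rate at the mean encounter
# rate overestimates the mean achievable rate when the encounter rate
# fluctuates. All values here are hypothetical.
E, H = 10.0, 1.0                   # energy per item and handling time

def rate(lam):
    """Net intake rate at encounter rate lam: lam*E / (1 + lam*H)."""
    return lam * E / (1.0 + lam * H)

lam_lo, lam_hi = 0.2, 1.8          # encounter rate swings about a mean of 1.0
mean_lam = 0.5 * (lam_lo + lam_hi)

print("rate at the mean encounter rate:", rate(mean_lam))                      # 5.00
print("mean rate under fluctuation:   ", 0.5 * (rate(lam_lo) + rate(lam_hi)))  # ~4.05
```

The gap between the two printed values widens as the spread between lam_lo and lam_hi grows, which is one way to see why predictions based on a fixed mean encounter rate can misclassify prey over some ranges of encounter rates.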