Abstract

Pollinators collect resources that are patchy, since flowers are usually aggregated on several spatial scales. Empirical studies have established that pollinators almost invariably visit a smaller proportion of flowers as patch size increases. This has not been adequately explained. Here I present data on the payoff curve achieved by bumblebees, Bombus lapidarius, when visiting patches containing different numbers of inflorescences, and use the marginal value theorem to predict the optimum duration of stay within patches. The data demonstrate that visiting a declining proportion of inflorescences as patch size increases is an optimal strategy, if we assume that bees are attempting to maximise their rate of reward acquisition. I argue that this occurs because searching for the remaining unvisited inflorescences is easier in a small patch. On large patches, bees visited more inflorescences per patch than predicted (although still visiting a declining proportion). I suggest that this may occur because bees are using simple departure rules which result in near‐optimal behaviour. I show that a departure rule based on two successive encounters with empty inflorescences closely predicts observed behaviour.
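The two-successive-empty-inflorescences departure rule described above can be sketched as a small simulation. This is a minimal illustration, not the paper's model: the reward probability, give-up threshold, and trial counts below are hypothetical parameters chosen only to show why such a rule yields a declining proportion of inflorescences visited as patch size grows.

```python
import random

def visit_patch(n_inflorescences, p_reward=0.5, give_up=2, rng=None):
    """Simulate one patch visit: the bee samples inflorescences and
    departs after `give_up` successive empty ones (hypothetical rule
    parameters, following the departure rule described in the abstract)."""
    rng = rng or random.Random()
    visited = 0
    empty_run = 0
    for _ in range(n_inflorescences):
        visited += 1
        if rng.random() < p_reward:
            empty_run = 0       # rewarded: reset the run of empty finds
        else:
            empty_run += 1      # empty: one step closer to departure
            if empty_run >= give_up:
                break
    return visited

def mean_proportion_visited(n, trials=10000, seed=1):
    """Average proportion of a patch's inflorescences visited before departure."""
    rng = random.Random(seed)
    return sum(visit_patch(n, rng=rng) for _ in range(trials)) / (trials * n)
```

Because the expected number of visits under this rule is roughly constant once patches are large, the *proportion* visited falls with patch size, e.g. `mean_proportion_visited(2)` exceeds `mean_proportion_visited(20)`, mirroring the empirical pattern the abstract reports.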
