Pigeons were trained on choice procedures in which responses on each of two keys were reinforced probabilistically, but only after a schedule requirement had been met. Under one arrangement, a fixed-interval choice procedure, responses were not reinforced until the interval had elapsed; then a response on one key was reinforced, with the effective key changing irregularly from interval to interval. Under a second, fixed-ratio choice procedure, responses on either key counted towards completion of the ratio; once the ratio had been completed, a response on the probabilistically selected key produced food. In the first experiment, the schedule requirements were varied for both fixed-interval and fixed-ratio schedules. In the second, relative reinforcement rate was varied. In the third, the duration of an intertrial interval separating choices was varied. Across all three experiments, the results for 11 pigeons showed frequent, large deviations between relative response rates and relative reinforcement rates. Overall performance measures varied considerably across conditions, as did more detailed measures of choice across the schedule requirement. Despite this variability, performance was consistently efficient in producing food across conditions. The absence of matching between behavior allocation and reinforcement rate marks an important difference between the present procedures and other choice procedures, and raises questions about the specific conditions under which matching emerges.
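The fixed-interval choice arrangement described above can be illustrated with a minimal simulation sketch. This is not the authors' procedure or analysis code; the function name, the discrete-trial framing, and the assumption that the simulated subject pecks the two keys at random are all illustrative assumptions introduced here, chosen only to make the contingency concrete: one key is designated effective before each interval, responses during the interval have no scheduled consequence, and the first post-interval response on the effective key produces food.

```python
import random

def run_fi_choice_trials(n_trials=1000, p_left=0.5, responses_per_trial=10, seed=0):
    """Sketch of a discrete-trial fixed-interval choice procedure.

    Before each trial one key is probabilistically designated effective
    (left with probability p_left).  Responses emitted during the interval
    are counted but never reinforced; after the interval elapses, responding
    continues until a response lands on the effective key, which produces
    food and ends the trial.  The random-pecking subject is a hypothetical
    stand-in, not a model of pigeon behavior.
    """
    rng = random.Random(seed)
    responses = {"left": 0, "right": 0}
    reinforcers = {"left": 0, "right": 0}
    for _ in range(n_trials):
        effective = "left" if rng.random() < p_left else "right"
        # Responses during the interval: counted, never reinforced.
        for _ in range(responses_per_trial):
            responses[rng.choice(["left", "right"])] += 1
        # After the interval, the first response on the effective key
        # produces food.
        while True:
            key = rng.choice(["left", "right"])
            responses[key] += 1
            if key == effective:
                reinforcers[key] += 1
                break
    return responses, reinforcers
```

Under this sketch, obtained relative reinforcement rate tracks `p_left` regardless of how responses are allocated during the interval, which is one way to see why response allocation and reinforcement allocation can come apart under such contingencies.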