In 1998, we reported that increasing access time to cocaine (from 1 h to 6 h per day) precipitates a rapid escalation of intake that is associated with a vertical upward shift in the dose–injection curve, without detectable parallel shifts to the right or to the left (Ahmed and Koob 1998, 1999). Zernig et al. (2003) have suggested that this post-escalation vertical shift may result from tolerance to the rate-suppressing effects of the drug (e.g. “sedation, induction of stereotypy, aversion, etc”) and not—as suggested by other researchers—from tolerance to its rewarding effects or sensitization to its incentive effects (references in Zernig et al.).

Unfortunately, Zernig et al. misinterpret our own hypothesis of drug intake escalation and misrepresent it as another version of the sensitization hypothesis of drug addiction. Briefly, according to our reward allostasis hypothesis, prolonged drug exposure induces a chronic decrease in baseline reward sensitivity, a process called reward allostasis (Koob and Le Moal 1997, 2001). As a result, the user becomes both increasingly motivated to seek the reward-facilitating effects of the drug (to avoid anhedonia and recover initial sensitivity) and increasingly tolerant to these effects (due to the shift in baseline sensitivity). In the following comments, we explain why the model offered by Zernig et al. fails to provide a more parsimonious explanation of our data than our reward allostasis hypothesis.

According to the model by Zernig et al., dose–response rate functions for drug self-administration represent the net effect of two antagonistic, S-shaped dose-dependent processes (with the same efficacy but different potency), one increasing, the other suppressing responding for the drug. The resulting dose–response rate curve is an inverted U.
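The two-process model described above can be sketched numerically. In the sketch below, each process is represented by a Hill-type sigmoid; all parameter values (Hill slope, ED50s, efficacy) are illustrative assumptions of ours, not values taken from Zernig et al.

```python
# Sketch of the Zernig et al. two-process model: net response rate is the
# difference between a rate-facilitating and a rate-suppressing S-shaped
# (Hill-type) process with the same efficacy (emax) but different potency
# (ed50). All parameter values here are illustrative assumptions.

def hill(dose, emax, ed50, n=2.0):
    """S-shaped dose-effect function (Hill equation)."""
    return emax * dose**n / (dose**n + ed50**n)

def net_rate(dose, emax=100.0, ed50_facilitate=0.2, ed50_suppress=1.0):
    """Net response rate: the more potent facilitating process minus the
    less potent suppressing process; the difference is an inverted U."""
    return hill(dose, emax, ed50_facilitate) - hill(dose, emax, ed50_suppress)

# Inverted U: the net rate is low on both tails, peaked at intermediate doses.
assert net_rate(0.45) > net_rate(0.05) and net_rate(0.45) > net_rate(10.0)

# Simulating "tolerance" to the suppressing process (raising its ED50)
# raises the curve at intermediate doses, as in Fig. 2C of Zernig et al.
assert net_rate(0.45, ed50_suppress=2.0) > net_rate(0.45)
```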
For Zernig et al., the ascending limb of the dose–response curve “is easy to interpret”: the reinforcing effects of the drug increase with the unit dose and “thus leads to an increase in the rate of responding.” In the model by Zernig et al., simulating tolerance to the rate-suppressing effects of the drug results in both an upward and a rightward shift of the theoretical inverted U (Fig. 2C of Zernig et al.). They consider this complex shift as a theoretical approximation of our experimental data, although we did not observe any shift of the dose–injection curve to the right after escalation of cocaine self-administration (compare Fig. 1C with Fig. 2C of Zernig et al.).

Despite its apparent simplicity, the model by Zernig et al. misses the target. Escalation of cocaine intake and the associated vertical shifts in dose–injection curves were observed under a schedule of continuous reinforcement (Ahmed and Koob 1998, 1999). Under this type of schedule, the dependent variable is the rate of injections, not the (inter-reinforcement) rate of responses, as wrongly assumed by Zernig et al. The rate of injections decreases with the duration of the rewarding effects of the drug, which increases with the unit dose. The resulting dose–injection curves are predominantly characterized by a descending limb that covers the largest and most meaningful range of doses (Fig. 1C, D of Zernig et al.). Ascending limbs are rarely observed and difficult to interpret.

In contrast, the descending limb of the dose–injection curve “is easy to interpret”: it reflects the behavioral regulation that allows the individual user to maintain the cumulative effect of the drug above some set level, at least down to a certain threshold dose, below which the rate of injections breaks down (Fig. 1C of Zernig et al.). At this threshold dose, the required rate of injections is probably higher than the maximum rate

This reply refers to the letter http://dx.doi.org/10.1007/s00213-003-1601-0
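The titration account of the descending limb can be sketched with a minimal model. The sketch below assumes, purely for illustration, that each injection raises the drug level by the unit dose and that the level declines at a constant rate per minute; these are not fitted pharmacokinetic parameters.

```python
# Sketch of behavioral titration on the descending limb under continuous
# reinforcement: the subject re-injects whenever the drug level falls back
# to a set point. Assuming each injection raises the level by unit_dose and
# the level declines at a constant `clearance` per minute (illustrative
# assumptions), the inter-injection interval is unit_dose / clearance min,
# so the rate of injections falls as the unit dose rises.

def injections_per_hour(unit_dose, clearance=0.05, max_rate=100.0):
    """Hourly injection rate needed to hold the drug level at the set
    point; returns 0.0 below the threshold dose where the required rate
    exceeds the maximum rate the subject can emit (responding breaks down)."""
    required = 60.0 * clearance / unit_dose
    return required if required <= max_rate else 0.0

# Descending limb: larger unit doses require fewer injections per hour.
assert injections_per_hour(1.0) < injections_per_hour(0.25)
# Below the threshold dose, the required rate is unattainable.
assert injections_per_hour(0.02) == 0.0
```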