
Learning and Motivation 34 (2003) 325–340 www.elsevier.com/locate/l&m

The effect of type of behavior on behavior change caused by type of upcoming food substance

Jeffrey N. Weatherly,* Emily I.L. Arthur, and Kelsey K. Lang

Department of Psychology, University of North Dakota, Grand Forks, ND 58202-8380, USA

Received 21 January 2003; received in revised form 30 April 2003

Abstract

Previous research suggests that rats will decrease their consumption of a low-valued substance if a high-valued one will soon be available (anticipatory contrast), but will increase their rate of operant responding for a low-valued substance if a high-valued one will soon be available (positive induction). The present experiments tested whether rats would increase their operant rate of licking or lever pressing for 1% liquid-sucrose reinforcement when 32% sucrose reinforcement was upcoming in the same session. Results indicated that upcoming 32% sucrose increased rates of lever pressing for 1% sucrose, but did not produce similar increases in rates of licking. In fact, upcoming 32% sucrose significantly reduced lick rates in Experiment 2. The present results suggest that the different changes in behavior may be linked to the specific response that the subjects must engage in to obtain the reward (i.e., licking vs. lever pressing), and not to the function of the behavior (i.e., consummatory vs. operant) or to how frequently the substances are available (i.e., continuously vs. intermittently).

© 2003 Elsevier Inc. All rights reserved.

* Corresponding author. Fax: 1-701-777-3454. E-mail address: jeff[email protected] (J.N. Weatherly).

0023-9690/$ - see front matter © 2003 Elsevier Inc. All rights reserved.
doi:10.1016/S0023-9690(03)00023-7

The study of animal behavior is of interest for many reasons, one of which is that sometimes animals behave in seemingly counter-intuitive ways. One such example is the seemingly disparate findings of negative consummatory contrast (i.e., anticipatory contrast; Flaherty, 1996) and positive induction. Specifically, numerous studies on anticipatory contrast have shown that when subjects, typically rats, are given the opportunity to consume a substance low in hedonic value, they will consume less of it if they will soon have the opportunity to consume a substance high in hedonic value versus when they will have continued opportunity to consume the low-valued one (e.g., Flaherty & Checke, 1982; see Flaherty, 1996, for a review). Conversely, studies on positive induction have shown that when rats are given the opportunity to press a lever to obtain a substance low in hedonic value, they will press the lever at a higher rate if they will soon have the opportunity to press a lever to obtain a substance high in hedonic value versus when they will have continued opportunity to lever press for the low-valued one (e.g., King, Brandt, & Weatherly, 2002; Weatherly & Moulton, 2001; Weatherly, Stout, McMurry, Rue, & Melville, 1999).

The results of King et al. (2002) further increase the intrigue of these different effects. They demonstrated that both anticipatory contrast and positive induction can be observed in the same experimental subjects using the same appetitive substances. In their Experiment 3, rats experienced a 3-min period in which 1% liquid sucrose could be freely consumed. This period was followed, 15 s later, by a second 3-min period. The second 3-min period allowed free access to 1% sucrose in the baseline conditions and to 32% sucrose or 45-mg food pellets, in different treatment conditions. After training in each condition, consumption of 1% sucrose in the first 3-min period was significantly lower when either 32% sucrose or food pellets would be available in the final 3-min period than when 1% sucrose would again be available. King et al. then had the same subjects lever press for rewards delivered by a random-interval (RI) 30-s schedule during 30-min sessions. The reward delivered in the first half of the session (i.e., 15 min) was always 1% liquid sucrose. The reward in the second half of the session was 1% sucrose in the baseline conditions and 32% sucrose or a 45-mg food pellet, in different treatment conditions. After training in each condition, rates of lever pressing for 1% sucrose in the first half of the session were significantly higher when either 32% sucrose or food pellets would be the reward in the second half of the session than when 1% sucrose would continue to serve as the reward.

It is not immediately clear why these animals decreased their consumption of, but increased their operant response rates for, 1% sucrose when the same substances were upcoming. However, there are several potential explanations. One possibility is that the different changes in behavior occurred because the two behaviors differed in their function (consummatory vs. operant). For example, it is possible that the animals decreased their consumption of 1% sucrose when a substance high in hedonic value was upcoming because doing so allowed maximum intake of the highly valued substance when it became available. On the other hand, they may have increased their rate of operant behavior when a substance high in hedonic value was upcoming because such an increase allowed them to optimally respond for the highly valued substance when it became available. Additionally, some researchers (e.g., Heinrichs, 2001) have argued that ingestive behaviors such as drinking are unlearned (at least in rodents). If so, consummatory vs. operant behavior may differ because the former is unconditioned whereas the latter is conditioned.

A second possibility is that the different changes in behavior occurred because of differences between licking, the dependent measure in studies of anticipatory
contrast, and lever pressing, the dependent measure in studies of positive induction. Measuring these different responses may lead to different results for several reasons. For one, it is possible that these behaviors are differentially sensitive to environmental contingencies. Another possibility is that these behaviors change in different ways because of the rate that animals engage in them (i.e., the baseline rates of responding). Rats can lick at a very high rate (e.g., 10 licks/s). This high rate may create a ceiling effect making decreases in responding (contrast) far more probable than increases (induction). Relative to licking, rats lever press at a low rate. Because of the absence of a potential ceiling effect, rates of lever pressing may be more likely to increase than decrease. Lastly, several recent studies (Clement, Feltus, Kaiser, & Zentall, 2000; Kacelnik & Marsh, 2002) have demonstrated that animals acquire a preference for stimuli that have been associated with high levels of effort. If lever pressing is more effortful than licking, then increases in the rate of responding (such as those that occur when the high-valued reward is available) would lead to a different change in lever pressing than licking, assuming that the operandum served as the stimulus cue for the effort. Colloquially, induction might occur for lever pressing because the lever comes to serve as a stimulus for hard work that has been rewarded, thus leading subjects to press it more often. A similar change may not occur for licking because licking is not hard work.

Finally, the different changes in behavior may not be tied to the behaviors at all, but rather to differences in how the substances are delivered. In consummatory procedures, each response results in the obtainment of the substance. In induction procedures, the substance is available on an intermittent schedule of reinforcement and thus each response is not necessarily rewarded. The difference between observing contrast or induction may therefore be related to whether the substances are delivered at high (contrast) or low rates (induction). Intuitively, this explanation would seem to make sense. When the substances are delivered at high rates, increasing the amount of the low-valued substance that is obtained could inhibit the amount of the upcoming, high-valued substance that could be consumed. On the other hand, when the substances are delivered at low rates, increases in the rate of behavior will have little influence on how much of the low-valued substance is obtained, but could ensure that the maximum amount of the high-valued substance is obtained once it becomes available.

The present study was designed as an initial investigation into which of the above possibilities might account for the different changes in behavior observed in anticipatory contrast and positive induction procedures. Rats were reinforced on a RI schedule during 30-min sessions. The reward in the first half of the session was always 1% liquid sucrose. The reward in the second half of the session was either 1% or 32% liquid sucrose, depending on the condition. The rate of reinforcement in effect during the session varied across conditions from high (RI 7.5-s schedule) to low (RI 60-s schedule). Finally, two different groups of rats were employed for each experiment. One group licked to obtain the rewards whereas the other group lever pressed. This procedure should help elucidate the factors contributing to the observance of the different effects.

If observing contrast or induction depends upon whether the animal is engaging in consummatory or operant behavior, then the present procedure should always result in induction. That is, although one group of rats in each
experiment engaged in licking behavior, licking was reinforced on an intermittent schedule and the measure of behavior under study was the rate of licking on that schedule (i.e., rates of licking when subjects were actually consuming sucrose were excluded from the analyses). Alternatively, if the observance of contrast or induction depends upon the response the subjects were required to make (i.e., licking a spout or pressing a lever), then the present procedure should produce contrast in the group of rats that licked for their rewards and induction in the group that lever pressed for their rewards. Lastly, if the rate at which the substances are delivered contributes to the different changes in behavior, then one might expect to observe the different effects across the different rates of reinforcement. A contrast effect for both licking and lever pressing might be observed at high rates of reinforcement and an induction effect observed for both responses at low rates.

Experiment 1

In Experiment 1, rats either licked or lever pressed for liquid-sucrose reinforcers delivered during 30-min sessions. Across conditions, rewards were delivered by a RI 7.5-, 15-, 30-, or 60-s schedule. The size of the sucrose reward also varied across conditions so that the same amount of liquid was scheduled in every condition. Doing so potentially allowed for determining the influence of how often the sucrose was delivered independent of how much of it was available.

Method

Subjects

The subjects were 16 experimentally experienced male Sprague–Dawley rats. They were originally obtained from the Center for Biomedical Research on the campus of the University of North Dakota. Subjects were approximately one year of age at the inception of the experiment and all had experience lever pressing for sucrose delivered on interval schedules of reinforcement. Subjects were continuously maintained at approximately 85% of their free-feeding body weights by post-session feedings, when necessary, and by daily feedings on days in which sessions were not conducted. Because the subjects were experimentally experienced, their free-feeding weights had been established prior to the present experiment; those weights were maintained. Subjects were individually housed with water freely available in the home cage (only). They experienced a 12/12 h light/dark cycle, with lights on at 07:00. All sessions were conducted during the light portion of the cycle. Maintenance of the animals complied with the ethical guidelines for the care and use of animal subjects (National Research Council, 1996).

Apparatus

Two experimental chambers (Coulbourn Instruments) were used, each measuring 30.5 cm (L) × 25 cm (W) × 28.5 cm (H). The chambers were identical with the exception that one was equipped with response levers and the other with optical
lickometers. For the chamber with the response levers, two 3.5-cm (W) × 0.1-cm (H) response levers that extended 2.0 cm into the chamber were located on the front panel. The levers were positioned 6.5 cm above the grid floor and 2.5 cm from the left or right walls. Each lever could be depressed by a force of approximately 0.25 N. Five centimeters above each lever was a panel of three colored stimulus lights, each 0.6 cm in diameter. A yellow light was centered above the lever, with a red and green light located 0.6 cm to the left and right, respectively. Centered on the front panel, 2.0 cm above the floor, was a 3.25-cm (W) × 3.75-cm (H) × 2.5-cm (D) opening that allowed access to a feeding trough into which reinforcers were delivered. Liquid reinforcers were delivered to the trough via a syringe pump located outside the apparatus. A 1.5-cm diameter houselight that provided general illumination in the chamber was centered on the back wall of the chamber, 2.5 cm below the ceiling. For the chamber with the optical lickometers, a 3.0-cm (W) × 4.0-cm (H) × 3.6-cm (D) opening located 5.0 cm above the grid floor appeared on the front panel in place of the response levers. There was a 1.0-cm-diameter drinking spout located 2.0 cm into each opening. Directly in front of the spout, on both sides, was a photo cell that recorded each lick of the spout. The lickometers were capable of measuring up to 10 licks/s. Each spout was connected to a syringe pump located outside of the apparatus. A panel of three stimulus lights, identical to that described above, was located above each lickometer. Because liquid rewards were delivered directly to the spouts located in the lickometers, the trough used in the chamber with the response levers was not present in the chamber with the lickometers. Otherwise, the chambers were identical. Each chamber resided in a sound-attenuating cubicle, with a ventilation fan masking noise from the outside. Experimental events were programmed, and data were recorded, by an IBM-compatible computer that ran Graphic State Software and was connected to a Coulbourn Instruments Universal Linc. The experimental chamber and control equipment were located in adjacent rooms.

Procedure

Because subjects had experience drinking from spouts (in their home cages) and lever pressing for liquid-sucrose reinforcement, they were immediately placed on the experimental procedure. Experienced subjects were utilized, rather than obtaining naive subjects, because previous research on anticipatory contrast and positive induction has indicated that both effects can be produced in experienced subjects (e.g., King et al., 2002). The subjects were divided into two groups of eight rats each. One group lever pressed for sucrose reinforcement. The other group licked for sucrose reinforcement. Otherwise, both groups experienced the same procedure described below.

Subjects responded in sessions that were 30 min in length. During the first half of the session, they responded on the left lever or lickometer and the reward was always 1% liquid sucrose (v/v with tap water). During the second half of the session, they responded on the right lever or lickometer and the reward was either the same 1% sucrose or 32% sucrose, depending on the condition. During each half of the session, the red/left light above the lever or lickometer that was active was illuminated; the lights above the
inactive operandum were extinguished. Subjects could respond on the inactive operandum in the opposing half of the session, but such responses had no consequences. Across conditions, rewards were delivered by a RI 7.5-, 15-, 30-, or 60-s schedule. That is, reinforcers were programmed at a probability of 0.08, 0.04, 0.02, or 0.01 every 0.6 s, respectively, unless a reward had been programmed but not yet collected. In this circumstance, the inter-reinforcer interval did not advance until the programmed reinforcer had been collected. The size of the reward also varied with the schedule of reinforcement. The size of the reinforcer was 0.025, 0.05, 0.1, and 0.2 ml for the RI 7.5-, 15-, 30-, and 60-s schedules, respectively. Thus, regardless of reinforcement schedule, 3.0 ml of liquid reinforcement was scheduled per half (e.g., at RI 7.5 s, a 15-min half contains 900 s/7.5 s = 120 scheduled reinforcers of 0.025 ml each, or 3.0 ml). In all conditions, the same rate of reinforcement was in effect in both halves of the session. However, the schedules in each half were independent. Thus, if a reward from the first half of the session had been programmed, but not collected, at the midpoint of the session, it was canceled and the inter-reinforcer interval for the second half of the session was initiated. Neither the session timer nor the inter-reinforcer interval advanced during reinforcer delivery. (A schematic sketch of this scheduling logic follows Table 1 below.)

Thus, each group of subjects responded in a total of eight conditions, which are presented in Table 1 (i.e., a within-subject design). Half of the subjects in each group experienced the conditions in the order they are presented in Table 1. The remaining subjects experienced the conditions in the reverse order. Each condition was conducted for a total of 20 consecutive sessions, with sessions conducted once per day, 5–6 days per week.

Table 1
The order of conditions and the rate of reinforcement and type of reinforcer delivered in each half of the session in each condition of Experiment 1

Order   Reinforcement rate (s)   1st half     2nd half
1       RI 7.5                   1% sucrose   32% sucrose
2       RI 15                    1% sucrose   1% sucrose
3       RI 30                    1% sucrose   1% sucrose
4       RI 60                    1% sucrose   32% sucrose
5       RI 15                    1% sucrose   32% sucrose
6       RI 7.5                   1% sucrose   1% sucrose
7       RI 60                    1% sucrose   1% sucrose
8       RI 30                    1% sucrose   32% sucrose
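To make this scheduling logic concrete, the following is a minimal simulation sketch in Python. It is an illustrative reconstruction, not the authors' Graphic State control code: the 0.6-s tick, the per-tick set-up probabilities, the hold-until-collected rule, and the Experiment 1 reward sizes come from the Procedure above, whereas the response model (response_prob_per_tick) and all function and variable names are assumptions introduced here.

```python
import random

# Illustrative reconstruction (not the authors' control code) of the
# random-interval (RI) programming described in the Procedure: every 0.6-s
# tick a reinforcer is set up with a fixed probability, unless one is already
# waiting to be collected. Mean scheduled interval = tick / probability
# (e.g., 0.6 s / 0.08 = 7.5 s).

TICK_S = 0.6                                             # clock resolution from the text
SETUP_PROB = {7.5: 0.08, 15: 0.04, 30: 0.02, 60: 0.01}   # RI value (s) -> p per tick
REWARD_ML = {7.5: 0.025, 15: 0.05, 30: 0.1, 60: 0.2}     # Experiment 1 reward sizes


def simulate_half(ri_value, half_s=900, response_prob_per_tick=0.5, seed=0):
    """Simulate one 15-min (900-s) session half on an RI schedule.

    response_prob_per_tick is an arbitrary stand-in for the rat's behavior;
    it is not a parameter of the published procedure.
    """
    rng = random.Random(seed)
    p = SETUP_PROB[ri_value]
    reinforcer_waiting = False   # a reward programmed but not yet collected
    collected = 0
    for _ in range(int(half_s / TICK_S)):
        # The inter-reinforcer interval advances (and a new reward can be
        # set up) only while no programmed reward is waiting.
        if not reinforcer_waiting and rng.random() < p:
            reinforcer_waiting = True
        # A response collects any waiting reward; other responses do nothing.
        # (In the actual procedure, the session and interval timers also
        # paused during reinforcer delivery; that pause is not modeled here.)
        if reinforcer_waiting and rng.random() < response_prob_per_tick:
            reinforcer_waiting = False
            collected += 1
    return collected


if __name__ == "__main__":
    for ri in (7.5, 15, 30, 60):
        n = simulate_half(ri)
        print(f"RI {ri}-s: ~{n} rewards collected "
              f"({n * REWARD_ML[ri]:.2f} ml of 3.0 ml scheduled per half)")
```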

Results and discussion

The results of Experiment 1 were calculated using the final five sessions of each condition.1 Two subjects in the licking group died before completing the experiment.

1 Only the final five sessions were reported to ensure that conclusions were based on steady-state behavior rather than on behavior in transition (i.e., acquisition). Yet unpublished research from our laboratory indicates that positive induction is acquired over successive sessions and is relatively stable at 15 sessions. Research on anticipatory contrast employing the substances used in the present procedure (e.g., see King et al., 2002) indicates that contrast appears early in training and is maintained across many sessions.

Fig. 1. Rates of responding (in response/min plotted on a logarithmic ordinate) are presented across successive 5-min intervals in the session for the mean of all subjects in Experiment 1 that were rewarded for lever pressing. Each graph presents the results for a different rate of reinforcement. Each function represents results from the conditions in which 1% sucrose was delivered in both halves of the session (1–1%) or in which 1% and 32% sucrose were delivered in the first and second half, respectively. Error bars represent one standard error of the mean across subjects responding in that particular 5-min interval.

Data from these subjects were not included in the figure or the analyses presented below (i.e., n = 6 for the licking group). All statistical analyses in this paper were considered significant at p < .05.

Fig. 1 presents the results from the group that lever pressed for reinforcement. Presented are the rates of responding across successive 5-min intervals of the 30-min session. Each graph presents the results for a different rate of reinforcement. Each function represents the mean rate of responding for all subjects in the condition in which 1% sucrose was the reward in both halves of the session (closed squares) or in which 1% and 32% sucrose was the reward in the first and second half, respectively, of the session (open squares). Response rates are plotted on a logarithmic
ordinate so that differences in responding at low rates of responding are visually apparent. The error bars represent the standard error of the mean calculated across subjects responding in that particular 5-min interval.

The data in Fig. 1 indicate that positive induction was observed for subjects who lever pressed. Specifically, rates of lever pressing in the first half of the session (i.e., the first three 5-min intervals) were higher when 32% sucrose would serve as the reward in the second half of the session than when 1% sucrose would serve as the reward in the second half. Results from statistical analyses were consistent with this conclusion. Rates of lever pressing by individual subjects in the first half of the session in the different conditions were analyzed by conducting a three-way (reinforcement rate by upcoming reward type by 5-min interval) repeated measures analysis of variance (ANOVA). Results showed that the main effect of reinforcement rate was significant (F(3, 21) = 4.07), indicating that subjects' rates of lever pressing varied inversely with rate of reinforcement. The main effect of upcoming reward type was also significant (F(1, 7) = 35.82), indicating that response rates were higher when 32% sucrose was upcoming than when 1% sucrose was upcoming. Finally, the interaction between reinforcement rate and 5-min interval was significant (F(6, 42) = 2.66). Follow-up analyses indicated that responding changed across successive 5-min intervals in the first half of the session in the RI 7.5-s and RI 30-s conditions, but not during the RI 15-s or RI 60-s conditions. No other effects in the omnibus analysis were significant.

Fig. 2 presents the results from the group who licked. It was constructed as was Fig. 1. Again, it presents only licks that served as operant responses (i.e., it does not include those licks that occurred during the consumption of the sucrose). It shows that, when subjects responded on the RI 7.5-s schedule, upcoming 32% sucrose appeared to produce a contrast effect in the first half of the session. However, when the rate of reinforcement was decreased to a RI 15-s schedule, an apparent induction effect was observed. Upcoming 32% sucrose appeared to have little or no effect on licking on the RI 30- and 60-s schedules.2

2 It is worthy of note that lick rates in the second half of the 1–1% sessions were far lower than those observed in the first half. Although it is not known exactly why this decrease occurred, similar (albeit smaller) decreases in responding have been reported in studies on anticipatory contrast that have used 1% sucrose (Flaherty, Turovsky, & Krauss, 1994). Furthermore, given that the lick rates in Fig. 2 represent operant responses, such decreases in responding may be consistent with the large literature on within-session changes in responding (e.g., see McSweeney, Hinson, & Cannon, 1996).

Fig. 2. Rates of responding (in response/min plotted on a logarithmic ordinate) are presented across successive 5-min intervals in the session for the mean of all subjects in Experiment 1 that were rewarded for licking. Each graph presents the results for a different rate of reinforcement. Each function represents results from the conditions in which 1% sucrose was delivered in both halves of the session (1–1%) or in which 1% and 32% sucrose were delivered in the first and second half, respectively. Error bars represent one standard error of the mean across subjects responding in that particular 5-min interval.

Lick rates in the first half of the session were also analyzed by conducting a three-way (reinforcement rate by upcoming reward type by 5-min interval) repeated measures ANOVA. The only significant effect in this analysis was the interaction between rate of reinforcement and 5-min interval (F(6, 30) = 2.55), indicating that lick rates changed across the first half of the session differently at the different rates of reinforcement. Follow-up analyses indicated that lick rates changed significantly in the first half of the session when subjects responded on the RI 30-s schedule, but not when they responded on the RI 7.5-, 15-, or 60-s schedules. Also, a follow-up
two-way (upcoming reward type by 5-min interval) repeated measures ANOVA was conducted to determine whether the apparent contrast effect in the RI 7.5-s conditions was significant. No effects were significant.

The results of Experiment 1 tentatively indicate that the type of response may determine which effect is observed. Subjects who licked displayed neither contrast nor induction. Subjects who lever pressed, however, always displayed induction. Because operant responding was measured for both groups and because rate of substance availability varied across conditions similarly for both groups, type of response would appear to be the remaining explanation for the difference between groups.
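For readers who want a concrete template for this style of analysis, a three-way repeated measures ANOVA of the kind reported above could be set up roughly as follows. This is a sketch only; the article does not state what software was used, and the file name and column names (rat, ri_schedule, upcoming, interval, rate) are hypothetical stand-ins for a long-format table holding one mean first-half response rate per rat per condition per 5-min interval.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: one row per rat x RI schedule x upcoming
# reward type x 5-min interval, with the mean response rate from the final
# five sessions of that condition as the dependent variable.
data = pd.read_csv("exp1_first_half_rates.csv")  # illustrative file name

# Three-way (reinforcement rate by upcoming reward type by 5-min interval)
# repeated measures ANOVA on first-half response rates; effects would then
# be evaluated against the article's p < .05 criterion.
res = AnovaRM(
    data,
    depvar="rate",
    subject="rat",
    within=["ri_schedule", "upcoming", "interval"],
).fit()
print(res)
```

Note that AnovaRM expects a fully balanced design with one observation per rat per cell, which is why a per-condition mean (rather than session-by-session rates) is assumed above.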


One could argue, however, that the results of Experiment 1 do not really allow for an adequate test of substance availability because reward size covaried with reward rate so that the same amount of sucrose was available in every condition. Thus, it is possible that the observance of induction at every rate of reinforcement for the rats who lever pressed occurred because they were receiving similar amounts of sucrose across the different conditions. The absence of significant effects when rats licked may also have occurred because the total amount of sucrose that could be obtained per half was 3 ml. If anticipatory contrast occurs because it allows animals to ‘‘save room’’ for the upcoming substance, then Experiment 1 provided little opportunity for this mechanism to influence behavior because there was little difference across the different rates of reinforcement in terms of how much of the upcoming substance would be consumed.

Experiment 2

Experiment 2 replicated the procedure of Experiment 1 with the exception that reward size was held constant at 0.2 ml per reward across the four different rates of reinforcement. Thus, 24 ml of sucrose was programmed per half at the highest rate of reinforcement while 3 ml was programmed per half at the lowest rate. If the amount of sucrose available for consumption influences the observance of contrast or induction, then different results should appear across the different conditions (and perhaps different results from those observed in Experiment 1). On the other hand, if a difference truly exists between the responses of licking and lever pressing, then results similar to those in Experiment 1 should appear despite changes in the amount of sucrose that is available.

Method

Subjects and apparatus

The subjects were 12 experimentally naive male Sprague–Dawley rats. They were obtained from the same source as those in Experiment 1. The subjects were approximately four months of age at the inception of the experiment. Subjects had been given free access to food prior to the experiment. Their free-feeding weights were determined by a single weighing prior to the experiment. They were then gradually reduced to 85% of their free-feeding weight. Subjects were then allowed a weight gain of 5 g per week until they reached 6 months of age. Otherwise, subjects were housed, maintained, and cared for as were those in Experiment 1. Experiment 2 utilized the same two pieces of apparatus used in Experiment 1.

Procedure

Subjects were divided into two groups of six rats each. The original group size was six, rather than eight as used in Experiment 1, because the rats were younger than those in Experiment 1 and thus the threat of mortality was not as high. However, one rat in the lever pressing group died before completing the experiment. Thus, the lever pressing group in Experiment 2 had five rats and the licking group had six.


Subjects experienced the identical procedure as subjects in Experiment 1 with one exception. Reward size in Experiment 2 was 0.2 ml in each condition. Thus, 24, 12, 6, and 3 ml of sucrose was programmed per half in the RI 7.5-, 15-, 30-, and 60-s schedule conditions, respectively. As in Experiment 1, three subjects experienced the conditions in the order presented in Table 1. The remaining subjects experienced the reverse order.

Results and discussion

The results for Experiment 2 were again calculated using the final five sessions of each condition. Fig. 3 presents the results for the group that lever pressed. It was constructed as was Fig. 1 and presents similar results. Upcoming 32% sucrose increased rates of lever pressing for 1% sucrose in the first half of the session compared to when continued 1% sucrose was upcoming. Rates of lever pressing by individual subjects were analyzed by conducting a three-way (reinforcement rate by upcoming reward type by 5-min interval) repeated measures ANOVA. In this analysis, the only effect to reach significance was the main effect of upcoming reward type (F(1, 4) = 32.54), indicating that rates of lever pressing were higher when 32% sucrose was upcoming than when it was not.

Fig. 4 presents the results for the group that licked. It was constructed as was Fig. 2 (i.e., it only presents operant lick rates). As can be seen in Fig. 4, upcoming 32% sucrose produced a contrast effect at each different rate of reinforcement. A three-way (reinforcement rate by upcoming reward type by 5-min interval) repeated measures ANOVA, conducted on the lick rates in the first half of the session, produced a significant main effect of reinforcement rate (F(3, 15) = 4.44), main effect of upcoming reward type (F(1, 5) = 10.99), main effect of 5-min interval (F(2, 10) = 5.65), interaction between reinforcement rate and upcoming reward type (F(3, 15) = 4.81), and interaction between upcoming reward type and 5-min interval (F(2, 10) = 4.24). The remaining effects were not significant.

Because both significant interactions involved upcoming reward type, follow-up analyses were conducted separately on responding in the 1–1% and 1–32% conditions. Analysis of lick rates in the first half of the 1–1% conditions indicated that lick rates increased with decreases in reinforcement rate (F(3, 15) = 4.67) and decreased across the first half of the session (F(2, 10) = 4.97). The interaction between the two was not significant. Analysis of lick rates in the 1–32% conditions produced a significant interaction between rate of reinforcement and 5-min interval (F(6, 30) = 3.48). Tests for simple effects indicated that lick rates decreased significantly across the first half of the session at each rate of reinforcement (all Fs(2, 10) ≥ 4.12).

The results of Experiment 2 question the idea that the available amount of the substance contributes to the presence of contrast or induction. Despite similar changes across conditions in the amount of sucrose that was available, subjects who lever pressed for 1% sucrose increased their rate of responding when 32% sucrose was upcoming while subjects who licked for 1% sucrose decreased their rate of responding. The results support the idea that the presence of contrast or induction is associated with the type of response the animal has to make to obtain the sucrose.


Fig. 3. Rates of responding (in response/min plotted on a logarithmic ordinate) are presented across successive 5-min intervals in the session for the mean of all subjects in Experiment 2 that were rewarded for lever pressing. Each graph presents the results for a different rate of reinforcement. Each function represents results from the conditions in which 1% sucrose was delivered in both halves of the session (1–1%) or in which 1% and 32% sucrose were delivered in the first and second half, respectively. Error bars represent one standard error of the mean across subjects responding in that particular 5-min interval.

Fig. 4. Rates of responding (in response/min plotted on a logarithmic ordinate) are presented across successive 5-min intervals in the session for the mean of all subjects in Experiment 2 that were rewarded for licking. Each graph presents the results for a different rate of reinforcement. Each function represents results from the conditions in which 1% sucrose was delivered in both halves of the session (1–1%) or in which 1% and 32% sucrose were delivered in the first and second half, respectively. Error bars represent one standard error of the mean across subjects responding in that particular 5-min interval.

General discussion

The present study was designed to investigate why rats sometimes display a decrease in their rate of behavior (negative contrast) and other times display an increase (positive induction) when a high-valued substance will soon be available. The possibilities were investigated that the function of the behavior (consummatory vs. operant), the type of response (licking vs. lever pressing), and/or the rate at which the substances were delivered (high vs. low) contributed to which effect was observed. Results demonstrated that upcoming 32% sucrose always produced induction when subjects lever pressed for 1% sucrose. Upcoming 32% sucrose did not produce induction when subjects were rewarded for licking. In fact, it produced contrast in Experiment 2. The implication of these results would seem to be that the type of response plays a primary role in whether contrast or induction is observed.

As noted in the introduction, the observance of contrast or induction could potentially be linked to whether subjects were engaging in consummatory or operant behavior. However, all analyses in the present study were conducted on operant responding (i.e., not involved in the actual consumption of the sucrose). Thus, finding different results between groups would seem to rule out the function of the behavior as the differentiating factor. However, it is possible that induction was not
observed for licking because the operant response too closely resembled the consummatory response for this group (i.e., subjects consumed the sucrose rewards from the same spout they licked to produce them). The fact that the size of contrast for the licking group in Experiment 2 (see Fig. 4) decreased with decreases in reinforcement rate (and thus the number of ‘‘consummatory’’ licks made per session) may lend support to this idea.

It was also noted that how frequently the substances were made available could play a role in whether contrast or induction was observed. Despite altering the availability of sucrose across conditions similarly for both groups, the groups behaved differently. Before substance availability is ruled out completely, however, it should be noted that all four rates of reinforcement in the present study were intermittent schedules. It remains possible that anticipatory contrast is promoted by the use of a continuous schedule of reinforcement.

The present data seem to point toward the type of response as the determining factor for whether contrast or induction is observed. The question that remains is what theoretical mechanism could explain this outcome? As noted in the introduction, one possibility is that these responses differ greatly in terms of the rate at which the animal can engage in them. Rats can lick at very high rates, which may make decreases in responding more likely than increases. Rates of lever pressing for 1% sucrose usually occur at low rates, which may make increases in responding more likely than decreases. The present results, however, seem to question this potential explanation. Fig. 4 shows that subjects were capable of maintaining lick rates in excess of 100 licks per min for 15 consecutive min when responding for 32% sucrose. Lick rates for the 1% sucrose were always much lower than that, arguably indicating that although induction was not observed, it was possible (i.e., subjects could have licked at a higher rate than they did).

Another possibility is that the responses differ in the amount of effort they require. Recent research (Clement et al., 2000; Kacelnik & Marsh, 2002) suggests that animals learn to prefer stimuli that have been associated with increased levels of effort. The induction in Figs. 1 and 3 may have occurred because, when subjects increased their rates of lever pressing for 32% sucrose, their overall ‘‘effort’’ also increased. If the lever itself served as the stimulus cue for this increased effort, then subjects may have pressed it at an increased rate the next time they were in its presence (i.e., at the beginning of the next session). The same effect may not have occurred for licking because high rates of licking may not be overly effortful.

Although the present data cannot confirm or disconfirm this particular explanation, several pieces of evidence exist to begin its evaluation. In support of this idea is the finding that induction was present at the very beginning of the session in sessions in which 32% sucrose was upcoming. Furthermore, the size of the induction was relatively constant across the first half of the session. These results suggest that the induction did not occur because subjects were ‘‘timing’’ the availability of the upcoming 32% sucrose. Rather, they suggest that the cause of the induction was present at the very beginning of the session. However, the induction itself might question this explanation. Subjects responded on different levers in the different halves of the session. Thus, it is not clear why the lever used in the first half of
the session would become the stimulus cue for increased work when the bulk of the work was done on the other lever (although generalization may have occurred). In addition, this explanation would not explain why anticipatory contrast was observed for the licking group in Experiment 2.

A third potential difference that may have contributed to the different findings for licking and lever pressing is the difference in response-reinforcer contiguity. That is, because the reinforcers for the licking groups were delivered directly to the spout, the time between the reinforced response and the consumption of the sucrose was very short. Subjects in the lever pressing group experienced a long delay, relative to the licking group, between the reinforced response and consuming the sucrose. Although it would be difficult to shorten this delay for the lever pressing group, it may be possible to lengthen the delay in the licking group in future studies. If response-reinforcer contiguity is important, then doing so may promote the appearance of induction when subjects lick.

Future studies will also want to investigate what the induction effect actually represents. That is, the decreases in lick rates observed in studies of anticipatory contrast represent decreases in consumption. However, because in studies of positive induction reinforcers are delivered by an intermittent schedule of reinforcement, increases in rates of lever pressing do not necessarily result in increased delivery of the reinforcer. But there are several potential reasons why this increase may occur. For instance, through the pairing of 1% sucrose with upcoming 32% sucrose reinforcement, it is possible that induction occurs because the incentive value of the 1% sucrose has been increased. Likewise, because the experimental context gets paired with the 32% sucrose, the context may gain stimulus control and cause a general increase in behavior (including lever pressing). Finally, it is possible that induction is the outcome of an increase in the probability of operant responding. Recent research (Weatherly, Plumm, Smith, & Roberts, 2002) would appear to favor the latter explanation. However, the empirical data necessary to parse apart these potential explanations do not yet exist.

Acknowledgments

The authors thank Brent M. King for his help in data collection. Portions of this material were supported by the National Science Foundation under Grant No. 0132289.

References

Clement, T. S., Feltus, J. R., Kaiser, D. H., & Zentall, T. R. (2000). ‘Work ethic’ in pigeons: Reward value is directly related to the effort or time required to obtain the reward. Psychonomic Bulletin & Review, 7, 100–106.
Flaherty, C. F. (1996). Incentive relativity. Cambridge, UK: Cambridge University Press.
Flaherty, C. F., & Checke, S. (1982). Anticipation of incentive gain. Animal Learning & Behavior, 10, 177–182.
Flaherty, C. F., Turovsky, J., & Krauss, K. L. (1994). Relative hedonic value modulates anticipatory contrast. Physiology & Behavior, 55, 1047–1054.
Heinrichs, S. C. (2001). Mouse feeding behavior: Ethology, regulatory mechanisms and utility for mutant phenotyping. Behavioural Brain Research, 125, 81–88.
Kacelnik, A., & Marsh, B. (2002). Cost can increase preference in starlings. Animal Behaviour, 63, 245–250.
King, B. M., Brandt, A. E., & Weatherly, J. N. (2002). Up or down: The influence of upcoming reinforcement on consummatory and operant behavior. Journal of General Psychology, 129, 443–461.
McSweeney, F. K., Hinson, J. M., & Cannon, C. B. (1996). Sensitization-habituation may occur during operant conditioning. Psychological Bulletin, 120, 256–271.
National Research Council (1996). Guide for the care and use of laboratory animals. Washington, DC: National Academy Press.
Weatherly, J. N., & Moulton, P. L. (2001). The effect of food-pellet reinforcement on rats' rates of lever pressing for 1% sucrose reinforcers across several ‘‘contrast’’ procedures. Learning and Motivation, 32, 193–218.
Weatherly, J. N., Plumm, K. M., Smith, J. R., & Roberts, W. A. (2002). On the determinants of induction in responding for sucrose when food pellet reinforcement is upcoming. Animal Learning & Behavior, 30, 315–329.
Weatherly, J. N., Stout, J. E., McMurry, A. S., Rue, H. C., & Melville, C. L. (1999). Within-session responding when different reinforcers are delivered in each half of the session. Behavioural Processes, 46, 227–243.