Journal of Experimental Social Psychology 40 (2004) 606–618 www.elsevier.com/locate/jesp
From thinking about what might have been to sharing what we know: The effects of counterfactual mind-sets on information sharing in groups

Adam D. Galinsky (Department of Management and Organizations, Kellogg School of Management, Northwestern University, Leverone Hall, 2001 Sheridan Road, Evanston, IL 60208, USA) and Laura J. Kray (Organizational Behavior and Industrial Relations Group, Haas School of Business, University of California at Berkeley, Berkeley, CA, USA)

Received 19 April 2002; revised 28 October 2003; available online 21 January 2004. doi:10.1016/j.jesp.2003.11.005

Author note: This research was supported in part by a grant from the National Science Foundation (SES-0233294 and SES-0136931 awarded to both authors). Corresponding author: Adam D. Galinsky ([email protected]); Laura J. Kray ([email protected]).
Abstract

We hypothesized that the activation of a counterfactual mind-set minimizes group decision errors caused by the failure of groups to discuss unshared, uniquely held information. In two experiments, we manipulated the salience of counterfactual thoughts in a pre-task scenario and then had groups of three individuals discuss a murder mystery case. In both experiments, counterfactual mind-sets increased the discussion of unshared information and helped groups to identify the correct murder suspect. These results emerged regardless of whether the direction of the counterfactual thoughts was upward (Experiment 1) or downward (Experiment 2), suggesting that it is the process of thinking counterfactually, and not the content of the counterfactuals, that improves group decision making.
© 2003 Elsevier Inc. All rights reserved.
Complex decisions, from the economic (e.g., whether and how much to raise interest rates) to the legal (e.g., whether to indict or convict an individual) to the military (e.g., when and where to launch an attack) are often made by groups. As these examples suggest, such decisions can have far-reaching consequences, affecting such important outcomes as economic prosperity, incarceration, and even mortality. The explosion of the space shuttle Challenger, a mere 73 s after its liftoff, is one poignant example of group decision making gone awry. The Presidential Commission investigating the accident reported that inadequate sharing of information was an important contributor to the disaster. Although data were available showing that low temperatures could cause malfunctions in the shuttle, this information was not widely disseminated (Report of the Presidential
[email protected] (A.D. Galinsky),
[email protected] (L.J. Kray).
0022-1031/$ - see front matter Ó 2003 Elsevier Inc. All rights reserved. doi:10.1016/j.jesp.2003.11.005
Commission on the Space Shuttle Challenger Accident, 1986). Key decision makers were never made aware of the information, so they decided to proceed with the tragic launch, despite an air temperature that dipped below freezing that morning. As these examples suggest, group members must find ways to gather relevant information from each other in an efficient manner. Unfortunately, group discussions are characterized by the tendency to focus on shared rather than unshared information—groups tend to focus on what everyone knows rather than on what only some members know (Larson, Foster-Fishman, & Keys, 1994; Stasser & Stewart, 1992; Stasser & Titus, 1985; Winquist & Larson, 1998). This tendency means that group decisions are often biased in the direction of shared information. In this paper, we suggest that thinking about what might have been (counterfactual thoughts) can help to solve this problem. Counterfactual thinking can increase the sharing and discussion of unshared information and ultimately improve decision accuracy. The following experiments extend theory and research in several important ways. We explore whether
cognitive orientations, or mind-sets, that are activated in one context can affect group decisions in a later, unrelated context. Previous research has provided suggestive evidence that counterfactual thinking increases the tendency for individuals to consider alternatives and engage in mental simulation during subsequent decision making (Galinsky & Moskowitz, 2000), thereby improving decision accuracy. However, this research did not measure the construction of counterfactual thoughts. We therefore wondered whether it was the amount of counterfactual activation that predicts decision accuracy and thus serves as the source of debiasing. Another goal of our experiments was to extend previous research on counterfactual thinking to the group domain. Because counterfactual mind-sets increase the consideration of alternatives, they may well lead group members to share unique information, which should have a debiasing effect on group decisions. Although some research has shown that the framing of group decisions can affect decision accuracy (Stasser & Stewart, 1992), few researchers have explored whether cognitive orientations can be activated prior to and independent of group decision making. Finally, we sought to demonstrate that it is the process of thinking counterfactually and not the content or direction of counterfactual thoughts that affects information sharing during group decision making.
Increasing information sharing

What accounts for biased information sampling in groups? Structural, cognitive, and motivational factors all appear to contribute to the bias. Based simply on probability sampling, items that are known to more group members are statistically more likely to be mentioned during group discussion. Because shared information comprises a larger proportion of the total information available to group members, they may produce a tentative hypothesis that is consistent with that information. Given that groups are prone to test hypotheses in ways that are likely to confirm them (Schulz-Hardt, Frey, Lüthgens, & Moscovici, 2000), unshared information is thus unlikely to be considered. Even when unshared information is considered, however, shared information still receives greater attention (Larson et al., 1994), suggesting an additional, motivational basis for biased information sampling. Consider groupthink (Janis, 1982), a biased decision making process in which pressures toward consensus override other concerns including the quality of the decision. Groupthink can lead individuals to remain silent regarding their unique, privately held information.
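To make the probability-sampling argument concrete, the short sketch below compares the chance that a clue enters discussion when three members hold it versus when only one does. It is an illustration only; the per-member mention probabilities are assumed values, not estimates from these data.

```python
# Minimal sketch of the probability-sampling argument: if each member who
# holds a clue independently mentions it with probability p, a clue held by
# k members is raised with probability 1 - (1 - p) ** k.  The values of p
# below are assumed for illustration, not estimated from the experiments.

def p_discussed(p_mention: float, holders: int) -> float:
    """Probability that at least one of the holders mentions the clue."""
    return 1 - (1 - p_mention) ** holders

for p in (0.3, 0.5):
    shared = p_discussed(p, holders=3)    # clue known to all three members
    unshared = p_discussed(p, holders=1)  # clue known to a single member
    print(f"p = {p:.1f}: shared clue {shared:.2f} vs. unshared clue {unshared:.2f}")
```

Even when every member is equally willing to speak, shared clues dominate simply because more people can introduce them, which is the structural component of the bias described above.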
How can groups be led to discuss more unshared information? Although modifying the structure or context of group discussions might seem to be a useful tactic, many such modifications have proven to have weak or no effects, and some of them have ironically made the bias even worse. For example, separating the discussion of information from the final decision fails to weaken the bias (Stasser, Taylor, & Hanna, 1989). And making a decision seem more important has a counterproductive effect, slowing the rate of information dissemination in groups (Larson et al., 1994). Finally, increasing accountability by requiring a group to justify its decision to someone else actually increases the focus on shared information (Stewart, Billings, & Stasser, 1998). These tactics may be ineffective because they strengthen the motivational basis for biased information sampling by creating pressures toward consensus. Activating cognitive mind-sets that redirect the focus of group discussion to critical, unshared information might be a better tactic. The norms that govern group interaction, and the framing of group tasks, have both been shown to affect the discussion of unique information. For example, Postmes, Spears, and Cihangir (2001) found that creating group norms (in a prior context) that promoted critical thinking and debate, rather than group cohesion, led to greater discussion of unshared information and more decision accuracy. And Stasser and Stewart (1992) found that framing a decision as a problem that can be solved, rather than just a matter of opinion, improved information sharing during group discussion. Stasser and Stewart argued that a problem-solving frame leads groups to focus on critical pieces of information, rather than seeking consensus. Given the powerful effect that frames and norms can have on directing the flow of information, we wondered whether there are other mind-sets that might weaken the tendency for groups to focus on shared information, and if so, whether these mind-sets can be activated prior to and thus independent of the subsequent decision-making context.
Counterfactual mind-sets

Counterfactuals are thoughts about what might have been—they represent alternative realities for past events. Counterfactual thoughts are often characterized by expressions of "if only..." (Roese, 1994). Through counterfactuals, people reconstruct the past. Counterfactual thoughts are often activated when an event nearly occurred (Kahneman & Varey, 1990; Miller & McFarland, 1986) or when antecedents of that event were exceptional in some way (Kahneman & Miller, 1986; Kahneman & Tversky, 1982). Thoughts about what might have been can also have an influence on future behavior (Galinsky, Seiden, Kim, & Medvec, 2002; Roese, 1994). For example, Roese (1994) found that asking people to construct
counterfactual thoughts after one anagram task led to better performance on a subsequent anagram task. Performance improved because counterfactual thinking led people to specify the necessary conditions for avoiding previous errors. Asking people to think about what might have been in one context can even affect subsequent decision making and problem solving in an unrelated context. Galinsky and Moskowitz (2000) argued that a salient counterfactual raises awareness of multiple options which can lead people to make better decisions. They tested this claim using a hypothesis-testing paradigm. Many researchers have found that hypothesis testers display a confirmation bias—they tend to seek evidence that confirms their hypotheses and neglect evidence that disconfirms them (Pyszczynski & Greenberg, 1987; Snyder & Swann, 1978), paying little attention to alternative hypotheses (Trope & Liberman, 1996). In Galinsky and Moskowitz's research, exposure to a counterfactual in an earlier, unrelated context led people to ask more hypothesis-disconfirming questions, presumably by increasing the accessibility of alternative hypotheses.

Our notion of a counterfactual mind-set is closely related to the simulation heuristic (Kahneman & Tversky, 1982). Kahneman and Tversky originally discussed counterfactual thinking and mental simulation in terms of the availability heuristic. According to their formulation of that heuristic (Tversky & Kahneman, 1973), two kinds of mental operations can bring thoughts to mind: the retrieval of instances from memory and the mental construction of scenarios or examples. They named the latter process the "simulation heuristic," because complex questions are asked and answered about both future and past events by running a mental simulation or model. One could, through the process of mental simulation, assess the probability that a particular plan will succeed, evaluate alternatives, and identify the various risks involved in a course of action. People do not tend to spontaneously generate alternatives, but instructions to generate one alternative often lead to the spontaneous generation of other alternatives (Hirt & Markman, 1995). Once activated, the simulation heuristic thus helps people to consider alternative possibilities that they might otherwise neglect.
Counterfactual mind-sets and group discussion

Stasser and Stewart (1992) demonstrated that how decisions are framed affects the cognitive orientations, or mind-sets, of group members towards a task. We believe that counterfactual mind-sets, like group norms (Postmes et al., 2001), can be activated prior to and independent of a group decision-making session. Be-
cause priming counterfactual thinking promotes an enduring cognitive orientation (Galinsky & Moskowitz, 2000), we expect counterfactual activation to improve group decision accuracy, even when the decision-making context is functionally unrelated to the context in which counterfactual thoughts were activated. The fact that counterfactual mind-sets can affect individual decision making does not necessarily mean that they will have parallel effects on group decision making. In fact, manipulations that affect individual decision making sometimes have no effect on group decision making, or even have effects opposite to those that they have on individual decision making. For example, accountability manipulations seem to have helpful effects on individual decision making (Tetlock, 1992), but harmful effects on group decision making (see Stewart et al., 1989). And creating group norms in one context has more impact when the same group engages in discussion and makes a decision than when each group member makes a similar decision individually (Postmes et al., 2001). Finally, increased attentional demands during group discussion might limit the ability of counterfactual mind-sets to affect decision making in groups. Many cognitive processes are disabled when concurrent tasks and attentional demands diminish cognitive capacity (Gilbert, 1989; Martin, 1986). Therefore, it is important to explore the effects of counterfactual mind-sets on group as opposed to individual decision making. Both of our experiments involved a group decision making task in which the available information was dispersed among group members, making them highly interdependent. Groups were not told that individual members possessed unique information. In this way, the spontaneous processes of searching for and sharing information among group members under counterfactual and non-counterfactual mind-set conditions could be examined.1 Because all groups had the same information, it was possible to see how counterfactual mind-sets affected group discussions and decision accuracy. A focus on shared information during group discussions is particularly destructive when there is a hidden profile. A hidden profile occurs when the best decision alternative cannot be identified unless unshared information is considered (Stasser, 1999; Stasser & Stewart, 1992). If counterfactual mind-sets raise awareness about alternative realities, then they should help groups deal with hidden profiles by increasing the discussion of
1 Stasser, Stewart, and Wittenbaum (1995) manipulated whether participants were or were not warned that some of them might have unique information and found no effect of this manipulation on decision making.
unshared information and thereby improving the accuracy of group decisions.
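For readers unfamiliar with hidden profiles, the toy example below illustrates the logic. The clue counts are invented for illustration and do not reproduce the actual case materials used in the experiments.

```python
# Toy hidden profile with invented clue counts (not the actual case materials).
# A suspect's guilt score is the number of incriminating clues minus the
# number of exonerating clues that a given reader can see.

shared = {"A": {"incriminating": 3, "exonerating": 0},
          "B": {"incriminating": 1, "exonerating": 0}}

# Each member also privately holds one clue exonerating Suspect A.
private = [{"A": {"incriminating": 0, "exonerating": 1}},
           {"A": {"incriminating": 0, "exonerating": 1}},
           {"A": {"incriminating": 0, "exonerating": 1}}]

def guilt_scores(clue_sets):
    """Combine several clue dictionaries into one guilt score per suspect."""
    scores = {}
    for clues in clue_sets:
        for suspect, counts in clues.items():
            scores[suspect] = scores.get(suspect, 0) + \
                counts["incriminating"] - counts["exonerating"]
    return scores

for i, own_clues in enumerate(private, start=1):
    print(f"Member {i} alone:", guilt_scores([shared, own_clues]))  # A looks guilty
print("Pooled discussion:", guilt_scores([shared] + private))       # B emerges
```

Unless the privately held exonerating clues are actually voiced during discussion, each member's own evidence points to the wrong suspect, which is exactly the trap the murder mystery task sets.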
Experiment 1

Method

Participants

Participants were 90 MBA students enrolled in an introductory organizational behavior course. The classes were comprised of approximately 70% men. The group decision was part of a classroom exercise conducted during the third week of the (15-week) course. Each group contained three persons who were working together for the first time. Students were randomly assigned to groups and groups were randomly assigned to conditions.

Decision task

We used the decision task described in Stasser and Stewart (1992). Participants read a series of interviews from a homicide investigation. These interviews were presented in a booklet that included other supporting materials, such as a map and a newspaper article. The materials contained clues that were either incriminating or exonerating for each of three male suspects (E, B, and M). The number of incriminating clues for suspects E, B, and M was the same, but the number of exonerating clues differed. There were two exonerating clues for Suspect M, three exonerating clues for Suspect B, and no exonerating clues for Suspect E. A fourth suspect (G), the victim's wife, was included to increase the difficulty of the task. Because the victim was involved in an extra-marital affair, participants were often suspicious of G, even though there was no direct incriminating information about her. If all the evidence were considered, then it should be clear that Suspect E had both the motive and the opportunity to commit the crime, and that he attempted to frame Suspect B. Two exonerating clues about Suspect M, three exonerating clues about Suspect B, and one incriminating clue about Suspect E were critical for identifying Suspect E as the guilty party. To create a hidden profile, we distributed these clues so that they were unshared. In each group, one member received two clues that exonerated Suspect M, another member received one clue that exonerated Suspect B and one clue that incriminated Suspect E, and the third member received two clues that exonerated Suspect B. Collectively, group members had all of the necessary information to solve the crime.

Procedure

At the beginning of the class session, each participant was handed a packet. Participants were told that they would be engaging in a three-person decision-making task.2

2 During most class sessions, students engaged in a classroom exercise and then heard a debriefing in which relevant course concepts were discussed. So, no special cover story was needed or provided to participants, other than that they would be engaging in a group decision-making exercise.
Instructions and a group identification number were presented on the first page of the packet. This number corresponded to a room in which each person would meet with the other group members to conduct a discussion and make a decision. Before the groups met, participants were given 20 min to read their booklets and take notes, which they could bring to the group meeting. They were advised to read the materials carefully, because they could not bring the booklet to that meeting. After 20 min passed, participants were asked to (individually) select "the one suspect you believe murdered Robert Guion" and to provide a brief justification for their choices. We included this individual measure to ensure that there were no pre-discussion differences in choice across groups. After going to its meeting room, each group was given a single, group decision booklet.

Counterfactual prime

Before beginning the decision task, groups were told to spend 5 min on a "team-building exercise," which provided the basis for the counterfactual prime manipulation. Participants in the counterfactual prime condition (n = 16 groups) read a brief story about a woman (Sue) who was attending a rock concert of her favorite band. Because her ticketed seat was not very close to the stage, Sue moved to a vacant seat in the third row. Shortly thereafter, the emcee announced that a valuable prize would be awarded to one lucky winner. The emcee then reached into a bin filled with ticket stubs and announced that the winner was the person who currently occupied Sue's old seat. This story has been shown by other researchers to activate upward counterfactual thoughts (Galinsky & Moskowitz, 2000)—people are likely to think "if only Sue had not moved, she would have won." In the non-counterfactual prime condition (n = 14 groups), a similar story was read, except that Sue did not switch seats and the lucky winner was someone in a different seat altogether. All group members read the story together and were then asked as a group to "list some thoughts running through Sue's mind." After completing this team-building exercise, group members were instructed to spend up to 20 min discussing the murder case and making a group decision. They were told to "Please discuss the evidence and, as a group, choose the one suspect you believe committed the murder." All groups used the full 20 min, so discussion time was constant across groups.

Discussion of unshared and shared clues

When participants returned to the classroom, they completed a questionnaire that asked them to identify
clues that were mentioned during their group discussion. This questionnaire was a checklist of clues from the case and included each of the six unshared clues, as well as six randomly selected shared clues that contained implicating information about one of the three main suspects. Thus, this checklist served as our measure of the amount of unshared and shared information that was discussed. Because we wanted this checklist to serve as a measure of information sharing, it was stressed (both orally and in the questionnaire's written instructions) that participants should check off only the clues that they actually discussed, not ones that they had merely read. Participants were also asked to rate how confident they were that their groups had selected the correct suspect, using a nine-point Likert scale, with endpoints of 1 ("not at all confident") and 9 ("extremely confident"). After completing their questionnaires, participants were collectively debriefed through a 30-min discussion of the decision task. During the debriefing session, none of the participants seemed to be aware of the research hypotheses.

Results and discussion

The unit of analysis was the group for all analyses except for the analyses performed on pre-discussion individual choices, which were conducted at the individual level. Two groups (one from each condition) were excluded from the sample because they did not follow the instructions.

Counterfactual activation

Two independent coders identified the number of counterfactual thoughts that groups listed when they were asked to imagine what might be going through Sue's mind. The coders were blind to both the research hypotheses and the experimental conditions associated with the responses that they coded. Coders were told to count thoughts as counterfactual if they explicitly referred to how the alternative outcome (winning) might have occurred or if they explicitly referred to a counterfactual emotion (e.g., regret) that Sue might have experienced. The coding reliability was high (intraclass correlation = .88), so the counts from the two coders were averaged. As expected, groups in the counterfactual prime condition (M = 1.47) listed significantly more counterfactual thoughts than did groups in the non-counterfactual prime condition (M = .62), t(26) = 3.14, p < .01 (see Table 1). But maybe counterfactual primes simply led groups to have more thoughts overall, whether they were counterfactual or not. To rule out this alternative explanation, we computed the mean total number of thoughts in each condition. There was a marginally significant difference between conditions, t(26) = 1.66, p = .11, but it was in the opposite direction from what the alternative explanation would predict. Groups with a non-counterfactual prime (M = 5.5) actually wrote more thoughts than did groups with a counterfactual prime (M = 4.5).
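A minimal sketch of this analysis step is shown below, using invented per-group counts (the paper reports only condition means and an intraclass correlation of .88): the two coders' counts are checked for agreement, averaged, and compared across conditions.

```python
# Illustrative sketch with invented per-group counts: check the two coders'
# agreement, average their counts (the group is the unit of analysis), and
# compare conditions with an independent-samples t test.
import numpy as np
from scipy import stats

coder1 = np.array([2, 1, 2, 1, 2, 1, 0, 1, 0, 1])     # hypothetical counts per group
coder2 = np.array([2, 1, 1, 1, 2, 1, 0, 1, 1, 1])
condition = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])  # 1 = counterfactual prime

# Rough agreement check; the paper reports an intraclass correlation of .88,
# and a Pearson correlation is used here only as a simple stand-in.
agreement, _ = stats.pearsonr(coder1, coder2)

counts = (coder1 + coder2) / 2
t, p = stats.ttest_ind(counts[condition == 1], counts[condition == 0])
print(f"coder agreement r = {agreement:.2f}, t = {t:.2f}, p = {p:.3f}")
```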
Table 1
Experiment 1: means and standard deviations for decision accuracy, number of counterfactual thoughts, number of unshared clues discussed, and number of shared clues discussed across conditions

                                               Experimental condition
                                               Counterfactual prime    Non-counterfactual prime
Percent correct decisions (group accuracy)     67%a                    23%b
Number of counterfactual thoughts              1.47a (.79)             0.62b (.64)
Number of unshared clues discussed             4.1a (1.2)              3.2b (1.1)
Number of shared clues discussed               3.1a (1.1)              3.5a (1.3)
Confidence                                     5.4a (.95)              5.8a (.94)

Note. Standard deviations are in parentheses. Within each row, means with different superscripts differ from each other at p < .05.
Pre-discussion individual decisions

The number of people in each condition who selected each of the four suspects before the group discussions was submitted to a χ² analysis. There were no differences between conditions, χ²(df = 3, n = 81) < 1, ns. There were also no differences between conditions in the number of participants who selected the correct suspect, χ²(df = 1, n = 81) < 1, ns. Forty-one percent of the participants correctly identified Suspect E as the murderer before discussing the case in groups.

Group decisions

Groups that read the counterfactual story were significantly more likely to select the correct suspect (M = 66%) than were groups that read the non-counterfactual story (M = 23%), χ²(df = 1, n = 28) = 5.32, p = .05. The typical solution rate for groups is about 30% (Stasser & Stewart, 1992; Stewart et al., 1998) and a comparison of this baseline percentage to the counterfactual condition using a χ² test was significant, χ²(df = 1, n = 15) = 9.60, p = .01, suggesting that activating a counterfactual mind-set was indeed helpful.
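To show the shape of this test, the sketch below reconstructs approximate group counts from the reported accuracy rates (roughly 10 of 15 counterfactual-prime groups and 3 of 13 non-counterfactual-prime groups correct) and runs the condition-by-accuracy chi-square; treat the counts as approximate.

```python
# Condition-by-accuracy test on counts reconstructed from the reported
# percentages (about 10 of 15 prime groups and 3 of 13 non-prime groups
# correct); the counts are approximate, not taken from the raw data.
import numpy as np
from scipy.stats import chi2_contingency

#                   correct  incorrect
observed = np.array([[10,      5],    # counterfactual prime groups
                     [ 3,     10]])   # non-counterfactual prime groups

chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```

The continuity correction is switched off so that the statistic corresponds to the uncorrected form; with the correction applied, the value would be somewhat smaller.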
Discussion of unshared and shared clues

We classified a clue as discussed only when all three group members claimed that the clue was explicitly referred to during their discussion. We chose this conservative method for measuring whether a group discussed a clue because we wanted to be sure about which clues were indeed mentioned. The number of shared and unshared clues discussed by groups were submitted to a 2 (Condition: counterfactual prime vs. no counterfactual prime) × 2 (Type of Clue: shared vs. unshared) mixed ANOVA with repeated measures on the second factor. The only significant effect was the Condition × Type of Clue interaction, F(1, 26) = 7.06, p = .01. As we expected, groups in the counterfactual prime condition (M = 4.1) discussed significantly more unshared clues than did groups in the non-counterfactual prime condition (M = 3.2), t(26) = 2.28, p < .05. For shared clues, the reverse was true—groups in the non-counterfactual prime condition discussed more shared clues (M = 3.5) than did groups in the counterfactual prime condition (M = 3.1), although this difference was not significant, t(26) = 1.03, p = .31. The significant interaction effect, and the absence of a significant main effect for condition, show that counterfactual primes did have a helpful effect on information sharing. The counterfactual prime did not increase the discussion of all clues, just those clues that were initially unshared. The level of confidence that individual participants reported after their group discussions did not differ across conditions, t(26) < 1.

We conducted additional analyses to determine the relationships among counterfactual thinking, the discussion of unshared clues, and the accuracy of group decisions (coded as 1 for choosing the correct suspect and 0 for choosing the incorrect suspect). The number of counterfactual thoughts that groups listed was correlated significantly with both the number of unshared clues that they discussed, r(28) = .43, p < .05, and with their decision accuracy, r(28) = .58, p < .01. The number of unshared clues that groups discussed was also correlated with decision accuracy, r(28) = .37, p = .05. Using logistic regression, we found that each additional unshared clue a group discussed increased a group's chances of solving the murder mystery by 2.23 times. The number of shared clues that groups discussed did not correlate with any of the other measures.

This experiment showed that the beneficial effect of counterfactual mind-sets on individual decision making extends to group decision making, helping to solve the problem of information sharing. A counterfactual prime increased the tendency of groups to discuss unshared information and ultimately improved the accuracy of their decisions. Because there was a hidden profile, group members needed to discuss all of their information to reach the correct solution. The counterfactual prime allowed the hidden profile to be exposed, moving groups from the incorrect to the correct suspect.
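The "2.23 times" figure is the kind of odds multiplier obtained by exponentiating a logistic regression coefficient. The sketch below shows that computation on invented group-level data (statsmodels is assumed to be available); it is not the study's dataset or output.

```python
# Sketch of a group-level logistic regression: decision accuracy (1 = correct)
# regressed on the number of unshared clues discussed.  The data are invented;
# exp(coefficient) is the odds multiplier per additional unshared clue.
import numpy as np
import statsmodels.api as sm

unshared_clues = np.array([2, 3, 3, 4, 4, 5, 2, 3, 4, 5, 3, 4, 5, 2])
accuracy = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 0])

X = sm.add_constant(unshared_clues)            # intercept plus predictor
fit = sm.Logit(accuracy, X).fit(disp=False)    # maximum-likelihood fit
odds_ratio = np.exp(fit.params[1])             # odds multiplier per extra clue
print(f"odds ratio per additional unshared clue: {odds_ratio:.2f}")
```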
Experiment 2

We have suggested that thinking counterfactually in one context activates a counterfactual mind-set that produces mental simulation and increases awareness of alternatives. We believe that the content of the counterfactuals does not matter—if counterfactual thinking
activates a counterfactual mind-set, then the content of the catalytic counterfactual thoughts should not affect the influence of counterfactual primes on later group decision making. Researchers often classify counterfactuals according to the direction of comparison.3 Upward counterfactual thoughts occur when someone compares the real world to a better possible world. Downward counterfactual thoughts occur when someone compares the real world to a worse possible world. Our first experiment examined the effects of upward counterfactual thoughts on group decision making; participants in the counterfactual prime condition often lamented that Sue would have won the prize if only she had not moved. The effects of downward counterfactual thoughts on group decision making were examined in our second experiment. If such thoughts also improve group discussions and decisions, then that would support our claim that the processes associated with counterfactual mind-sets are independent of the counterfactual thoughts.

3 This distinction between upward and downward counterfactuals was adapted from the social comparison literature (Taylor, Buunk, & Aspinwall, 1990), where people are said to compare themselves to others who are either better or worse off.

Another reason to explore the effects of downward counterfactual thoughts on group decision making is that such thoughts may serve a different function than upward counterfactual thoughts. Roese (1994) suggested that upward counterfactual thoughts serve a preparative function, but downward counterfactual thoughts serve an affective function. In an intriguing experiment, Roese asked participants to perform an anagram task, then led them to think upward or downward counterfactual thoughts, and finally asked them to perform the anagram task again. Roese found that upward counterfactual thoughts facilitated performance on the second anagram task. Downward counterfactual thoughts increased positive affect, but they did not affect task performance. Thus, downward counterfactual thoughts can be used to make people feel better, but upward counterfactual thoughts can be used to improve their performance. This suggests that downward counterfactual thoughts might not improve group processes and decision accuracy.

However, there are several possible reasons why downward counterfactual thinking did not affect task performance in Roese's (1994) research. In that research, participants generated counterfactual thoughts about their initial performance on the anagram task and then performed the exact same task again. Because the tasks were the same and the counterfactual thoughts were directly relevant to the second task, the content of those thoughts may have mattered. Put another way, participants' counterfactual thoughts did not serve as primes in Roese's research. When counterfactual thoughts are
unrelated to task performance, and thus serve as primes (Galinsky & Moskowitz, 2000), we believe that their content will not matter, and only the processes associated with counterfactual thinking (mental simulation and consideration of alternatives) will affect group decision making. In the case of counterfactual primes, downward counterfactual thoughts (like upward ones) should thus have a positive effect on group decision making.

Another issue that we explored in this experiment involved the emotional or motivational states associated with counterfactual thoughts. Upward counterfactuals tend to reduce people's satisfaction and produce feelings of regret (Kahneman & Miller, 1986; Markman, Gavanski, Sherman, & McMullen, 1993; Medvec, Madey, & Gilovich, 1995; Medvec & Savitsky, 1997). In contrast, downward counterfactuals tend to produce emotions ranging from joy and a sense of relief to guilt and surprise (because a negative outcome was avoided—see Medvec & Savitsky, 1997; Roese, 1994). Upward counterfactual thoughts are often generated when negative events occur, so they are associated with negative emotions and moods. Because negative emotions and moods are associated with systematic thinking (Bless, Bohner, Schwarz, & Strack, 1990; Bless, Mackie, & Schwarz, 1992), this may explain some of the benefits of counterfactual thinking that were observed in our first experiment—systematic thinking is likely to bring more unshared information into group discussions. If so, then downward counterfactual thoughts, which typically produce positive emotions, may be less beneficial.

A final goal of this experiment is to see whether counterfactual mind-sets affect the meta-cognitive processes of group members. Our first experiment did not show any effects of counterfactual thinking on the confidence of group members in their decisions, suggesting that people were unaware of the benefits they were actually experiencing from such thinking. In the second experiment, we asked participants about the distribution of information among the members of their groups. Increasing a group's awareness of a task's general structure is important if the group will perform similar tasks in the future, especially when counterfactual mind-sets might not be invoked. We expected counterfactual thoughts to promote greater awareness of "who knew what" within the groups, thereby improving group discussions and decisions in yet another way (see Moreland, 1999).

Method

Participants

Participants were 63 MBA students enrolled in a course on groups and teams. The classes were comprised of approximately 70% men. The group decision task was part of a classroom exercise conducted during the third
week of the (5-week) course. Each group contained three persons who were working together for the first time. Students were randomly assigned to groups, and groups were randomly assigned to conditions. There were 10 groups in the counterfactual prime condition and 11 groups in the non-counterfactual prime condition.

Decision task

We used the same decision task described earlier.

Counterfactual prime

We altered the stories used in Experiment 1. First, in both stories Sue won, rather than lost, the prize. Second, the new counterfactual story, which was based on a scenario from Galinsky and Moskowitz (2000), was altered to produce downward rather than upward counterfactual thoughts. In this story, after Sue switched seats, the emcee announced that the person sitting in Sue's new seat was the winner. In the non-counterfactual story, Sue did not switch seats at all, and her current seat made her the winner. Group members again read their "team building" story together and then were asked as a group to "list some thoughts running through Sue's mind."

Procedure

The procedure paralleled that of Experiment 1 in every respect. Participants were given 20 min in the classroom to read their booklets and take notes and then were told to select the individual they thought was the murderer and provide a brief justification for their choice. Then they met their other group members in their assigned discussion room where they were given 5 min to complete the team-building exercise (counterfactual prime manipulation). After completing the exercise, groups had 20 min to discuss the murder case and make a group decision about which suspect they believed committed the murder. All groups used the full 20 min, so discussion time was again constant across groups. Following their discussion, participants returned to the classroom, where they completed a post-discussion questionnaire that asked them to identify and check every clue that was mentioned during their group discussions. Each of the unique clues (6) was included, as well as all 11 of the shared implicating clues. Each of the shared clues contained implicating information about one of the three main suspects. In this experiment, we used all of the implicating shared clues in order to get a more complete measure of group discussion. Participants were again asked to rate how confident they were that their groups had selected the correct suspect, using a nine-point Likert scale, with endpoints of 1 ("not at all confident") and 9 ("extremely confident"). Each participant was also asked to rate his or her agreement with the claim that the information in every group member's
packet was the same, using another nine-point Likert scale, with the endpoints of 1 ("not at all") and 9 ("strongly agree"). After completing their questionnaires, participants were collectively debriefed through a 30 min discussion of the decision task. During the debriefing session, none of the participants seemed to be aware of the research hypotheses.

Results and discussion

The unit of analysis was the group for all analyses except for the analyses performed on pre-discussion individual choices, which were conducted at the individual level.

Counterfactual activation

As in Experiment 1, two independent coders identified the number of counterfactual thoughts that groups listed when they were asked what might be going through Sue's mind.4 The coders were again blind to both the research hypotheses and the experimental conditions associated with the responses that they coded. As before, the coding reliability was high (intraclass correlation = .87), so the counts from the two coders were averaged. As expected, groups in the counterfactual prime condition (M = 1.0) listed more counterfactual thoughts than did groups in the non-counterfactual prime condition (M = 0)5 (see Table 2). Because the mean and standard deviation in the non-counterfactual prime condition were zero, the homogeneity of variance assumption of ANOVA was violated. We therefore used a non-parametric test to see whether the number of counterfactual thoughts differed significantly by condition, and indeed it did (Mann–Whitney U = 0.0, Z = 4.23, p < .001). Once again, we also tested whether counterfactual primes simply increased the number of counterfactual thoughts by producing more thoughts overall.

4 We included expressions of guilt as examples of counterfactual thoughts in this experiment. By switching seats and winning the prize, Sue deprived the person who was originally in that seat of the prize. Some groups imagined that Sue felt guilty about this situation.

5 The total number of counterfactual thoughts was lower after positive events (e.g., winning the trip) than after negative events (e.g., not winning the trip, in Experiment 1). This result is consistent with work by Roese and Hur (1997) and Galinsky and Moskowitz (2000), who found main effects for both outcome valence and whether a scenario was mutable or not (see also Roese & Olson, 1996). Indeed, when we compared the construction of counterfactual thoughts in Experiments 1 and 2 by doing a 2 (Counterfactual Prime: yes vs. no) × 2 (Sue's Outcome: winning vs. losing) ANOVA, we found a significant main effect for counterfactual prime, F(1, 45) = 37.59, p < .001, and a significant main effect for outcome, F(1, 45) = 8.15, p < .01, but no interaction between the variables, F < 1. Both outcome valence and mutability independently predicted the amount of counterfactual activation. That is, negative outcomes led people to spontaneously search for, construct, and mention counterfactual thoughts because they are more motivated to try and undo those outcomes. In contrast, positive non-mutable events rarely produce any counterfactual thoughts.
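When one condition has no variance, as in the non-counterfactual prime condition above, a rank-based test is the usual fallback. The sketch below shows the form of that comparison with invented per-group counts, not the study's data.

```python
# Rank-based comparison used when one condition has zero variance: a
# Mann-Whitney U test on counterfactual-thought counts per group.  The
# counts below are invented for illustration.
from scipy.stats import mannwhitneyu

prime_groups = [1.0, 1.5, 1.0, 0.5, 1.0, 1.5, 1.0, 0.5, 1.0, 1.0]
non_prime_groups = [0.0] * 11   # no counterfactual thoughts listed

u_stat, p_value = mannwhitneyu(prime_groups, non_prime_groups,
                               alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```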
Table 2
Experiment 2: means and standard deviations for decision accuracy, number of counterfactual thoughts, number of unshared clues discussed, and number of shared clues discussed across conditions

                                               Experimental condition
                                               Counterfactual prime    Non-counterfactual prime
Percent correct decisions (group accuracy)     70%a                    27%b
Number of counterfactual thoughts              1.0a (.28)              0.0b (0.0)
Number of unshared clues discussed             3.8a (.92)              2.6b (1.4)
Number of shared clues discussed               4.6a (1.4)              5.4a (1.8)
Confidence                                     5.9a (1.1)              5.6a (1.1)
Awareness of unshared information              4.2a (1.4)              3.1b (1.5)

Note. Standard deviations are in parentheses. Within each row, means with different superscripts differ from each other at p < .05, except for awareness of unshared information, which differs at p = .09.
There was a marginally significant effect of condition on the total number of thoughts, t(19) = 1.87, p = .08, but this was again in the opposite direction from what the alternative explanation would predict. Groups with a non-counterfactual prime (M = 6.9) listed more thoughts than did groups with a counterfactual prime (M = 5.2).

Pre-discussion individual decisions

The number of people in each condition who selected each of the four suspects before the group discussions was submitted to a χ² analysis. There were no differences between conditions, χ²(df = 3, n = 63) = 1.43, ns. There were also no differences between conditions in the number of participants who selected the correct suspect, χ²(df = 1, n = 63) < 1, ns. As in Experiment 1, 41% of the participants correctly identified Suspect E as the murderer before discussing the case in groups.

Group decisions

Groups that read the counterfactual story were significantly more likely to select the correct suspect (M = 70%) than were groups that read the non-counterfactual story (M = 27%), χ²(df = 1, n = 21) = 3.83, p = .05.

Discussion of unshared and shared clues

As in Experiment 1, we identified clues that all three group members claimed were explicitly mentioned during their discussion. The number of shared and unshared clues discussed by groups were again submitted to a 2 (Condition: counterfactual prime vs. no counterfactual prime) × 2 (Type of Clue: shared vs. unshared) mixed ANOVA with repeated measures on the second factor. The group was the unit of analysis.
The only significant effect was the Condition × Type of Clue interaction, F(1, 19) = 5.98, p < .05. As we expected, groups in the counterfactual prime condition (M = 3.8) discussed significantly more unshared clues than did groups in the non-counterfactual prime condition (M = 2.6), t(19) = 2.27, p = .04. For shared clues, the reverse was true—groups in the non-counterfactual prime condition discussed more shared clues (M = 5.4) than did groups in the counterfactual prime condition (M = 4.6), although this difference was not significant, t(19) = 1.07, p = .30. So, once again, counterfactual primes did not increase discussion of all clues, just the clues that were unshared.

Once again, the level of confidence that individual participants reported after their group discussions did not differ across conditions, t < 1. Counterfactual primes had a marginally significant effect on group members' awareness of the distribution of information. People in the counterfactual prime condition (M = 3.1) were less likely to agree with the claim that everyone had the same information than were people in the non-counterfactual prime condition (M = 4.2), t(19) = 1.82, p = .09. So, the counterfactual prime tended to increase awareness that other group members possessed unique information. Please note, however, that the means for both conditions were below the midpoint of the rating scale, suggesting that the majority of participants realized that there was unshared information.

We again conducted additional analyses to determine the relationships among counterfactual thinking, the discussion of unshared clues, and the accuracy of group decisions (coded as 1 for choosing the correct suspect and 0 for choosing the incorrect suspect). The number of counterfactual thoughts that groups listed was correlated significantly with both the number of unshared clues that they discussed, r(21) = .48, p < .05, and with their decision accuracy, r(21) = .47, p = .05. The number of unshared clues discussed was correlated (using a one-tailed test) with post-discussion decision accuracy, r(21) = .39, p < .05.6 Using logistic regression, we found that each additional unshared clue a group discussed increased the odds of solving the murder mystery by 2.06 times. The number of shared clues that groups discussed did not correlate with any of the other measures.

6 We also conducted formal tests of mediation to determine whether information sharing mediated the effect of counterfactual primes on decision accuracy in both Experiments 1 and 2. That is, we tested whether the relationship between counterfactual primes and decision accuracy was significantly reduced when the discussion of unique clues was entered into the equation. Using the corrected procedure specified in Kenny, Kashy, and Bolger (1998), we did not find significant reduction in either Experiment 1 (z = 1.17, p = .24) or Experiment 2 (z = 1.05, p = .29). These formal tests of mediation are low in power, so it is not surprising that they were not significant (MacKinnon, Lockwood, Hoffman, West, & Sheets, 1992). Despite the lack of evidence for mediation, however, it is clear that counterfactual primes had a powerful effect on both the discussion of unshared information and decision accuracy.
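The mediation tests described in footnote 6 are of the Sobel type: the product of the two indirect paths is divided by its estimated standard error and compared to a standard normal distribution. The sketch below uses invented path coefficients and standard errors, so it illustrates the form of the test only.

```python
# Sobel-type test of the indirect path prime -> unshared clues -> accuracy.
# The path coefficients and standard errors below are invented; in practice
# they come from the two regressions described by Kenny, Kashy, and Bolger.
import math

a, se_a = 0.90, 0.40   # prime -> number of unshared clues discussed
b, se_b = 0.75, 0.45   # unshared clues -> accuracy, controlling for the prime

sobel_z = (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)
print(f"Sobel z = {sobel_z:.2f}")  # compare to a standard normal for a p value
```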
Once again, counterfactual mind-sets had beneficial effects on group discussion and decisions. Groups that listed more counterfactual thoughts later discussed more unshared information and made more accurate decisions. In addition, counterfactual thinking appeared to provide some insight into the structure of the task, as evidenced by the increased awareness among group members that some of them possessed unshared information. Although Experiments 1 and 2 were run in different academic semesters, we nonetheless tested whether the direction of the counterfactual prime moderated the effect of counterfactual primes on information sharing and decision accuracy. We submitted counterfactual activation, discussion of shared and unshared clues, and decision accuracy to separate 2 (Counterfactual Prime: yes vs. no) × 2 (Sue's Outcome: winning vs. losing) ANOVAs and found all Fs < 1 for the interaction effect. Counterfactual primes improved group performance even though the counterfactual thoughts generated were downward rather than upward, supporting our claim that the benefits of counterfactual mind-sets are independent of the content of the counterfactual thoughts and the emotions associated with those thoughts.
General discussion

Across two experiments, counterfactual mind-sets had beneficial effects on group decision making. Counterfactual primes in one context led to the spontaneous discovery and use of available but unshared information in a subsequent context, and ultimately led to more accurate group decisions. The effects of counterfactual primes occurred regardless of the direction of the counterfactual thoughts (upward vs. downward). Those findings strongly suggest that it was the process of thinking counterfactually, and not the content of the counterfactual thoughts or any concomitant emotional reactions, that aided group decision making. The effects of counterfactual mind-sets on group decisions were primarily due to the increased accessibility of thoughts about alternate states of reality. The number of counterfactual thoughts groups generated was positively related to the amount of unshared information that they discussed, and ultimately to decision accuracy. In previous studies using counterfactual primes (Galinsky & Moskowitz, 2000; Galinsky, Moskowitz, & Skurnik, 2000), the role of the amount of counterfactual thinking on subsequent decision making was never investigated.

Weakening the information-sharing bias

This research contributes to the literature on information sharing in groups by showing that a subtle
change in cognitive orientation can weaken the widespread bias toward shared information. We found that counterfactual mind-sets improved group decision making by providing a cognitive mechanism that led to mental simulation and a greater consideration of alternatives. These findings are remarkable given previous research showing that structurally altering decision contexts does not always have a positive impact on the accuracy of group decisions. Many structural tactics either have no impact or actually worsen the information sharing bias in groups. Other tactics, however, that can help groups discuss more unshared information (and make better decisions) have been developed, such as providing explicit instructions about how to evaluate decision alternatives or assigning clear informational roles to group members. For example, Larson et al. (1994) showed that training group members to properly disseminate information decreased their focus on shared information, and Hollingshead (1996) showed that asking participants to rank alternatives rather than choose the best one led to more thorough processing of information and better decisions. Assigning a group member the role of devil's advocate, whose task is to challenge other members' ideas, has also been shown to improve group decision making (Cosier, 1978; Janis & Mann, 1977; Valacich & Schwenk, 1995). A "consider the opposite" strategy improves individual decisions (Lord, Lepper, & Preston, 1984; Mussweiler, Strack, & Pfeiffer, 2000) by consciously redirecting attention to alternative states of reality. But in our experiments, no explicit training or role assignment or instructions were required. Only the subtle activation of a counterfactual mind-set was needed.

The effects of counterfactual mind-sets

Why do counterfactual mind-sets affect subsequent, unrelated judgments? Why would attention to alternatives endure and extend to the sharing of information or the testing of hypotheses (see Galinsky & Moskowitz, 2000; Kray & Galinsky, 2003)? Generally, the activation of a particular cognitive orientation in a prior context can influence subsequent information processing strategies (Chen, Shechter, & Chaiken, 1996; Galinsky, Gruenfeld, & Magee, 2003; Gollwitzer, Heckhausen, & Steller, 1990). Research and theorizing by Gollwitzer et al. (1990) are particularly instructive on this point—deliberating about goals for one task leads to deliberative tendencies for other, unrelated tasks because deliberation is a functional, well-learned strategy for approaching the world. Roese (1994) points out that counterfactual thinking, like deliberative thinking, is a pervasive feature of mental life. Its ubiquity may stem from its functionality for performing goal-directed behavior. Once activated, a counterfactual mind-set
persists because it is a well-learned and useful strategy for comprehending the world. Although we used specific stories to activate counterfactual thoughts, the content and direction of those thoughts did not appear to be important (Galinsky & Moskowitz, 2000; Galinsky et al., 2000). Given that upward and downward counterfactuals differ in their content (better vs. worse possible worlds) and accompanying emotional reactions, one way of showing that counterfactual mind-sets affect how people think and not just what they think is to demonstrate that upward and downward counterfactuals have a similar impact on decision making. Across the two experiments, we found similar benefits of upward counterfactual thoughts (Experiment 1) and downward counterfactual thoughts (Experiment 2) for group decision making.

Limitations and future research

Our research provides clear and compelling evidence that counterfactual mind-sets can improve group decisions. There are a number of possible avenues, however, for future research. One limitation of our research was that it did not systematically compare the level (individual vs. group) at which counterfactual thoughts were activated or decisions were made. Is it necessary for groups to construct counterfactual thoughts, or is it sufficient for individual group members to think counterfactually before a group discussion to improve decision accuracy? The level at which counterfactual thoughts are activated is related to questions about the processes that underlie the effect of counterfactual primes. We have suggested that groups, like individuals (Galinsky & Moskowitz, 2000), respond to a counterfactual prime by engaging in mental simulation and considering multiple alternatives, thereby increasing the probability that unshared but relevant information will be discussed. But maybe counterfactual primes are beneficial to groups because they create or change group norms. Creating group norms in one context can have systematic effects on later information sharing by groups. Postmes et al. (2001), for example, found that creating a norm that encouraged critical thinking led to greater sharing of information than a norm that emphasized cohesion. These norms only affected group decisions, not decisions subsequently made by individual members. One way to clarify why counterfactual thinking is beneficial for groups would be to prime counterfactual thoughts in each group member, rather than having the group collectively construct counterfactual thoughts. Or, as in Postmes et al. (2001), one could have groups construct counterfactual thoughts and see if that subsequently affects individual decisions. Some recent evidence by Liljenquist, Galinsky, and Kray (in press) suggests that the level (group or individual) at which counterfactual
thoughts are activated is of critical importance. Whereas having group members construct counterfactual thoughts together leads to increased coordination among group members, activating counterfactual thoughts separately in each group member can have a debilitating effect on group performance by impairing the ability of group members to coordinate their behavior. Activating counterfactuals separately in each group member led participants to become trapped in their own personal mind-sets, isolated in their independent counterfactual conjectures.

Another avenue for future research is to determine whether the self-relevance of counterfactual thoughts moderates their impact on decision making. In our experiments, counterfactual thoughts were always about another person, a mere character in a story. Self-relevant counterfactuals may do just as well at producing mental simulations and the consideration of alternatives. However, they could also lead to ruminations that impair the ability to focus on the task at hand (Sherman & McConnell, 1995). Self-relevant thoughts that are upward versus downward may also produce divergent effects, given that rumination and recriminations are likely to follow from self-relevant upward counterfactual thinking but not self-relevant downward counterfactual thinking (Sherman & McConnell, 1995).

Yet another avenue for future research is to expand the types of tasks that people are asked to perform. Like the murder mystery task used in our experiments, virtually all of the tasks that have been used to examine the effects of counterfactual thinking by Galinsky and colleagues (Galinsky & Moskowitz, 2000; Galinsky et al., 2000; Kray & Galinsky, 2003) are convergent in nature—they had one right answer. Would the benefits of counterfactual mind-sets extend to divergent tasks (Anastasi, 1982), which require dispersed, as opposed to focused, attention, and an open-minded, as opposed to critical, attitude toward new ideas? The type of focus that derives from counterfactual thinking appears to impair performance on divergent tasks (Kray & Galinsky, 2004). Comparing the effect of counterfactual primes on convergent versus divergent tasks might also clarify the underlying processes that are associated with counterfactual mind-sets.

We must also note that in our research, the decision-making processes of groups were evaluated retrospectively. Evidence that groups discussed unshared versus shared clues was obtained by asking group members to recall what happened, rather than observing them while they discussed the murder case. One benefit of this methodology is that groups made their decisions in a naturalistic environment, without feeling as though they were under a researcher's microscope. This is potentially important, given that accountability can strengthen the bias toward shared information (Stewart et al., 1998). Nevertheless, relying on what group members recalled
about their discussions is problematic in several ways. For example, such recall could be influenced not only by whether various clues were discussed, but also by how consistent each clue was with the group's final decision. We are confident, however, in the validity of our measure. We explicitly instructed participants to only list clues that were actually discussed. This was stressed both orally and in writing. In addition, we used a conservative criterion for deciding whether a clue was discussed—all three group members had to list the clue. The fact that there was a significant positive correlation in both experiments between the number of unshared clues that groups discussed and their levels of counterfactual thinking lends further support to our claim of validity. If participants simply selected clues that were consistent with their decisions, but were not actually discussed, then we would not expect to see this correlation. Still, future research that involves (unobtrusive) recording and coding of the content of group discussions could provide a more thorough understanding of how and why group processes are affected by counterfactual mind-sets (see Liljenquist et al., 2003). And although the similar results that we found across Experiments 1 and 2 suggest that counterfactual primes exert their influence through a mental simulation mind-set and not their concomitant emotional reactions, future research should include more direct measures of mood. Exploring the direct effects that mood and counterfactual emotions have on group discussions seems worthwhile.

Finally, is priming counterfactuals a panacea for weakening decision-making biases and improving decision accuracy in groups? We believe that it is not—a counterfactual mind-set may not always produce beneficial results. Galinsky and Moskowitz (2000) suggested that counterfactual mind-sets can both help and harm thought and action, depending on the nature of a task (see also Galinsky, Liljenquist, Kray, & Roese, in press; Galinsky et al., 2002). Counterfactual primes can lead to the discovery of hidden solutions, as in the present experiments, but they can also increase the prevalence of certain reasoning errors. For example, Galinsky and Moskowitz (2000) found that counterfactual mind-sets can lead individuals to consider more alternatives than are appropriate. Group tasks that require decision makers to discriminate between useful and misleading information may thus be vulnerable to the harmful effects of counterfactual mind-sets. For example, counterfactual mind-sets may exacerbate the "dilution effect," which occurs when the impact of diagnostic information in judgment tasks is diluted due to the presence of nondiagnostic information (Nisbett, Zukier, & Lemley, 1981; Zukier, 1982). Although counterfactual thinking improved the decision accuracy of groups in our research, it is important to explore other contexts in which counterfactual thinking might lead to less accurate group decisions.
Overall, the findings presented in this paper paint an optimistic picture of the effects that counterfactual mind-sets can have on group decision making. Activating a cognitive mind-set that makes thoughts about alternate realities salient serves as a useful reminder for groups to share and consider all of the information that their members possess. Thinking about what might have been can enable group members to share all they know, and thereby improve the accuracy of their group decisions.
References
Anastasi, A. (1982). Psychological testing. New York: Macmillan.
Bless, H., Bohner, G., Schwarz, N., & Strack, F. (1990). Mood and persuasion: A cognitive response analysis. Personality and Social Psychology Bulletin, 16, 331–345.
Bless, H., Mackie, D. M., & Schwarz, N. (1992). Mood effects on encoding and judgmental processes in persuasion. Journal of Personality and Social Psychology, 63, 585–595.
Chen, S., Shechter, D., & Chaiken, S. (1996). Getting at the truth or getting along: Accuracy- versus impression-motivated heuristic and systematic processing. Journal of Personality and Social Psychology, 71, 262–275.
Cosier, R. A. (1978). The effects of three potential aids for making strategic decisions on prediction accuracy. Organizational Behavior and Human Decision Processes, 22, 295–306.
Galinsky, A. D., Gruenfeld, D. H., & Magee, J. C. (2003). From power to action. Journal of Personality and Social Psychology, 85, 453–466.
Galinsky, A. D., Liljenquist, K. A., Kray, L. J., & Roese, N. J. (in press). Meaning through mutability: The sensemaking role of counterfactual thinking. In D. R. Mandel, D. J. Hilton, & P. Catellani (Eds.), The psychology of counterfactual thinking. London: Routledge.
Galinsky, A. D., & Moskowitz, G. B. (2000). Counterfactuals as behavioral primes: Priming the simulation heuristic and consideration of alternatives. Journal of Experimental Social Psychology, 36, 384–409.
Galinsky, A. D., Moskowitz, G. B., & Skurnik, I. (2000). Counterfactuals as self-generated primes: The effect of prior counterfactual activation on person perception judgments. Social Cognition, 18, 252–280.
Galinsky, A. D., Seiden, V., Kim, P. H., & Medvec, V. H. (2002). The dissatisfaction of having your first offer accepted: The role of counterfactual thinking in negotiations. Personality and Social Psychology Bulletin, 28, 271–283.
Gilbert, D. T. (1989). Thinking lightly about others: Automatic components of the social inference process. In J. S. Uleman & J. A. Bargh (Eds.), Unintended thought (pp. 189–211). New York: Guilford Press.
Gollwitzer, P. M., Heckhausen, H., & Steller, B. (1990). Deliberative vs. implemental mind-sets: Cognitive tuning toward congruous thoughts and information. Journal of Personality and Social Psychology, 59, 1119–1127.
Hirt, E. R., & Markman, K. D. (1995). Multiple explanation: A consider-an-alternative strategy for debiasing judgments. Journal of Personality and Social Psychology, 69, 1069–1086.
Hollingshead, A. B. (1996). The rank-order effect in group decision making. Organizational Behavior and Human Decision Processes, 68, 181–193.
Janis, I. L. (1982). Groupthink (2nd ed., rev.). Boston: Houghton Mifflin.
Janis, I. L., & Mann, L. (1977). Emergency decision making: A theoretical analysis of responses to disaster warnings. Journal of Human Stress, 3, 35–48.
Kahneman, D., & Miller, D. T. (1986). Norm theory: Comparing reality to its alternatives. Psychological Review, 93, 136–153.
Kahneman, D., & Tversky, A. (1982). The simulation heuristic. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 201–208). New York: Cambridge University Press.
Kahneman, D., & Varey, C. A. (1990). Propensities and counterfactuals: The loser that almost won. Journal of Personality and Social Psychology, 59, 1101–1110.
Kenny, D. A., Kashy, D. A., & Bolger, N. (1998). Data analysis in social psychology. In D. Gilbert, S. Fiske, & G. Lindzey (Eds.), The handbook of social psychology (4th ed., Vol. 1, pp. 233–265). Boston: McGraw-Hill.
Kray, L. J., & Galinsky, A. D. (2003). The debiasing effect of counterfactual mind-sets: Increasing the search for disconfirmatory information in group decisions. Organizational Behavior and Human Decision Processes, 91, 69–81.
Kray, L. J., & Galinsky, A. D. (2004). Thinking within the box: The differential effect of counterfactual mind-sets on divergent and convergent thinking. Manuscript submitted for publication.
Larson, J. R., Foster-Fishman, P. G., & Keys, C. B. (1994). Discussion of shared and unshared information in decision-making groups. Journal of Personality and Social Psychology, 67, 446–461.
Liljenquist, K. A., Galinsky, A. D., & Kray, L. J. (in press). Exploring the rabbit hole of possibilities by myself or with my group: The benefits and liabilities of activating counterfactual mind-sets for information sharing and group coordination. Journal of Behavioral Decision Making.
Lord, C. G., Lepper, M. R., & Preston, E. (1984). Considering the opposite: A corrective strategy for social judgment. Journal of Personality and Social Psychology, 47, 1231–1243.
MacKinnon, D. P., Lockwood, C. M., Hoffman, J. M., West, S. G., & Sheets, V. (2002). A comparison of methods to test mediation and other intervening variable effects. Psychological Methods, 7, 83–104.
Markman, K. D., Gavanski, I., Sherman, S. J., & McMullen, M. N. (1993). The mental simulation of better and worse possible worlds. Journal of Experimental Social Psychology, 29, 87–109.
Martin, L. L. (1986). Set/reset: Use and disuse of concepts in impression formation. Journal of Personality and Social Psychology, 51, 493–504.
Medvec, V. H., Madey, S., & Gilovich, T. D. (1995). When less is more: Counterfactual thinking and satisfaction among Olympic medalists. Journal of Personality and Social Psychology, 69, 603–610.
Medvec, V. H., & Savitsky, K. (1997). When doing better means feeling worse: The effects of categorical cutoff points on counterfactual thinking and satisfaction. Journal of Personality and Social Psychology, 72, 1284–1296.
Miller, D. T., & McFarland, C. (1986). Counterfactual thinking and victim compensation: A test of norm theory. Personality and Social Psychology Bulletin, 12, 513–519.
Moreland, R. L. (1999). Transactive memory: Learning who knows what in work groups and organizations. In L. Thompson, D. Messick, & J. Levine (Eds.), Shared cognition in organizations: The management of knowledge (pp. 3–31). Mahwah, NJ: Erlbaum.
Mussweiler, T., Strack, F., & Pfeiffer, T. (2000). Overcoming the inevitable anchoring effect: Considering the opposite compensates for selective accessibility. Personality and Social Psychology Bulletin, 26, 1142–1150.
Nisbett, R. E., Zukier, H., & Lemley, R. (1981). The dilution effect: Nondiagnostic information weakens the implications of diagnostic information. Cognitive Psychology, 13, 248–277.
Postmes, T., Spears, R., & Cihangir, S. (2001). Quality of decision making and group norms. Journal of Personality and Social Psychology, 80, 918–930.
Presidential Commission on the Space Shuttle Challenger Accident (1986). Report to the President. Washington, DC: The Commission.
Pyszczynski, T., & Greenberg, J. (1987). Toward an integration of cognitive and motivational perspectives on social inference: A biased hypothesis-testing model. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 20, pp. 297–340). New York: Academic Press.
Roese, N. J. (1994). The functional basis of counterfactual thinking. Journal of Personality and Social Psychology, 66, 805–818.
Roese, N. J., & Hur, T. (1997). Affective determinants of counterfactual thinking. Social Cognition, 15, 274–290.
Roese, N. J., & Olson, J. M. (1996). Counterfactuals, causal attributions, and the hindsight bias: A conceptual integration. Journal of Experimental Social Psychology, 32, 197–227.
Schulz-Hardt, S., Frey, D., Luthgens, C., & Moscovici, S. (2000). Biased information search in group decision making. Journal of Personality and Social Psychology, 78, 655–669.
Sherman, S. J., & McConnell, A. R. (1995). Dysfunctional implications of counterfactual thinking: When alternatives to reality fail us. In N. J. Roese & J. M. Olson (Eds.), What might have been: The social psychology of counterfactual thinking (pp. 199–231). Hillsdale, NJ: Erlbaum.
Snyder, M., & Swann, W. B. (1978). Hypothesis-testing processes in social interaction. Journal of Personality and Social Psychology, 36, 1202–1212.
Stasser, G. (1999). The uncertain role of unshared information in collective choice. In L. Thompson, J. Levine, & D. Messick (Eds.), Shared knowledge in organizations (pp. 49–69). Hillsdale, NJ: Erlbaum.
Stasser, G., & Stewart, D. (1992). Discovery of hidden profiles by decision-making groups: Solving a problem versus making a judgment. Journal of Personality and Social Psychology, 63, 426–434.
Stasser, G., Stewart, D. D., & Wittenbaum, G. M. (1995). Expert roles and information exchange during discussion: The importance of knowing who knows what. Journal of Experimental Social Psychology, 31, 244–265.
Stasser, G., Taylor, L. A., & Hanna, C. (1989). Information sampling in structured and unstructured discussions of three- and six-person groups. Journal of Personality and Social Psychology, 57, 67–78.
Stasser, G., & Titus, W. (1985). Pooling of unshared information in group decision making: Biased information sampling during discussion. Journal of Personality and Social Psychology, 48, 1467–1478.
Stewart, D. D., Billings, R. S., & Stasser, G. (1998). Accountability and the discussion of unshared, critical information in decision-making groups. Group Dynamics, 2, 18–23.
Taylor, S. E., Buunk, B. P., & Aspinwall, L. G. (1990). Social comparison, stress, and coping. Personality and Social Psychology Bulletin, 16, 74–89.
Tetlock, P. E. (1992). The impact of accountability on judgment and choice: Toward a social contingency model. In M. Zanna (Ed.), Advances in experimental social psychology (Vol. 25, pp. 331–376). New York: Academic Press.
Trope, Y., & Liberman, A. (1996). Social hypothesis testing: Cognitive and motivational mechanisms. In E. T. Higgins & A. W. Kruglanski (Eds.), Social psychology: Handbook of basic principles (pp. 239–270). New York: Guilford Press.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5, 207–232.
Valacich, J. S., & Schwenk, C. (1995). Devil's advocate and dialectical inquiry effects on face-to-face and computer-mediated group decision making. Organizational Behavior and Human Decision Processes, 63, 158–173.
Winquist, J. R., & Larson, J. R. (1998). Information pooling: When it impacts group decision making. Journal of Personality and Social Psychology, 74, 371–377.
Zukier, H. (1982). The dilution effect: The role of the correlation and the dispersion of predictor variables in the use of nondiagnostic information. Journal of Personality and Social Psychology, 43, 1163–1174.