Brain Research Bulletin 67 (2005) 438–442
Reasoning or reacting to others? How consumers use the rationality of other consumers

Linda Pelzmann, Urska Hudnik ∗, Michaela Miklautz

University of Klagenfurt, Universitaetsstrasse 65–67, 9020 Klagenfurt, Austria

Available online 5 July 2005

doi:10.1016/j.brainresbull.2005.06.007

∗ Corresponding author. Tel.: +43 699 12 16 17 63; fax: +43 1 31 06 917. E-mail address: urska [email protected] (U. Hudnik).
Abstract

Consumers adapt their behavior to the structure of the information available in the environment where they form expectations. One factor in people’s environments is other people. “Others” are a significant source of information and a means of orientating oneself. In conditions such as uncertainty, insecurity and anxiety, but also euphoria, it is “the others” who provide market participants with coordinates. The purpose of this paper is to put forward an approach that relates to consumers’ other-directedness and to demonstrate in what ways consumers use the rationality of other consumers.

© 2005 Elsevier Inc. All rights reserved.

Keywords: Consumers; Neuroeconomics; Markets
1. Background: ecological and collective rationality

When the market panics, market participants are not panicking individually, separately or in isolation, but rather in response to the panicking of others. The same is true of buying: when the market is booming, most market participants are not buying separately, in isolation, but in response to the buying of others. The same applies to risk perception. In highly volatile markets, market participants do not perceive risks separately, but in response to others. In highly charged social settings, when sentiment dominates, market participants do what others are doing and buy what others are buying, without thinking or deciding for themselves; consumers’ behavior depends on how many others are expected to behave in a particular way.

Economic theories of decision-making and rational choice models have neglected emotions and have sometimes even cast them as the very opposite of rationality. However, emotions guide and speed decision-making: disgust can limit the choice set of potential foods and help to avoid poisoned food. Social rules such as “eat what others eat” or “prefer what is
preferred by others” guide behavior without much information gathering and bring benefits with a reduced likelihood of risks. These forms of social rationality can be found throughout the animal world. Communities of social insects are one example of collective rationality. Honeybees, for instance, make intelligent collective decisions about where to build a new hive, decisions that seem to emerge from individual bees’ application of a few simple, well-adapted rules. In the consumer world too, emotions, social norms, customs and crowd behavior can function as decision-making guidelines that keep information search to a minimum. Much of life in the economic world is governed by social rules, social constraints and custom, not by goal-driven optimization.

How then do real consumers make decisions? The key is to understand how human decision-making strategies and consumers’ choices are matched to particular task environments [4–6]. Two related concepts, ecological rationality and social rationality, help to expand on Simon’s [15] model of bounded rationality. Ecological rationality focuses on the way that humans and their repertoire of simple decision strategies are adapted to specific environments. Social rationality captures the fact that our fellow humans form a special part of our environment. The
ecological, bounded, and social rationality perspectives are different starting points that converge in the hypothesis of cumulative rationality outlined in this paper. The purpose of our project is to understand human rationality as it is adapted to specific environments via the different rules that guide adaptive behavior.

It was Nobel Laureate Friedrich von Hayek who defined the degree of civilization as the extent to which we benefit from knowledge we do not ourselves possess: “. . . It is indeed not so much the greater knowledge that the individual can acquire, as the greater benefit he receives from the knowledge possessed by others, which is the cause of his ability to pursue an infinitely wider range of ends . . .” Hayek [9] summarized the conclusions in the chapter on Reason and Evolution in Rules and Order, Volume 1 of Law, Legislation and Liberty: “Man is as much a rule-following animal as a purpose-seeking one. His actions are not simply directed towards ends; they also conform to social standards and conventions, and unlike a calculating machine he acts because of his knowledge of rules and objectives. And he is successful not because he knows why he ought to observe the rules, which he does observe . . ., but because his thinking and acting are governed by rules, which have by a process of selection been evolved in the society, in which he lives . . . Man’s actions are largely successful . . . because they are adapted both to the particular facts, which he knows, and to a great many other facts he does not and cannot know. And this adaptation to the general circumstances that surround him is brought about by his observance of rules, which he has not designed and often does not even know explicitly, although he is able to honor them in action. . . . These rules of conduct have thus not developed as the recognized conditions for the achievement of a known purpose, but have evolved because the groups who practiced them were more successful and displaced others.”

Hayek led us to new principles of rationality. What counts as a reason may, originally, have a basis in evolutionary selection. The evolutionary account of reasons reverses the direction of Immanuel Kant’s Copernican Revolution: rationality is now considered an evolutionary adaptation with a purpose and function. Rationality evolved as an adaptation against a background of stable facts that it was selected to work in tandem with. One such fact is the presence of others with a similarly evolving rationality. If rationality evolved alongside the concurrent rationality of others, then each person’s rationality may have a character that fits it to work in tandem with that similar rationality of others [13], regardless of whether or not people could rationally demonstrate these facts.
2. In what ways does our rationality use the rationality of other market participants?

We are predisposed to learn language with others and also to learn values and orientation from others. In emulating
others, we seem to presume that they are rational. A study by Pelzmann [14] on customers’ choices and risk perception in a deregulated telecom market demonstrates how customers use the rationality of other customers.

2.1. Samples

In 1998, the Austrian telecom market was opened up to competition and 11 other service providers gradually entered the market. The University of Graz was the first to change provider. A rumor was going around that the move to UTA (an Austrian telecom service provider) could reduce the university’s telephone costs by a third and that this had been confirmed by the first telephone bill received after the changeover. The University of Klagenfurt, however, had not changed to another provider. Pelzmann monitored systematically whether the change would act as a catalyst prompting university employees to change the service provider for their home phone line. All the economic conditions were the same in both locations. Significant differences in the behavior of the two samples of 100 employees each should therefore be attributed to the impact of the catalyst and to the impact of customers who used the rationality of other customers.

2.2. Results

The data show that consumers who entered new territory and had no experience of the benefits and risks involved orientated themselves in relation to other people. In what ways did customers use the rationality of others? What sort of risk-related information did they obtain from others? Firstly, they found out whether other people had had problems, e.g. interruptions; secondly, they found out whether the promised benefit actually occurred, that is, whether the change was worth making. Consumers observed what was happening to others: the consequences. The experience of other people was used to identify the absence of failures and risks. After observing that others did not suffer any negative consequences, people’s doubts and constraints were annulled and the way was cleared for a psychological chain reaction. This impact accelerated shifts from one-third to two-thirds in the Graz sample, whereas in the Klagenfurt sample, without a catalyst, other-directedness did not gain momentum.
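To make the mechanism concrete, the following sketch simulates other-directed switching as a simple threshold cascade: each consumer changes provider once the share of switchers they observe exceeds a personal risk threshold, and a visible institutional first mover is read as additional evidence. The sketch is not part of the original study; the population size, the threshold distribution and the +0.10 catalyst bonus are all invented for illustration.

import random

def simulate_switching(n=100, catalyst=False, rounds=20, seed=7):
    # Hypothetical threshold cascade: each consumer switches provider once
    # the share of switchers they observe exceeds a personal risk threshold.
    rng = random.Random(seed)
    thresholds = [rng.uniform(0.05, 0.95) for _ in range(n)]  # invented distribution
    switched = [False] * n
    for _ in range(rounds):
        # A visible institutional first mover (the catalyst) is read as
        # extra evidence; the +0.10 bonus is an arbitrary illustrative value.
        observed = sum(switched) / n + (0.10 if catalyst else 0.0)
        for i in range(n):
            if not switched[i] and observed >= thresholds[i]:
                switched[i] = True
    return sum(switched) / n

print("share switched with catalyst:   ", simulate_switching(catalyst=True))
print("share switched without catalyst:", simulate_switching(catalyst=False))

With a catalyst the cascade feeds on itself; without one it never starts. The point is qualitative only, and no attempt is made to reproduce the observed one-third to two-thirds shift.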
3. How are perceptions related to actual fundamental risk?

Unlike fundamental risk as traditionally defined, perceived risk does not come from news concerning poor performance. Perception has little to do with the reality of fundamental risk. Perceived risk is triggered by other market participants, generating a feedback loop. Consumers perceive risk when others are suffering severe losses, bankruptcy, and the like. Observing the consequences of other people’s choices and actions provides information that allows
people to avoid exposing themselves to risk. Their choices relate directly to other people’s choices.
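A minimal way to express this feedback, purely as an illustrative sketch (the weighting and the numbers are assumptions, not estimates from any study), is to let perceived risk be dominated by the share of others who are visibly suffering losses rather than by the fundamental risk itself:

def perceived_risk(observed_loss_share, fundamental_risk=0.05, weight_on_others=0.8):
    # Hypothetical blend: most of the perception comes from what happens
    # to others; the 0.8 weight and the 0.05 fundamental risk are invented.
    return (1 - weight_on_others) * fundamental_risk \
        + weight_on_others * observed_loss_share

# The same fundamental risk is perceived very differently depending on
# what is observed in the crowd:
print(perceived_risk(observed_loss_share=0.0))  # nobody around is losing -> 0.01
print(perceived_risk(observed_loss_share=0.5))  # half the crowd reports losses -> 0.41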
4. Research designs: System 1, System 2 and System 3

Dual-process theories of cognition [3,10,11] distinguish between two families of cognitive operations, two thinking systems labeled System 1 and System 2, which have very different characteristics. Kahneman calls them Intuition and Reasoning. The operations of the intuitive System 1 are fast, automatic, effortless, associative, and intuitive. They are governed by habit, so they are difficult either to modify or to control. System 1 is not very educable; it is a fast and frugal short cut that produces rapid reactions. System 1 often has an affective component, but it need not. System 2 is the reasoning system. It is calculative and deductive, reflective and self-aware, serial, effortful, and deliberately controlled. System 2 is slow and laborious; it operates as a monitor, confirming or overriding System 1 judgments and reactions. The difference in effort provides the most useful indicator of whether a given mental process should be assigned to System 1 or System 2.

System 3 uses the rationality of others. Related to it is perception, which links to other-directedness and evolved differently from either intuition or cognition. Pelzmann labeled the impact of others’ choices on what people are choosing System 3.

According to Damasio [3], rational decision-making uses two complementary paths. Confronted with a situation that requires a response, path A (Reasoning, System 2) prompts images related to the situation, the options for action, and the anticipation of future outcomes. Reasoning strategies can operate on that knowledge to produce a decision. Path B (System
1) operates in parallel and prompts activation of prior experiences in comparable situations. In turn, the recall of the emotionally related material, be it covert or overt, influences the decision-making process by forcing attention on the representation of future outcomes or by interfering with reasoning strategies. On occasion, path B (Reacting, System 1) can lead directly to an immediate response (Fig. 1). System 3 operates in parallel, using the rationality of others.

Studies by Pelzmann [14] reveal that the mind is particularly vulnerable to other-directed reactions and contagious responses in the course of excitement, euphoria, overoptimism, frenzies, stampedes, and panics. It is in these settings and situations that crowd response substitutes for individual reasoning. These are situations and circumstances in which the mind is unwilling to control itself (Fig. 2).
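As a purely schematic illustration of this division of labor, the three paths can be written as a simple control flow. The function names, the ordering and the scoring are invented and make no claim about neural implementation or about the authors’ model beyond what is described above.

from typing import Optional, Sequence

def anticipated_outcome(option: str) -> float:
    # Placeholder scoring: a real model would prompt images of future
    # outcomes (Damasio's path A); here options are ranked arbitrarily.
    return float(len(option))

def decide(options: Sequence[str],
           habitual: Optional[str] = None,
           crowd_choice: Optional[str] = None,
           emotionally_charged: bool = False) -> str:
    # System 3: in highly charged settings, what others are doing
    # substitutes for individual reasoning (other-directed short cut).
    if emotionally_charged and crowd_choice is not None:
        return crowd_choice
    # System 1: fast, automatic, habit-governed reaction from prior experience.
    if habitual is not None:
        return habitual
    # System 2: slow, effortful evaluation of anticipated outcomes.
    return max(options, key=anticipated_outcome)

# A panicking market participant simply does what the crowd is doing:
print(decide(["hold", "sell"], habitual="hold",
             crowd_choice="sell", emotionally_charged=True))

In calmer settings the same sketch falls back to habit (System 1) or to deliberate comparison of outcomes (System 2).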
5. Evidence from studies in the field of neuroeconomics?

Scientific evidence is shifting from verbal and behavioral data to neurophysiological data. Several fMRI studies have analyzed the areas activated during cognitive processing and decision-making, in particular the dorsolateral prefrontal cortex, and the decreased blood flow in this area once a task has been learned and no longer requires reasoning. Berns et al. [1] studied the brain regions responsive to novelty in the absence of awareness. Participants performed a simple reaction-time task in which all stimuli were equally likely but, unknown to them, followed a complex sequence. Measures of behavioral performance indicated that participants learned the sequences even though they were unaware of the existence of any order. Once the participants were trained, a subtle and unperceived change in the nature of the sequence resulted in increased blood flow in a network comprising the left premotor area, left anterior cingulate and right ventral striatum.
Fig. 1. Inside a Decision-Making Mechanism (Source: ref. [3], p. 149).
Fig. 2. Reacting to others, a fast and frugal short-cut.
Blood flow decreases were observed in the right dorsolateral prefrontal cortex (DLPFC) and parietal areas. This suggests that the participants had learned the sequences (although without awareness) and therefore engaged the cognitive processes less or not at all; this was reflected in the blood flow decrease in the DLPFC, an area usually responsible for decision-making and cognitive processing. The right prefrontal area is associated with the maintenance of contextual information, while the ventral striatum is responsive to novel information, and both processes can occur without awareness. The study thus showed that blood flow increases significantly in the dorsolateral prefrontal cortex during decision-making and/or cognitive processing, and decreases strongly when the cognitive process is no longer as active (as when the participants had learned the sequences and were no longer forced to think long and intensively). Furthermore, the important role of emotions was demonstrated by increased activation of the anterior insula.

Historically, psychologists have disagreed about whether moral judgments are primarily products of emotion or of reasoning and higher cognition. Recently, findings from several areas of cognitive neuroscience have begun to converge on an answer: emotions and reasoning both matter. In an experiment designed to see how the human brain responds to moral dilemmas, Greene et al. [8] and Greene [7] studied typical moral-personal dilemmas (such as a version of the footbridge dilemma and a case of stealing one person’s organs in order to distribute them to five others) and typical moral-impersonal dilemmas (such as a version of the trolley dilemma and a case of keeping money found in a lost wallet). The authors found that different brain areas were activated in personal versus impersonal dilemmas, “that there are systematic variations in the engagement of emotions in moral judgment”, and that brain areas associated with emotion are far more active in contemplating the so-called footbridge
problem than in contemplating the trolley problem. Greene [7] found different patterns of mental processes depending on whether a subject personalized the moral problem or calculated the same moral problem. An implication of Greene’s finding is that human brains are hard-wired to distinguish between bringing about a death “up close and personal” and doing so at a distance. Responding to personal moral dilemmas produced increased activity in areas associated with social/emotional processing: the medial frontal gyrus, the posterior cingulate gyrus and the superior temporal sulcus (STS). The medial frontal gyrus serves the integration of emotions into decision-making and planning and other specifically social functions relevant to moral judgment. The posterior cingulate/retrosplenial cortex is one of the most commonly activated areas in neuroimaging studies of emotion. The posterior STS/inferior parietal region serves the perception and representation of socially significant information, which is crucial for making inferences about the beliefs and intentions of others. By contrast, responding to impersonal dilemmas and calculation produced increased activity in areas associated with working memory and classical cognitive processes: the dorsolateral prefrontal cortex and the parietal lobes. These activations may represent the application of domain-neutral reasoning to moral judgment. Since the dorsolateral prefrontal cortex was strongly activated, it is assumed that Reasoning plays an important role in the production of impersonal moral judgments, and in personal moral judgments in which reasoned considerations and emotions are in conflict.

Berns et al. [2] investigated cooperation based on reciprocal altruism and its neurobiological basis, using the Prisoner’s Dilemma game to model this form of cooperation. Experiment 1 was designed to isolate the neural correlates of cooperation and non-cooperation in social (personal) and nonsocial
(impersonal) contexts, and of monetary reinforcement of behavior. The results of the first experiment revealed different patterns of neural activation depending on whether the playing partner was identified as a human (personal context) or a computer (impersonal context). This motivated a second experiment in which 17 subjects were scanned during each of three game sessions, focusing specifically on human versus computer interaction. When subjects in experiment 2 were instructed that they were playing the game with a computer rather than another person, mutual cooperation was less common throughout the game, even though subjects were actually playing against exactly the same computer strategy. Mutual cooperation with the computer partner activated regions of the ventromedial/orbital frontal cortex (OFC) that were also activated with human partners. The difference lay in the following areas: mutual cooperation with a computer did not activate the rostral anterior cingulate or the anteroventral striatum, whereas both areas were activated with human partners. This activation may be associated with positive feelings toward one’s partner; activation of the anteroventral striatum and OFC can produce such feelings, which reinforce the cooperative act, superseding any conscious recognition that material gains flow from mutual cooperation.
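For readers unfamiliar with the paradigm, the sketch below implements a generic iterated Prisoner’s Dilemma with the conventional payoff ordering and a tit-for-tat computer partner. The payoff values and the strategy are standard textbook choices and are not claimed to be those used by Berns et al. [2].

# Conventional Prisoner's Dilemma payoffs (temptation > reward > punishment
# > sucker); the amounts used in the cited experiment may differ.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(history):
    # Cooperate first, then copy the subject's previous move; a common
    # computer strategy, not necessarily the one used in the cited study.
    return "C" if not history else history[-1][0]

def play(subject_strategy, rounds=10):
    history, subject_total, partner_total = [], 0, 0
    for _ in range(rounds):
        s_move = subject_strategy(history)
        p_move = tit_for_tat(history)
        s_pay, p_pay = PAYOFF[(s_move, p_move)]
        subject_total += s_pay
        partner_total += p_pay
        history.append((s_move, p_move))
    return subject_total, partner_total

# A subject who always cooperates earns steady mutual-cooperation payoffs ...
print(play(lambda h: "C"))
# ... while a defector gains once and is then punished on every later round.
print(play(lambda h: "D"))

Mutual cooperation yields the higher joint payoff over repeated rounds, which is the behavioral regularity the imaging experiments build on.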
6. Open problems

We are working on an experimental design to isolate the neural correlates of using others’ rationality, by imaging a stampede versus reasoning through an individual alternative. We are testing whether System 3 is activated when people are reacting to others and using the rationality of others, especially in circumstances such as uncertainty or panic. Here, the perception of others plays a significant role: instead of individual reasoning and decision-making, the crowd response takes over the process. Drawing on the results of Kenning et al. [12] and their discovery of cortical relief, we assume that similar phenomena will occur in the brain when people are reacting to others. With the help of the emerging science of neuroeconomics we intend to test this hypothesis.

But is there a difference between reactions to other people (personal stimuli) and reactions to non-personal stimuli? Would different brain areas become activated when we deal not with people but with securities? Which brain areas become more active when we resist tempting promises, when we cast doubt on following the crowd? Are frontal areas less active in excitement, euphoria and frenzies, when consumers are likely to do what others are doing and to buy what
others are buying, without thinking or deciding for themselves?
7. Summary

The traditional efficient markets hypothesis and rational choice models of decision-making emphasize reasoning (System 2). A more recent trend emphasizes the role of emotion and the heuristic nature of decision-making (System 1). Pelzmann provides behavioral data that support a model of ecological and other-directed rationality (System 3), demonstrating in what ways consumers use the rationality of others. In view of the increasing ability to explore the brain, this article calls for an other-directed approach to the study of consumer behavior that incorporates findings from the emerging area of neuroeconomics.

References

[1] G. Berns, J. Cohen, M. Mintun, Brain regions responsive to novelty in the absence of awareness, Science 276 (1997) 1272–1275.
[2] G. Berns, J. Rilling, A. Gutman, T. Zeh, G. Pagnoni, C. Kilts, A neural basis for social cooperation, Neuron 35 (2002) 395–405.
[3] A. Damasio, Looking for Spinoza: Joy, Sorrow, and the Feeling Brain, Harcourt, New York, 2003.
[4] G. Gigerenzer, Decision making: nonrational theories, in: N.J. Smelser, P.B. Baltes (Eds.), International Encyclopedia of the Social & Behavioral Sciences, vol. 5, Elsevier, Oxford, 2001, pp. 3304–3309.
[5] G. Gigerenzer, R. Selten (Eds.), Bounded Rationality: The Adaptive Toolbox, MIT Press, Cambridge, MA, 2001.
[6] G. Gigerenzer, P.M. Todd, Simple Heuristics that Make Us Smart, Oxford University Press, New York, 1999.
[7] J. Greene, Cognitive conflict and control in moral judgment, in: SABE Conference, 2004.
[8] J. Greene, J. Haidt, How (and where) does moral judgment work? Trends Cogn. Sci. 6 (2002) 517–523.
[9] F.A. Hayek, Rules and Order, Law, Legislation and Liberty, vol. I, The University of Chicago Press, Chicago, 1973, pp. 11–18.
[10] D. Kahneman, The thought leader interview, in: M. Schrage (Ed.), Strategy + Business Issues, Booz Allen Hamilton Inc., 2003.
[11] D. Kahneman, S. Frederick, Representativeness revisited: attribute substitution in intuitive judgment, in: T. Gilovich, D. Griffin, D. Kahneman (Eds.), Heuristics and Biases: The Psychology of Intuitive Judgment, Cambridge University Press, Cambridge, 2002.
[12] P. Kenning, H. Plaßmann, M. Deppe, H. Kugel, W. Schwindt, The Discovery of Cortical Relief, Neuroeconomic Research Reports, Westfälische Wilhelms-Universität Münster, 2002.
[13] R. Nozick, The Nature of Rationality, Princeton University Press, Princeton, NJ, 1993, p. 178.
[14] L. Pelzmann, The triumph of mass manufactured will: circumstances and rules, Malik on Management m.o.m. 10 (2002) 188–205.
[15] H.A. Simon, Rational choice and the structure of environments, Psychol. Rev. 63 (1956) 129–138.