Neurobiological perspectives on the nature of visual and verbal processes

Available online at www.sciencedirect.com

Journal of Consumer Psychology 18 (2008) 264 – 269

Research Dialogue

Terry L. Childers a,⁎, Yang Jiang b

a Gatton College of Business and Economics, University of Kentucky, Lexington, KY 40506, USA
b Department of Behavioral Science, College of Medicine, University of Kentucky, Lexington, KY 40506, USA

Available online 29 October 2008

Abstract

Wyer and colleagues in this issue summarize an array of studies demonstrating varied and significant effects of a series of antecedents on the relative accessibility of visual and verbal strategies in memory, and on the ease with which these strategies can be employed to affect comprehension and judgments. We review the studies in light of findings and perspectives from psychophysiological and neuroimaging research. We discuss how measurement of event-related potentials (ERP) can illuminate automatic versus reflective forms of processing, and how functional brain imaging (e.g., functional MRI) reveals the brain regions involved in the visual processing of pictures and words under different tasks and processes. We close by considering these bio-behavioral perspectives and the interactions among modality, task, and affect on visual and verbal encoding strategies among individuals differing in visual and verbal processing orientations.

© 2008 Society for Consumer Psychology. Published by Elsevier Inc. All rights reserved.

In a collage of studies, Wyer, Hung, and Jiang (2008) present a model of visual and verbal processing strategies extended to comprehension and judgment, along with a substantial body of research reflective of this model. The model posits four antecedents to invoking the two strategies, resulting in differences in memory accessibility and ease of processing. The Wyer et al. studies examine a number of important issues and offer a wide-ranging accounting of them. The scope of the studies and the nature and implications of their model extend far beyond what can be addressed in this commentary. Our focus is on the antecedent related to stimulus modality (pictures versus words) and on differences in visual and verbal processing in contexts not situationally constructed (e.g., narratives), such as those relating to the effects of stimulus characteristics on imagery-guided processing. We consider this issue within the context of neural mechanisms for visual processing, using cognitive neuroscientific approaches such as event-related potentials (ERP) and functional magnetic resonance imaging (fMRI). Before considering this research, we first discuss a semantic model of picture–word processing.

⁎ Corresponding author. E-mail addresses: [email protected] (T.L. Childers), [email protected] (Y. Jiang).

Processing model of pictures versus words

Human brain imaging studies have confirmed that categories of visual objects and words are processed in distinct and overlapping regions of the brain. The processing of words has been associated with the left frontal cortex (e.g., Grady, McIntosh, Rajah, & Craik, 1998), and naming words (versus pictures) activated more anterior regions in the ventral temporal lobes and Broca's area (Martin, Wiggs, Ungerleider, & Haxby, 1996). Numerous fMRI studies have reported that processing of visual faces and objects occurs in occipital and ventral temporal areas, including the lateral occipital (LO) area and fusiform gyrus (e.g., Haxby et al., 2001). Viswanathan and Childers (2003) present a model that addresses object categorization and the privileged access pictures (versus words) have in categorization. Their model focuses on categorization, but we believe its scope extends into domains of more general semantic processing, as in comprehension and judgment. Viswanathan and Childers (2003) conducted three behavioral experiments supporting the proposition that pictures gain faster category access through simultaneous access to both their concept and their visually salient features. In contrast, words may initially access their concept and only subsequently activate the concept's features (see also Glaser & Glaser, 1989). In essence, a picture of a concept is also a picture of its features. By comparison, a word first activates its concept and then may

1057-7408/$ - see front matter © 2008 Society for Consumer Psychology. Published by Elsevier Inc. All rights reserved. doi:10.1016/j.jcps.2008.09.010


access features if necessary to perform a task (e.g., Are these two objects from different categories?). A key premise is that the picture advantage is not based upon the features themselves, but rather arises from the semantic processing of those features. Pictures are in essence preprocessed for semantic/categorization judgments, but the nature of their representation is amodal (Viswanathan & Childers, 2003). One counterintuitive insight from the model is the empirically verified finding that words can benefit indirectly from visual similarity. For words, greater semantic relatedness for visually similar versus dissimilar categories can produce an advantage in categorization. By accessing their semantic concept first and then, if necessary, the instances' features, words are not as likely to suffer from visual feature interference effects. Although pictures have an advantage in semantic access, it is important to note that this advantage is not universal: the nature of the task, as well as the salience of differentially accessible information, also affects task performance. The process described in the Viswanathan and Childers (2003) model parallels results reported by Wyer et al. for the way stimulus characteristics can differentially elicit visual versus verbal processing. Their studies demonstrate that pictures can affect verbal processing and vice versa. The effects might be due to more conscious, reflective processing of this information (e.g., narratives and incongruencies) or, as we explain, to more subtle automatic processes. In some instances processes may be accessible to self-report-based assessments, whereas in others their automatic nature may be more aptly captured through recent advances in neuroimaging methods. We next consider how these methods can inform the nature of picture–word effects on visual and verbal processing strategies.
Psychophysiology assessments through ERP/EEG

Electroencephalography (EEG) is the measurement of electrical activity produced by the brain, recorded from electrodes placed on the scalp. Magnetoencephalography (MEG) is an imaging technique that measures the magnetic fields produced by electrical activity in the brain via extremely sensitive devices. Evidence from neurobiological measures such as EEG or MEG provides a temporally sensitive indicator of changes in brain activity. Event-related potentials (ERPs) are averaged EEG voltage changes elicited by brain activity in response to a specific perceptual or cognitive event (Luck, 2005). Compared to other, indirect measures of neural activity (e.g., fMRI, which does not resolve cortical changes shorter than 1–2 s), EEG/ERP and MEG provide a direct measure of brain activity with temporal resolution in the millisecond (ms) range. This methodology is useful for assessing the degree of automatic versus reflective processing rather than treating this as a strict dichotomy (Lieberman, 2000). In studying picture–word effects, researchers in bio-behavioral areas use a number of different paradigms or tasks. Understanding how task affects the processing of stimulus characteristics is fundamental to understanding picture–word effects and their associated processes. Bargh (1984) provides a


set of criteria for differentiating automatic from controlled processes, including the manipulation of the consistency (incongruency) of information. Two tasks used frequently in bio-behavioral research are incongruency based: affective priming and the oddball paradigm. Both of these tasks relate to the incongruency effects examined by Wyer et al. and also in past picture–word studies (cf. Houston, Childers, & Heckler, 1987; Heckler & Childers, 1992). A principal finding across numerous studies is that prime and target valence interact in affecting the time to judge the target. Similarly, the oddball paradigm is a form of affective priming in which valence is manipulated by embedding a stimulus with one valence (e.g., positive) within a series of stimuli of the other valence (e.g., negative) to create an inconsistency effect. A substantial body of ERP research converges in demonstrating that consistency versus inconsistency across stimuli (primes and targets) is a powerful framework for differentiating automatic from reflective information processes (e.g., Lewis, Critchley, Rotshtein, & Dolan, 2006). As we review next, research using ERP often distinguishes between “early” processes that occur in the first 100–200 ms and “later” processes occurring somewhere between 300 and 1000 ms after presentation (Cacioppo, Tassinary, & Berntson, 2007).

ERP and early processing effects

The intervals noted above allow insight into differentiating more automatic from more reflective visually and verbally initiated processes. Temporally sensitive ERP methods also may enable differentiation of more perceptual processes (e.g., perceptual and visual fluency; Winkielman, Schwarz, Fazendeiro, & Reber, 2003) from more semantic and evaluative ones (cf. conceptual fluency; Schwarz, 2004). For instance, using ERP and a visually based height discrimination task, targets at cued versus non-cued locations elicited significantly enhanced P1 amplitudes 90 to 140 ms after stimulus presentation (Hopfinger & Mangun, 1998).
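The averaging logic behind ERPs (many time-locked EEG epochs averaged so that background activity uncorrelated with the event cancels out) can be sketched in a few lines. The waveform shapes, trial counts, and noise levels below are illustrative assumptions we introduce here, not data from any study cited in this commentary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated experiment: 100 trials, 700 ms epochs sampled at 1 kHz.
n_trials, n_samples = 100, 700
t = np.arange(n_samples)  # time in ms after stimulus onset

# Hypothetical underlying ERP: a P1-like positivity near 100 ms and a
# later LPP-like positivity near 500 ms (illustrative shapes, not data).
true_erp = (2.0 * np.exp(-((t - 100) / 20.0) ** 2)
            + 1.5 * np.exp(-((t - 500) / 80.0) ** 2))

# Single-trial EEG = ERP + large background activity (in microvolts);
# on any one trial the ERP is buried in the noise.
noise_sd = 10.0
epochs = true_erp + rng.normal(0.0, noise_sd, size=(n_trials, n_samples))

# Averaging time-locked epochs recovers the ERP: background activity
# cancels roughly as 1/sqrt(n_trials).
erp_estimate = epochs.mean(axis=0)
residual_sd = (erp_estimate - true_erp).std()

print(f"single-trial noise SD:       {noise_sd:.1f} uV")
print(f"residual SD after averaging: {residual_sd:.2f} uV")
print(f"theoretical expectation:     {noise_sd / np.sqrt(n_trials):.2f} uV")
```

The same cancellation argument is why ERP components such as the P1 or LPP become measurable only after many trials, and why the temporal precision of the averaged waveform (millisecond range) exceeds what slower hemodynamic measures can resolve.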
An early posterior negative waveform was reported to be modulated by both positive and negative affective picture stimuli (Schupp, Junghofer, Weike, & Hamm, 2003). Schupp et al. (2007) reported an early negative potential (200–350 ms) for affective stimuli and no interaction between attention and emotion during early processing. These studies demonstrate that visual stimuli (pictures) evoke an early, more perceptual, perhaps preattentive response that is captured as a physiological reaction using ERP.

ERP and late processing effects

Early studies using a verbally based oddball paradigm found that evaluative inconsistency between attitudes and target valence elicits a late effect termed the “late positive potential” (LPP; Cacioppo, Crites, Gardner, & Berntson, 1994), and subsequent studies converge on this finding. Effects for pictures have also been reported in non-priming studies (e.g., Cuthbert, Schupp, Bradley, Birbaumer, & Lang, 2000). Werheid et al. (2005) reported an enhancement of the LPP following unprimed targets



at parietal electrodes between 500 and 600 ms. This LPP did not occur in a previous affective priming study using verbal stimuli (i.e., Schirmer, Kotz, & Friederici, 2002). Zeelenberg, Wagenmakers, and Rotteveel (2006), in contrast, support the emotional facilitation of word encoding. Cross-modal affective priming in a lexical decision task (Schirmer et al., 2002) and affective priming in an evaluation task of emotional words (Zhang, Guo, Lawson, & Jiang, 2006) evoked larger ERP (N400) responses for emotional inconsistency. The N400 is a well-known ERP-based index of contextual conflict and is often related to semantic processing (e.g., Chwilla, Brown, & Hagoort, 1995). Mirroring the ERP findings is the behavioral result that inconsistent ads yield lower recall, attributable to a reduced ability to engage in elaborative processes (Houston, Childers, & Heckler, 1987). Lateralized LPP effects were generated under evaluative (good–bad) versus non-evaluative (abstract–concrete) judgments of socially relevant word concepts in a study by Cunningham, Espinet, DeYoung, and Zelazo (2005). The studies discussed demonstrate that both verbal and visual stimuli elicit a more attentionally driven “later” process (often occurring within 600–800 ms) that can interact with emotion.

Functional MRI-based assessments of visual and verbal processes

In contrast to the temporal resolution advantage of ERP, fMRI provides cortical mapping of responses in the whole brain with spatial resolution as fine as 2–4 mm. Activations are based upon signals generated through the BOLD (blood oxygen level dependent) index, with sufficient sensitivity to detect task-induced changes in local brain function. For instance, fMRI studies reveal a lateralization difference across verbal (words) and non-verbal (faces) materials in frontal activation during memory encoding. Words generate more activation in the left frontal cortex, whereas faces generate more in the right frontal cortex.
Also, objects (associated with both an image and a name) activate both right and left frontal cortex (cf. Buckner, Kelley, & Petersen, 1999, for a review). As we discuss next, several studies have examined the role of both pictures and words in affective processing using fMRI-based methodologies. Lang et al. (1998) reported that both pleasant and unpleasant pictures elicited more activity than neutral pictures in the occipital and temporal regions of the brain. Bradley et al. (2003) reported stronger fMRI-evoked responses for highly arousing emotional pictures, with greater activation in posterior ventral regions of the visual cortices. Pictures also generated greater activation of bilateral visual and medial cortices for line drawings of familiar objects (Grady et al., 1998). Overall, pictures were remembered better than words, but under intentional learning and semantic processing, words and pictures were equally well remembered (consistent with Childers & Houston, 1984, for immediate recall under semantic processing of pictures versus words). Drawing from a review of 35 studies using visual stimuli, Phan, Wager, Taylor, and Liberzon (2002) reported that 60% of the studies found activation in the occipital regions of the brain, with robust findings for valence and arousal. In addition, 50% of

the visually induced affective studies activated the amygdala (see also Zald, 2003). With respect to words, emotional oddballs (aversive words) generated greater activation in the left amygdala and the left inferior prefrontal cortex (Strange, Henson, Friston, & Dolan, 2000). Similarly, Tabert et al. (2001) found that unpleasant versus neutral words elevated BOLD signals in the amygdala. Hamann and Mao (2002) extended these findings to the neural correlates of emotional significance for positive words, with greater activation in the left amygdala. Taken together, fMRI studies demonstrate that the affective nature of pictures and words elicits elevated brain responses varying by region, along with meaningful temporal differences using ERP. Synthesizing across the studies, specific regions are unlikely to be solely responsible for a specific function, nor does a particular task activate a single region. With this caveat in mind, visual processing should lead to greater activation of the visual and extrastriate regions and areas in the occipital and temporal cortices. In contrast, verbal processing involves more of the language processing regions (in part referred to as Broca's and Wernicke's areas) in the vicinity of the inferior prefrontal gyrus, prefrontal cortex, and anterior portions of the middle temporal cortex (see Kirchoff & Buckner, 2006). Picture effects were evident very early in processing (e.g., P100), with additional attention allocated at subsequent stages depending upon their affective significance (LPP). Words demonstrate this relatively later attentional impact, with effects also modulated by their affective significance. In combination, these bio-behavioral techniques can untangle the delicate distinctions we make in verbal and visual processing. Additionally, task effects for valence across judgments can implicate an automatic process, as evidenced by behavioral differences in responses and elevated BOLD signals in the amygdala.
In contrast, differences between tasks can evidence a reflective or controlled form of evaluative processing (cf. Cunningham et al., 2005).

Bio-behavioral implications — modality, task, and affect

Revealing visual and verbal encoding strategies

Kirchoff and Buckner's (2006) fMRI-based study using pictures has important implications for the present context. In their study, participants were instructed to learn interacting picture pairs (cf. Childers & Houston, 1984) for a subsequent memory task. Two encoding strategies were predominant: a verbal elaboration strategy and a visual inspection strategy. Both strategies uniquely correlated with memory performance and BOLD signals, with the verbal elaboration strategy correlated with activation in prefrontal regions, particularly the inferior prefrontal gyrus. In contrast, the visual strategy was correlated with visual extrastriate regions, particularly the anterior fusiform cortex.

Implications for studying individual differences

A number of studies have examined individual variations in brain activity based upon sex differences (cf. Killgore & Yurgelun-Todd, 2001). In their review of 65 studies relating


to affect, Wager et al. (2003) conclude that women more frequently exhibit differences in brainstem activation (e.g., midline limbic structures), whereas males show more lateralization of emotional activity (e.g., left inferior frontal and posterior cortex). Individual differences in visual processing also have been found in neuropsychological studies of “affective style” based upon emotional reactivity (cf. Davidson, 1998), of extraversion and neuroticism with respect to negative and positive pictures (Canli et al., 2001), and among fear-phobic versus non-phobic individuals in their processing of arousing positive and negative pictures (Sabatinelli, Bradley, Fitzsimmons, & Lang, 2005). Individuals scoring high on sensation-seeking scales also generated stronger responses to new visual stimuli in the visual cortices (Roberti, 2004), but weaker responses in the inferior frontal cortex (Jiang et al., 2008). The speed of visual word recognition also varies widely across individuals. An MR technique called diffusion tensor imaging (DTI) provides evidence that reaction time in a lexical decision task is associated with the degree to which portions of frontal white matter in the brain are oriented (Gold, Powell, Liang, Jiang, & Hardy, 2007). Extending this discussion to the studies by Wyer et al. raises the question of whether situationally induced manipulations (e.g., “hidden figures” or “hidden words”) converge with studies of individual differences in style of visual versus verbal processing. Style of Processing (SOP) was conceptualized as “a preference and propensity to engage in a verbal and/or visual modality of processing” (Childers, Houston, & Heckler, 1985, p. 130) and its attentional orientation (Heckler, Childers, &


Houston, 1993). Our view of SOP differences reflects the chronic accessibility of the preferred modality, but also emphasizes the more automatic processing of this preferred modality. This concept of chronic or long-term accessibility is the activation readiness potential of stored information as reflected by long-term processing influences on activation (Higgins, 1996; Bargh, Lombardi, & Higgins, 1988). The distinction we make here is relative to the invocation of a strategy that is more consciously monitored and regulated. In our view, SOP is based upon an activation readiness that individuals bring to the stimulus, most often unconsciously and automatically. Thus, the work of Wyer et al. provides an interesting launching point for research on (1) whether conscious strategies versus unconscious preferences reflect similar processes and (2) whether situationally induced versus individually based visual and verbal processes constitute the same underlying processes even though their outcomes may be similar. Our view can be summarized in a brain and behavioral model of individuals who possess a more visual versus verbal orientation (Fig. 1). As a more automatic process, we predict speedier responses via the visual cortex to the amygdala, the brain's emotional processing structure, for visual processors, independent of whether participants make evaluative or non-evaluative judgments. For verbal processors, we would predict that the more resource-demanding judgments of pictures would lead to slower responses in the amygdala, via the frontal word processing brain region, only for evaluative judgments. We illustrate this distinction for a behaviorally based indicator (reaction time) as well.

Fig. 1. Style of processing: brain behavioral model. A simplified model illustrating individual differences in SOP and emotional processing (A – Amygdala, subcortical in red). Frontal word processing area in blue for “verbal” individuals (Left) versus preferred occipital-temporal cortex in green for “visual” individuals (Right). All regions are more involved during evaluative processing. Reaction times illustrated in relative scale (lower–faster) across task and modality (word and picture) for visual and verbal processors.
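As a purely illustrative sketch, the qualitative ordering predicted by Fig. 1 for picture judgments can be written down as a small lookup table. All numeric values below are arbitrary placeholders we introduce for illustration, not measured reaction times from any study:

```python
# Hypothetical relative reaction times (arbitrary units) encoding the
# ordering predicted by the SOP brain-behavioral model in Fig. 1.
# Keys: (processing orientation, stimulus modality, judgment task).
predicted_rt = {
    ("visual", "picture", "evaluative"): 1.0,      # fast automatic route via visual cortex
    ("visual", "picture", "non-evaluative"): 1.0,  # equally fast: route is task-independent
    ("verbal", "picture", "evaluative"): 2.0,      # slowest: resource-demanding frontal route
    ("verbal", "picture", "non-evaluative"): 1.3,  # no extra evaluative cost predicted
}

# The model's signature prediction is an orientation x task interaction
# for pictures: only verbal processors pay an evaluative-judgment cost.
visual_cost = (predicted_rt[("visual", "picture", "evaluative")]
               - predicted_rt[("visual", "picture", "non-evaluative")])
verbal_cost = (predicted_rt[("verbal", "picture", "evaluative")]
               - predicted_rt[("verbal", "picture", "non-evaluative")])
print("evaluative-judgment cost (visual, verbal):", visual_cost, verbal_cost)
```

The point of the table is the interaction pattern, not the specific numbers: a behavioral test of the model would compare the evaluative-minus-non-evaluative reaction-time difference across visual and verbal processors.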



Thus, pursuit of individual, situational, task, and stimulus-based effects on visual and verbal processes has benefited substantially from the research conducted by Wyer and colleagues, and in our view the use of bio-behavioral methodologies offers the same promise to further extend and deepen our understanding of the nature of visual and verbal processes and processing in the brain.

References

Bargh, J. A. (1984). Automatic and conscious processing of social information. In R. S. Wyer, Jr., & T. Srull (Eds.), Handbook of social cognition, Vol. 3 (pp. 1−44). Hillsdale, NJ.

Bargh, J. A., Lombardi, W. J., & Higgins, E. T. (1988). Automaticity of chronically accessible constructs in person × situation effects on person perception: It's just a matter of time. Journal of Personality and Social Psychology, 55, 599−605.

Bradley, M. M., Sabatinelli, D., Lang, P. J., Fitzsimmons, J. R., King, W. M., & Desai, P. (2003). Activation of the visual cortex in motivated attention. Behavioral Neuroscience, 117, 369−380.

Buckner, R., Kelley, W. M., & Petersen, S. E. (1999). Frontal cortex contributes to human memory formation. Nature Neuroscience, 2, 311−314.

Cacioppo, J. T., Crites, S. L., Jr., Gardner, W. L., & Berntson, G. G. (1994). Bioelectrical echoes from evaluative categorizations: A late positive brain potential that varies as a function of trait negativity and extremity. Journal of Personality and Social Psychology, 67, 115−125.

Cacioppo, J. T., Tassinary, L. G., & Berntson, G. G. (2007). Handbook of psychophysiology (3rd ed.). New York: Cambridge University Press.

Canli, T., Zhao, Z., Desmond, J. E., Kang, E., Gross, J., & Gabrieli, J. D. E. (2001). An fMRI study of personality influences on brain reactivity to emotional stimuli. Behavioral Neuroscience, 115, 33−42.

Childers, T. L., & Houston, M. J. (1984). Conditions for a picture superiority effect on consumer memory. Journal of Consumer Research, 11, 643−654.

Childers, T. L., Houston, M. J., & Heckler, S. E. (1985). Measurement of individual differences in visual and verbal information processing. Journal of Consumer Research, 12, 125−134.

Chwilla, D. J., Brown, C. M., & Hagoort, P. (1995). The N400 as a function of the level of processing. Psychophysiology, 32, 274−285.

Cunningham, W. A., Espinet, S. D., DeYoung, C. G., & Zelazo, P. D. (2005). Attitudes to the right- and left: Frontal ERP asymmetries associated with stimulus valence and processing goals. NeuroImage, 28, 827−834.

Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52, 95−111.

Davidson, R. J. (1998). Affective style and affective disorders: Perspectives from affective neuroscience. Cognition and Emotion, 12, 307−320.

Glaser, W. R., & Glaser, M. O. (1989). Context effects on Stroop-like word and picture processing. Journal of Experimental Psychology: General, 118, 13−42.

Gold, B., Powell, D. K., Liang, X., Jiang, Y., & Hardy, P. A. (2007). Speed of lexical decision correlates with diffusion anisotropy in left parietal and frontal white matter: Evidence from diffusion tensor imaging. Neuropsychologia, 45, 2439−2446.

Grady, C. L., McIntosh, A. R., Rajah, M. N., & Craik, F. I. M. (1998). Neural correlates of the episodic encoding of pictures and words. Proceedings of the National Academy of Sciences, 95, 2703−2708.

Hamann, S., & Mao, H. (2002). Positive and negative emotional verbal stimuli elicit activity in the left amygdala. NeuroReport, 13, 15−19.

Haxby, J. V., Gobbini, M. I., Furey, M. L., Ishai, A., Schouten, J. L., & Pietrini, P. (2001). Distributed and overlapping representations of faces and objects in ventral temporal cortex. Science, 293, 2425−2430.

Heckler, S. E., & Childers, T. L. (1992). The role of expectancy and relevancy in memory for verbal and visual information: What is incongruency? Journal of Consumer Research, 18, 475−492.

Heckler, S. E., Childers, T. L., & Houston, M. J. (1993). On the construct validity of the SOP scale. Journal of Mental Imagery, 17, 119−132.

Higgins, E. T. (1996). Knowledge activation: Accessibility, applicability, and salience. In E. T. Higgins, & A. W. Kruglanski (Eds.), Social psychology: Handbook of basic principles (pp. 133−168). New York: Guilford.

Hopfinger, J. B., & Mangun, G. R. (1998). Reflexive attention modulates processing of visual stimuli in human extrastriate cortex. Psychological Science, 9, 441−447.

Houston, M. J., Childers, T. L., & Heckler, S. E. (1987). The effects of picture–word consistency on the elaborative processing of print advertisements. Journal of Marketing Research, 24, 359−369.

Jiang, Y., Lianekhammy, J., Lawson, A., Martin, S., Guo, C., Lynam, D., Joseph, J., Gold, B. T., & Kelly, T. H. (2008). Brain responses to repeated visual experience among low and high sensation seekers: Role of boredom susceptibility. Psychiatry Research: Neuroimaging, in press.

Killgore, W. D., & Yurgelun-Todd, D. A. (2001). Sex differences in amygdala activation during the perception of facial affect. NeuroReport, 12, 2543−2547.

Kirchoff, B. A., & Buckner, R. L. (2006). Functional-anatomical correlates of individual differences in memory. Neuron, 51, 263−274.

Lang, P. J., Bradley, M. M., Fitzsimmons, J. R., Cuthbert, B. N., Scott, J. D., Moulder, B., & Nangia, V. (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199−210.

Lewis, P. A., Critchley, H. D., Rotshtein, P., & Dolan, R. J. (2006). Neural correlates of processing valence and arousal in affective words. Cerebral Cortex, 17, 742−748.

Lieberman, M. D. (2000). Intuition: A social cognitive neuroscience approach. Psychological Bulletin, 126, 109−137.

Luck, S. J. (2005). An introduction to the event-related potential technique. Cambridge, MA: MIT Press.

Martin, A., Wiggs, C. L., Ungerleider, L. G., & Haxby, J. V. (1996). Neural correlates of category-specific knowledge. Nature, 379, 649−651.

Phan, K. L., Wager, T., Taylor, S. T., & Liberzon, I. (2002). Functional neuroanatomy of emotion: A meta-analysis of emotion activation studies in PET and fMRI. NeuroImage, 16, 331−348.

Roberti, J. W. (2004). A review of behavioral and biological correlates of sensation seeking. Journal of Research in Personality, 38, 256−279.

Sabatinelli, D., Bradley, M. M., Fitzsimmons, J. R., & Lang, P. J. (2005). Parallel amygdala and inferotemporal activation reflect emotional intensity and fear relevance. NeuroImage, 24, 1265−1270.

Schirmer, A., Kotz, S. A., & Friederici, A. D. (2002). Sex differentiates the role of emotional prosody during word processing. Cognitive Brain Research, 14, 228−233.

Schupp, H. T., Junghofer, M., Weike, A. I., & Hamm, A. O. (2003). Emotional facilitation of sensory processing in the visual cortex. Psychological Science, 14, 7−13.

Schupp, H. T., Stockburger, J., Codispoti, M., Junghofer, M., Weike, A. I., & Hamm, A. O. (2007). Selective visual attention to emotion. The Journal of Neuroscience, 27, 1082−1089.

Schwarz, N. (2004). Metacognitive experiences in consumer judgment and decision making. Journal of Consumer Psychology, 14, 332−348.

Strange, B. A., Henson, R. N. A., Friston, K. J., & Dolan, R. J. (2000). Brain mechanisms for detecting perceptual, semantic, and emotional deviance. NeuroImage, 12, 425−433.

Tabert, M. H., Borod, J. C., Tang, C. Y., Lang, G., Wei, T. C., Johnson, R., Nusbaum, A. O., & Buchsbaum, M. S. (2001). Differential amygdala activation during emotional decision and recognition memory tasks using unpleasant words: An fMRI study. Neuropsychologia, 39, 556−573.

Viswanathan, M., & Childers, T. L. (2003). An enquiry into the process of categorization of pictures and words. Perceptual and Motor Skills, 96, 267−287.

Wager, T., Phan, K. L., Taylor, S. T., & Liberzon, I. (2003). Valence, gender, and lateralization of functional brain anatomy in emotion: A meta-analysis of findings from neuroimaging. NeuroImage, 19, 513−531.

Werheid, K., Alpay, G., Jentzsch, I., & Sommer, W. (2005). Priming emotional facial expressions as evidenced by event-related brain potentials. International Journal of Psychophysiology, 55, 209−219.

Winkielman, P., Schwarz, N., Fazendeiro, T. A., & Reber, R. (2003). The hedonic marking of processing fluency: Implications for evaluative judgment. In J. Musch, & K. C. Klauer (Eds.), The psychology of evaluation: Affective processes in cognition and emotion (pp. 189−217). Mahwah, NJ: Erlbaum.

Wyer, R. S., Hung, I. W., & Jiang, Y. (2008). Visual and verbal processing strategies in comprehension and judgment. Journal of Consumer Psychology, 18, 244−257.


Zald, D. H. (2003). The human amygdala and the emotional evaluation of sensory stimuli. Brain Research Reviews, 41, 88−123.

Zeelenberg, R., Wagenmakers, E. J., & Rotteveel, M. (2006). The impact of emotion on perception. Psychological Science, 17, 287−291.

Zhang, Q., Guo, C., Lawson, A., & Jiang, Y. (2006). Electrophysiological correlates of visual affective priming. Brain Research Bulletin, 71, 316−323.