NeuroImage 18 (2003) 675–684

www.elsevier.com/locate/ynimg
Emotional valence modulates activity in the posterior fusiform gyrus and inferior medial prefrontal cortex in social perception

Jacob Geday,a Albert Gjedde,a,b Anne-Sophie Boldsen,a and Ron Kupersa,b,*

a PET Center, Aarhus University and Aarhus University Hospital, Aarhus, Denmark
b Center for Functional Integrative Neuroscience (CFIN), The University of Aarhus, Aarhus, Denmark

Received 22 April 2002; revised 19 September 2002; accepted 21 October 2002

Abstract

Previous studies have shown that during the presentation of emotionally loaded visual stimuli, activity increases in the visual and limbic cortices. This study focuses on empathic reactions induced by presenting pictures of situations and facial expressions from a "third party" point of view only. We measured regional cerebral blood flow (rCBF) in nine healthy subjects while they were looking at neutral, positive, or negative emotional pictures of low (facial expressions) and high (persons in real-life situations) social complexity. A significant rCBF increase occurred in the right posterior fusiform gyrus during presentation of emotional pictures of both low and high social complexity. We also observed an interaction between emotionality and social complexity in the left inferior occipital gyrus, where emotionality produced a significantly larger rCBF increase for situations than for faces. No significant rCBF changes were observed in the amygdala or other parts of the limbic system. A significant rCBF decrease was found in the right inferior medial prefrontal cortex during presentation of the emotional pictures. This is discussed with respect to the "default mode of the brain" theory. We suggest that there is a neural network in the posterior fusiform and inferior occipital gyrus specialized in identifying emotionally important visual clues. Messages from this and other areas converge on the medial prefrontal cortex, to be evaluated in terms of relevance for attention. We believe that this is a crucial part of a network used in normal empathic reactions and social interactions. © 2003 Elsevier Science (USA). All rights reserved.

Keywords: Emotion; Social perception; Prefrontal cortex; Fusiform gyrus; Empathy; Inferior occipital gyrus

* Corresponding author. CFIN, Aarhus University Hospitals, Nørrebrogade 44, DK-8000 Aarhus, Denmark. Fax: +45-8949-4400. E-mail address: [email protected] (R. Kupers).

1053-8119/03/$ – see front matter © 2003 Elsevier Science (USA). All rights reserved. doi:10.1016/S1053-8119(02)00038-1

Introduction

Emotional social perception, here defined as the ability to assess the emotional valence of a social situation or the emotional state of a fellow human, is of obvious importance for an individual's survival and well-being. When confronted with a horrible scene or with faces expressing despair or anger, it is often appropriate to intervene or to escape. Rolls (1990) proposed the existence of primary reinforcers, universal unlearned rewards and punishers, to explain human emotion and motivation, drawing especially on single-cell recordings from the inferior prefrontal cortex to support this theory. Primary reinforcers become coupled by experience to learned, often socially defined, secondary reinforcers, to such a degree that the secondary reinforcers may end up being perceived as more rewarding or punishing than the primary ones. For higher primates living in social groups, it is especially important to keep track of the different reinforcers (Rolls, 2000a). Based on findings from numerous neurophysiological studies in humans and animals, Lang and colleagues (1997) introduced the term "natural selective attention" to describe the fact that any normal individual is more likely to attend to stimuli of evolutionary importance than to others. When first exposed to a new picture, reaction times to probes are significantly slower for emotional than for neutral pictures (Bradley et al., 1992). This tendency to dwell on emotional pictures is also seen in free-viewing paradigms: when normal subjects can choose how long to view a picture, unpleasant or pleasant pictures are viewed longer than neutral ones (Hamm, 1997; Lang et al., 1998).

From an evolutionary point of view, one could postulate the existence of dedicated brain areas for the assignment of emotional valence and attention to social situations. Neuroimaging studies have indeed shown that numerous brain regions are involved in emotional social perception. However, the exact areas activated depend on the type of emotion examined. For instance, fearful visual stimuli typically activate the amygdala (Adolphs, 1995), whereas sexually arousing pictures (Rauch et al., 1999), beautiful faces (Aharon et al., 2001), or happy faces (Whalen et al., 1998) activate the basal ganglia. The results also depend on whether a cognitive demand is imposed on the subjects while they are looking at the emotional stimuli. For instance, the insular region seems to be recruited only when subjects perform a cognitively demanding task in conjunction with emotional perception (Phan et al., 2002).

The main objective of this study was to investigate the cerebral response pattern to emotionally loaded social perception. We focused on empathic reactions to social situations and facial expressions from a "third-party" point of view. Therefore, no pictures showing aggression directed toward the observer, sexual activities, or angry or fearful facial expressions were presented. Since none of the pictures represented a direct threat or reward for the observer, the induced emotional state should derive from an assessment of how the depicted persons felt. The second objective was to study whether emotionality in pictures of low social complexity (operationalized as pictures displaying mainly emotional facial expressions) activates different areas than emotionality in pictures of high social complexity (operationalized as pictures of emotionally loaded social situations in which facial expressions are absent or have little relevance for assessment of the situation).
Our regions of interest were the inferior medial prefrontal cortex, the anterior cingulate cortex, the inferior temporal cortex, especially the fusiform gyrus, and the amygdala. All regions have been shown to be activated in emotional activation paradigms (see Phan et al., 2002, for a recent review).

PET data acquisition and analysis

Regional cerebral blood flow (rCBF) was measured with an ECAT Exact HR47 PET camera (Siemens/CTI, Knoxville, TN, USA) in 3D mode following fast bolus injections of 500 MBq of H2 15O into the left antecubital vein. A single 60-s frame was acquired, starting at 60,000 true counts/s. Successive scans were separated by at least 12-min intervals. Visual stimuli were presented throughout the entire scanning window. Subjects were scanned in six different experimental conditions (see below), each repeated twice. PET images were reconstructed after scatter correction (Watson et al., 1996) and measured attenuation correction. The 47 3.1-mm-thick slices were filtered to 16 mm FWHM isotropic (Hanning filter cut-off frequency = 0.15 cycles/s). PET images were realigned using Automatic Image Registration (AIR) software to correct for head movements between scans (Woods et al., 1992). For anatomical localization of activation sites, T1-weighted magnetic resonance imaging (MRI) was performed on a GE Signa 1-T scanner, providing slices of 1.5-mm thickness. The first PET image was coregistered to each individual's MRI. PET and MRI data were mapped into standardized stereotaxic space (Talairach and Tournoux, 1988) using a nine-parameter affine transformation. After a pixel-by-pixel regression of PET volumes using the local voxel SD, t-statistical maps were calculated. rCBF measured during the emotional (pleasant and unpleasant) picture series was regressed on a voxel-by-voxel basis against rCBF measured during the neutral pictures. The same was done for grouping, where rCBF measured during facial pictures was regressed against rCBF measured during situational pictures. Corrected P values for local maxima were calculated according to the method described by Worsley et al. (1996) for image volumes with a nonuniform SD. It should be noted that the t-threshold values differ according to the number of scans used in the regressions.
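The voxel-wise regression described above can be sketched in a few lines. This is a hedged illustration only, not the authors' actual pipeline (which used a nonuniform-SD correction per Worsley et al., 1996); the function name and the simple two-level regressor (+1 emotional, −1 neutral) are our own simplification.

```python
import numpy as np

def voxelwise_tmap(scans, regressor):
    """t-map for a two-condition contrast via simple linear regression.

    scans:     (n_scans, n_voxels) array of normalized rCBF images.
    regressor: (n_scans,) condition codes, e.g. +1 emotional, -1 neutral.
    Returns a (n_voxels,) array of t values for the regressor's slope.
    """
    scans = np.asarray(scans, dtype=float)
    x = np.asarray(regressor, dtype=float)
    x = x - x.mean()                      # center so the intercept absorbs the mean
    sxx = x @ x
    beta = (x @ scans) / sxx              # slope per voxel
    fitted = scans.mean(axis=0) + np.outer(x, beta)
    resid = scans - fitted
    dof = scans.shape[0] - 2              # two parameters: intercept and slope
    sigma2 = (resid ** 2).sum(axis=0) / dof
    return beta / np.sqrt(sigma2 / sxx)   # t = beta / SE(beta)
```

With equal numbers of scans per condition this t is equivalent to an unpaired two-sample t test at each voxel; the resulting map would then be thresholded with a multiple-comparisons correction as in the study.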
In addition to the group analysis, a single-subject analysis was also performed for all significant local maxima found in the group analysis. Activity was measured in a sphere (radius 4 mm) around the x, y, z stereotaxic coordinates of each significant local maximum and normalized by subtracting the mean activity in the scan.
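The sphere sampling used in the single-subject analysis can be sketched as follows. This is a hedged illustration assuming an isotropic voxel grid already resampled into stereotaxic (mm) space; the function and argument names are hypothetical.

```python
import numpy as np

def sphere_mean(volume, voxel_size_mm, center_mm, radius_mm=4.0):
    """Mean activity in a sphere around a stereotaxic coordinate,
    normalized by subtracting the whole-scan mean.

    volume:        3-D array of rCBF values on an isotropic grid.
    voxel_size_mm: edge length of one voxel in mm.
    center_mm:     (x, y, z) of the local maximum, in mm from the volume origin.
    """
    volume = np.asarray(volume, dtype=float)
    # mm coordinates of every voxel
    idx = np.indices(volume.shape, dtype=float) * voxel_size_mm
    center = np.asarray(center_mm, dtype=float).reshape(3, 1, 1, 1)
    dist = np.sqrt(((idx - center) ** 2).sum(axis=0))
    in_sphere = dist <= radius_mm
    # normalize by subtracting the mean activity in the scan, as described
    return volume[in_sphere].mean() - volume.mean()
```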

Methods

Subjects

Nine male subjects (mean age: 43 years, SD: 11.5 years) participated in the study. The study was approved by the local ethics committee and subjects gave written informed consent. All subjects were unmedicated and without known psychiatric or neurological illnesses. The PET data of one subject were excluded from the group analysis because of major anatomical differences between the subject's brain anatomy and the average brain template. Manual identification of the anatomical regions of interest, however, made it possible to include this subject in the single-subject analysis.

Visual stimuli

We used the Empathy Picture System (EPS), which was developed in our laboratory (Geday et al., 2001). The EPS consists of 12 picture series, each containing 30 pictures showing real persons in real-life events. All pictures were selected from international newspaper and private photo databases. Pictures were divided according to their social complexity: half of the picture series show mainly faces (low social complexity) and the remaining show social situations (high social complexity). Picture series were further subdivided according to their emotional valence (positive, negative, or neutral). No person appeared in more than one picture. Great care was taken to match the picture series in terms of complexity across the three emotional valence categories. Since the number of persons per picture is higher in the situational than in the facial sets, the former tend to be more complex. All situational pictures are of the "third-party perspective" type only, semantically oriented away from the observer. None of the pictures shows erotic activities or scenes that could be perceived as a direct threat or reward by the observer. For the facial pictures the emotional categories are sadness/despair, neutral, and happiness. The situational pictures show situations that are disgusting/horrible, neutral, or nice/happy. More concrete examples and validation procedures are available at www.geday.net/EPS.

Table 2
Significant rCBF changes for the contrast "situations" versus "faces"

Anatomical region                   BA     x     y     z      t        P
rCBF increases
  Middle occipital/temporal gyrus   19   −33   −81    18    7.42    <0.00001
  Middle occipital/temporal gyrus   19    39   −77    21    6.60    <0.00001
  Cuneus                            17    13   −91     6    5.21     0.01
  Lingual gyrus                     18   −12   −88    −9    4.96     0.01
  Fusiform gyrus                    19   −24   −76   −13    5.35     0.00
  Fusiform gyrus                    19   −25   −55   −12    4.96     0.01
rCBF decreases
  Middle temporal gyrus             21   −52   −14   −15   −4.65     0.05
  Superior temporal gyrus           22   −54    12     0   −4.36     0.05

Note. t and P values refer to the Situations–Faces contrast.

Stimulus presentation

During PET scanning, the 12 picture series were presented in a randomized order. During each scan, one EPS picture series was presented. The pictures were presented one at a time on a 21-inch color monitor placed 70 cm from the subjects' eyes. Each picture was shown for 3 s and was immediately followed by the next one. The 3-s display period without gap between successive pictures was crucial to our study design and was inspired by the theoretical framework of Pöppel (see Discussion). The subjects were instructed to look carefully at the pictures, but no explicit recognition or categorization was requested. Immediately after every scan, the subjects had to indicate whether they agreed with the grouping and emotional valence of the pictures (e.g., "do you agree that you saw pictures of faces you felt were unpleasant?"). After the PET investigation, the subjects were shown the pictures again and had to evaluate each picture on a scale from −3 to 3, with −3 as most unpleasant, 0 as neutral, and 3 as most pleasant. In addition, subjects were asked to categorize each picture as either "facial" or "situational."

Table 1
Emotional valence ratings of pictures (one row per picture series; two series per condition)

Group      Valence      Score   % agreement   1% confidence interval
Facial     Pleasant      1.28   100           0.11
                         1.29   100           0.11
           Neutral       0.21    91           0.11
                         0.08    99           0.10
           Unpleasant   −1.41    96           0.16
                        −1.18    98           0.14
Situation  Pleasant      1.03    94           0.18
                         0.99    99           0.17
           Neutral       0.14    94           0.19
                         0.17    97           0.17
           Unpleasant   −2.17    99           0.22
                        −1.87    99           0.20

Results

Behavioral tests

There was complete agreement on the general grouping and valence of each of the 12 EPS picture series shown during the PET study. The individual rating of each picture after the PET study also showed high agreement in terms of grouping (Table 1). The average agreement across all pictures was 97.2 ± 2.8% (range 91–100%). The highest agreement was found for the emotional (pleasant and unpleasant) facial pictures. The unpleasant situational pictures were rated more unpleasant than the unpleasant facial pictures (P < 0.01; unpaired Student t test). Conversely, pleasant facial pictures were rated significantly more pleasant than pleasant situational pictures (P < 0.05; unpaired Student t test).

PET data analysis

Situational versus facial pictures

A direct comparison of all situational with all facial scans revealed highly significant bilateral rCBF increases in a region covering the middle occipito-temporal gyri (BA 19) during the presentation of situational pictures (Table 2). In addition, we observed significant rCBF increases in the cuneus, the left fusiform gyrus, and the lingual gyrus. In contrast, the anterior part of the left temporal lobe was more activated by faces than by situations. The focus of this activation was situated in the upper part of the superior temporal sulcus, extending into the middle temporal gyrus.

Emotional versus neutral pictures

Fig. 1 and Table 3 show rCBF changes when comparing the emotional with the neutral pictures. A significant rCBF increase was observed in the posterior part of the right fusiform gyrus during emotional picture presentation (Fig. 1). A significant rCBF decrease occurred in the right inferior

medial prefrontal cortex (Fig. 2). Single-subject analyses confirmed both findings (Figs. 1 and 2, right). When the pleasant and unpleasant series were analyzed separately, the rCBF increase in the right fusiform area was significant for the unpleasant but not for the pleasant pictures. Using the individual ratings in the PET regression analysis did not significantly alter the t-map results compared to a simple regression with regressor values 1 and −1. The only difference was a nonsignificant trend toward slightly lower fusiform activations and more pronounced medial inferior prefrontal deactivations. The same holds true for the other analyses.

Table 4 shows rCBF changes caused by the emotional content separately for the facial and situational pictures. The rCBF increase in the fusiform gyrus by emotional content was more pronounced for the situational than for the facial pictures. The rCBF decrease in the right medial inferior prefrontal cortex was significant for the unpleasant situational series but remained below the threshold of statistical significance for the unpleasant facial pictures. Compared to neutral situations, unpleasant situations significantly activated the left inferior occipital gyrus near the fusiform gyrus. In contrast, pleasant situations significantly activated the left precuneus. No rCBF changes were observed in these regions when emotional faces were compared with neutral faces.

Fig. 1. Significant rCBF increase in the right posterior fusiform gyrus during presentation of emotional compared to neutral stimuli. (Left) Result of the group analysis; (right) results of the individual subject analyses. A rCBF increase was observed in all nine subjects.

Fig. 2. Significant rCBF decrease in the inferior medial prefrontal cortex during presentation of the emotional compared to the neutral stimuli. (Left) Result of the group analysis; (right) results of the individual subject analyses. A rCBF decrease was observed in all nine subjects.

Table 3
Effect of emotional valence on picture processing (faces and situations combined)

                                               Emotional−Neutral   Unpleasant−Neutral   Pleasant−Neutral
Anatomical region          BA    x    y    z      t       P            t       P            t       P
rCBF increases
  Fusiform gyrus           19   42  −72   −9     4.87    0.01         4.95    0.01         3.75    ns
rCBF decreases
  Medial frontal gyrus     10   15   51   −8    −5.23    0.01        −6.14    0.0001      −3.55    ns
  Medial frontal gyrus     11   14   49  −14    −4.37    0.05        −5.30    0.001
  Superior frontal gyrus   11   13   52  −11    −4.75    0.01        −5.35    0.001
  Posterior dorsal insula  13  −31  −21   23    −4.03    ns                               −3.16    ns

Interaction between emotionality and picture complexity

The interaction between emotionality and picture complexity was assessed by the following contrast: (emotional situations − neutral situations) − (emotional faces − neutral faces). Compared to faces, the emotional situational series produced a significantly larger rCBF increase in the left inferior occipital gyrus (Table 5 and Fig. 3). The single-subject analyses revealed this to be the case for eight of the nine subjects (Fig. 3, right).
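The interaction contrast above is simply a weighted combination of the four condition means. A minimal sketch, using hypothetical rCBF values purely for illustration:

```python
import numpy as np

# Condition means (hypothetical rCBF values) in the order:
# emotional situations, neutral situations, emotional faces, neutral faces
means = np.array([52.0, 50.0, 50.5, 50.0])

# (emotional situations - neutral situations) - (emotional faces - neutral faces)
weights = np.array([1.0, -1.0, -1.0, 1.0])
interaction = float(weights @ means)
```

A positive value indicates that emotionality boosted rCBF more for situations than for faces, which is the pattern reported here for the left inferior occipital gyrus.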

Table 4
Effect of emotional valence on "facial" and "situational" picture processing

Anatomical region                    BA     x     y     z   P−N Face    P−N Situation   U−N Face    U−N Situation
rCBF increases
  Fusiform gyrus                     19    41   −74    −7   3.12 (ns)   4.17 (ns)       3.87 (ns)
  Inferior occipital gyrus           18   −42   −80    −6               3.16 (ns)                   6.68 (0.005)
  Retrosplenial cingulum/precuneus  31/7  −11   −57    36               6.13 (0.01)
rCBF decreases
  Medial frontal gyrus               10    15    49    −9                               −4.02 (ns)  −4.75 (0.05)

Note. P−N, pleasant minus neutral; U−N, unpleasant minus neutral. Cells give t (P).

Table 5
Interaction between picture emotionality and complexity

Anatomical region                    BA     x     y     z      t      P
Inferior temporal/occipital gyrus    19   −42   −75    −2    4.35   0.05
Superior temporal gyrus              21   −60     1   −12   −4.28   0.05

Discussion

This study investigated the effect of emotional valence on neural activation during visual processing of social stimuli. As predicted, we found that activity in the fusiform gyrus was modulated by emotional valence, regardless of whether the images showed facial expressions or complex social interactions. Our methodology differs in three important respects from the majority of previous studies on emotional processing. First, we made a strict distinction between simple (faces) and complex (situations) social stimuli. Whereas the former predominantly show facial expressions, the latter show persons in real-life situations. Previous studies either exclusively studied responses to emotional facial expressions (Blair et al., 1999; Morris et al., 1998) or used a mixture of faces, situations, and objects (e.g., Lane et al., 1997a; Paradiso, 1999). Second, we deliberately avoided the use of a cognitive task in conjunction with the presentation of the stimuli because of the known interactions between cognition and emotion (Simpson et al., 2000). Third, we presented stimuli for a relatively short period (3 s) with no gap between the images. We hypothesize that the lack of a specific cognitive instruction together with the brief presentation time profoundly influenced the activity in the inferior medial prefrontal cortex (see below). Pöppel (1997) showed that it takes only 3 s to perceive a complex visual scene. When images are shown for a longer period, cognitive mechanisms that try to come to an alternative interpretation of the stimulus are elicited (Pöppel, 1997). Others have obtained results indicating that 3 s might be a critical time period in emotional perception; e.g., Bradley and colleagues (1993) reported that within a 6-s presentation window of an affective picture, blink inhibition was reduced after 3 s.
Codispoti and colleagues (2001) further showed that startle reflex potentiation for briefly displayed (500 ms) aversive pictures was maximal 3 s after picture offset.

Fusiform gyrus

The results of this study show that activity in the posterior fusiform area is modulated by emotional valence. This is in line with results of some previous neuroimaging studies (Dolan et al., 1996; Halgren et al., 2000; Iidaka et al., 2001; Paradiso et al., 1999; Vuilleumier et al., 2001). The present study makes two major additions to the earlier findings. First, our data show that the role of the fusiform gyrus in emotional visual perception is more general than hitherto acknowledged. Whereas previous studies showed increased activity in the right fusiform area for communicative signals based on facial gestures, the present data provide evidence for a more general role of this area in emotional social perception. Second, the right fusiform gyrus was more active than the left during emotional processing, irrespective of whether faces or socially complex stimuli were presented. This is in line with the general view that the right hemisphere is specialized in emotional processing (Adolphs et al., 1996; Ahern et al., 1991; Kolb and Taylor, 1981). The rCBF increase was more pronounced for the unpleasant and situational than for the pleasant and facial pictures. Whereas the rCBF increase in the right fusiform gyrus was significant for the situational pictures, it remained just below the threshold for statistical significance for the facial pictures. This might to some extent be explained by the lower valence ratings of the facial compared to the situational pictures. Using a mixture of facial and nonfacial stimuli, Lane and colleagues (1997b) reported a significant bilateral rCBF increase in the occipito-temporal cortex for unpleasant but not for pleasant pictures. Their stereotaxic coordinates for the right occipito-temporal cortical activation (x = 48; y = −68; z = −4) closely resemble those of our posterior fusiform activation (x = 42; y = −72; z = −9). Paradiso and colleagues (1999) obtained very similar results. The studies differ, however, in that we observed a significant right-sided increase only, while Lane and colleagues reported a bilateral rCBF increase. A possible explanation for this difference might be the number of situational pictures in the two series. When we tested our data for the situational pictures only, we also observed a significant left occipito-temporal rCBF increase for the unpleasant compared to the neutral pictures (see below).
Nakamura and colleagues (2000) showed that the posterior part of the fusiform gyrus responds not only to faces but also to complex scenes. They concluded that this area is mainly involved in the extraction of physical features of complex visual images. Since our situational pictures are more complex than the facial ones, our findings are in line with this hypothesis. Another possible explanation is that the unpleasant situational pictures were considered more arousing than the facial ones. Facial and situational pictures differed in the left inferior occipital activation: unpleasant compared to neutral situations significantly activated the left inferior occipital gyrus (BA 18). A similar but nonsignificant trend was found when comparing pleasant to neutral situations. No rCBF changes were found in this region when comparing unpleasant (or pleasant) with neutral faces. Accordingly, the interaction analysis shows a significantly greater contribution from the left inferior temporal gyrus for the situational than for the facial pictures. This supports the general notion that the left hemisphere is more engaged by semantic processing (e.g., Billingsley et al., 2001). Since our situational pictures are semantically more complex than the facial ones, they require more processing.


Prefrontal cortex

An unexpected finding was that activity in the inferior medial prefrontal cortex was significantly lower during presentation of emotional than of neutral pictures. Several other studies reported rCBF increases in this area during emotional processing (e.g., Damasio, 1994; Iidaka, 2001; Lane et al., 1997a, 1997b; Reiman et al., 1997; Rolls, 2000b). A recent meta-analysis of studies on emotional processing showed that the medial prefrontal cortex is the most frequently activated brain area in response to emotional stimuli, especially when a parallel cognitive task is involved (Phan et al., 2002). We deliberately avoided the use of an explicit cognitive task during presentation of the images. We further tried to minimize the occurrence of idiosyncratic cognitive activity by using a 3-s presentation time without gap between successive stimuli (see Bradley et al., 1993; Codispoti et al., 2001; Pöppel, 1997). Our findings are therefore compatible with the hypothesis of an attentional role of the medial prefrontal cortex (Drevets and Raichle, 1998; Simpson et al., 2001a, 2001b). According to this theory, the main task of the medial prefrontal region is to maintain attention and choose between relevant inputs from other brain areas. Therefore, in a "perception only" paradigm like this, salient and, from an evolutionary point of view, high-priority messages conveying emotional information facilitate the attentional choice for the emotional compared to the neutral pictures, resulting in relatively lower activity in the medial inferior prefrontal cortex. The same theory also explains why a rCBF increase may occur in this region when an explicit cognitive task is given in conjunction with the emotional pictures. In this case, the task of the inferior prefrontal cortex becomes more difficult because of a dual source of emotional and cognitive input.
In addition, the cognitive task is often at odds with the emotional content of the images (e.g., a bloody war scene where subjects have to decide whether it is an outdoor or indoor scene), which further increases the attentional conflict. In a recent study (Geday et al., in preparation), we used the same experimental design as in the present study, with the difference that the subjects had seen the pictures before the start of the study. Interestingly, the prefrontal deactivation observed in the present study was no longer present. In that case, the medial prefrontal cortex receives messages from both emotion- and memory-related areas, which makes the attentional choice equally difficult for the emotional and the neutral pictures. Studies that reported a rCBF increase in the medial prefrontal cortex all displayed images for longer than 3 s (e.g., Lane et al. (1997a): 6 s; Lang et al. (1998): 12 s; Taylor et al. (2000): 5 s) or added a time gap between successive stimuli, bringing the total time over 3 s (e.g., Blair et al. (1999): 3 s + 2-s pause). In contrast, Paradiso and colleagues (1999) displayed pictures for only 2 s without gap between the pictures, and reported a significant deactivation of the medial prefrontal cortex during presentation of the emotional pictures. According to Pöppel (1997), cognitive processes that try to come to an alternative interpretation of the stimulus are elicited when an image is presented for more than 3 s. In this sense, a stimulus presentation duration of more than 3 s is conceptually similar to a situation where subjects are asked to perform a parallel cognitive task. This may explain why Paradiso and colleagues and we reported a rCBF decrease in the medial prefrontal cortex, while studies using longer stimulus presentation times or a parallel cognitive task have reported a relative increase in activity in this area. Any cognitive task, explicit or implicit, would represent a dual input to the medial prefrontal cortex during emotional picture presentation. In contrast, when neutral images are shown, the cognitive task is the only "important" input. This makes the attentional choice relatively more difficult for the emotional pictures, resulting in relatively higher activity in the medial inferior prefrontal cortex.

Amygdala

We did not find significant rCBF increases in the amygdala or other parts of the limbic system during the presentation of emotional pictures. Even when using a very low t threshold, no trend toward amygdalar activation was found. Changing the FWHM to 12 mm did not alter this. Several positron emission tomography and functional magnetic resonance imaging (fMRI) studies have shown amygdalar activity during the presentation of visual emotional stimuli (Dolan et al., 1996; Iidaka et al., 2001; Morris et al., 1998; Paradiso et al., 1999; Phillips et al., 1997; Vuilleumier et al., 2001). However, there is little consensus on the exact role of the amygdala in visual emotional processing. A possible reason for this may be that separate neural systems mediate the recognition of different types of emotions.
Adolphs and colleagues (1994) showed that damage to the amygdala preferentially impairs the recognition of fearful facial expressions while leaving the recognition of other emotions relatively intact. Furthermore, it has been shown that fearful faces preferentially activate the (left) amygdala. A study by Phillips and colleagues (1997) showed amygdalar activation for fearful faces but not for disgust. A recent meta-analysis (Phan et al., 2002) confirmed that fear is the emotion that most consistently activates the amygdala. Our negative emotional pictures mostly showed facial expressions of sadness and sorrow, or disgusting and horrible "third-person perspective" scenes from wars or disasters. This may be another reason why we failed to show amygdalar activation. Methodological reasons may be invoked as well. Since the amygdalar response habituates rapidly (Breiter et al., 1996; Irwin et al., 1996), an attenuation of the amygdalar response may have occurred because our emotional stimuli were shown continuously during the 60-s scanning window.


Fig. 3. Significant interaction between social complexity and emotional valence in the left inferior occipital gyrus. (Left) Result of the group analysis; (right) this interaction was observed in eight of the nine subjects.

Methodological remarks

PET studies typically require a block design, which may induce mental sets and raises the problems of habituation and anticipation. These problems can be circumvented by event-related fMRI designs. Notwithstanding these potential pitfalls, fMRI studies have not reported substantially different results for the areas that showed significant rCBF changes in the present study, with the possible exception of the amygdala (see above). Future studies directly comparing activations elicited by block and event-related designs are needed to clarify this further. We did not use the more widely used International Affective Picture System (IAPS) because it did not allow us to extract the required number of pictures for each of the six categories. The pictures of the EPS are matched in terms of semantics and social complexity, but have not been validated for physical complexity (JPEG file size, spatial frequency distribution, etc.). Taylor and colleagues (2000) recently addressed this issue by comparing emotional and neutral stimuli matched for physical characteristics; they found increased activity in the posterior fusiform area for aversive stimuli that were matched in terms of visual complexity. Since we did not measure eye movements during scanning, it might be argued that our results are confounded by eye-movement artifacts. However, using an activation paradigm similar to ours while simultaneously recording eye movements, Lang and colleagues (1998b) demonstrated a right posterior fusiform activation that correlated with emotional valence but not with eye movements. Taken together, the most parsimonious explanation of our data is that they reflect differences in emotional valence rather than differences in stimulus complexity or eye movements.
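For readers less familiar with block-design subtraction analyses, the contrast logic underlying the condition comparisons discussed above can be sketched in a few lines. This is an illustrative sketch only, not the analysis pipeline used in this study; the per-subject rCBF values and condition labels below are hypothetical.

```python
# Illustrative sketch of a block-design PET subtraction contrast at one voxel:
# per-subject mean rCBF in an emotional vs. a neutral condition, compared
# with a paired t-statistic. All numbers are hypothetical.
from statistics import mean, stdev
from math import sqrt

def paired_t(condition_a, condition_b):
    """Paired t-statistic for per-subject values in two conditions."""
    diffs = [a - b for a, b in zip(condition_a, condition_b)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical normalized rCBF values for nine subjects at one voxel.
emotional = [52.1, 50.3, 53.4, 51.8, 49.9, 52.7, 50.8, 53.0, 51.2]
neutral   = [50.0, 49.1, 51.2, 50.5, 48.7, 50.9, 49.6, 51.4, 50.1]

t = paired_t(emotional, neutral)
print(round(t, 2))
```

In practice this paired comparison is computed at every voxel, and the resulting statistical map is then thresholded with a correction for multiple comparisons across the brain volume (e.g., the unified approach of Worsley et al., 1996).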

Conclusion

Our data suggest that the posterior fusiform area may be involved in a broader identification of emotionally important clues in social perception. Although this is mainly a right-hemispheric phenomenon, the left fusiform gyrus also becomes activated as social complexity increases. Messages from these and other areas converge on the right inferior medial prefrontal cortex, to be evaluated in terms of relevance for attention. We believe that this network is crucial to normal empathic reactions and social interactions.

Acknowledgments

We thank Chris Frith (FIL, London) and Richard Lane (Arizona) for helpful comments on an earlier version of the manuscript. We thank the technical staff for skillful assistance during PET scanning. We also thank Peter Neelin (McConnell Brain Imaging Centre, Montreal) and Flemming Andersen (PET Centre, Aarhus) for invaluable help with the statistical analysis. Finally, we thank "Nordfoto," "Jyllands-posten," and Alan Rowoth for kindly letting us use their photo material in the EPS.

References

Adolphs, R., Damasio, H., Tranel, D., Damasio, A.R., 1996. Cortical systems for the recognition of emotion in facial expressions. J. Neurosci. 16, 7678–7687.
Adolphs, R., Tranel, D., Damasio, H., Damasio, A., 1994. Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature 372, 669–672.
Adolphs, R., Tranel, D., Damasio, H., Damasio, A.R., 1995. Fear and the human amygdala. J. Neurosci. 15, 5879–5891.
Aharon, I., Etcoff, N., Ariely, D., Chabris, C.F., O'Connor, E., Breiter, H.C., 2001. Beautiful faces have variable reward value: fMRI and behavioral evidence. Neuron 32, 537–551.
Ahern, G.L., Schomer, D.L., Kleefield, J., Blume, H., Cosgrove, G.R., Weintraub, S., Mesulam, M.-M., 1991. Right hemisphere advantage for evaluating emotional facial expressions. Cortex 27, 193–202.
Billingsley, R.L., McAndrews, M.P., Crawley, A.P., Mikulis, D.J., 2001. Functional MRI of phonological and semantic processing in temporal lobe epilepsy. Brain 124, 1218–1227.
Blair, R.J., Morris, J.S., Frith, C.D., Perrett, D.I., Dolan, R.J., 1999. Dissociable neural responses to facial expressions of sadness and anger. Brain 122, 883–893.
Bradley, M.M., Cuthbert, B.N., Lang, P.J., 1993. Pictures as prepulse: attention and emotion in startle modification. Psychophysiology 30, 541–545.
Bradley, M.M., Greenwald, M.K., Petry, M.C., Lang, P.J., 1992. Remembering pictures: pleasure and arousal in memory. J. Exp. Psychol. Learn. Mem. Cogn. 18, 379–390.
Breiter, H.C., Etcoff, N.L., Whalen, P.J., Kennedy, W.A., Rauch, S.L., Buckner, R.L., Strauss, M.M., Hyman, S.E., Rosen, B.R., 1996. Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17, 875–887.
Codispoti, M., Bradley, M.M., Lang, P.J., 2001. Affective reactions to briefly presented pictures. Psychophysiology 38, 474–478.
Critchley, H.D., Elliott, R., Mathias, C.J., Dolan, R.J., 2000. Neural activity relating to generation and representation of galvanic skin conductance responses: a functional magnetic resonance imaging study. J. Neurosci. 20, 3033–3040.
Damasio, A.R., 1994. Descartes' Error: Emotion, Reason, and the Human Brain. Putnam's, New York.
Dolan, R.J., Fletcher, P., Morris, J., Kapur, N., Deakin, J.F., Frith, C.D., 1996. Neural activation during covert processing of positive emotional facial expressions. NeuroImage 4, 194–200.
Drevets, W.C., Raichle, M.E., 1998. Reciprocal suppression of regional cerebral blood flow during emotional versus higher cognitive processes: implications for interaction between emotion and cognition. Cogn. Emotion 12, 353–385.


Geday, J.A., Ehlers, L., Boldsen, A.S., Gjedde, A., 2001. The inferior temporal and orbitofrontal cortex in analysing emotional pictures. NeuroImage 13, S406.
Halgren, E., Raij, T., Marinkovic, K., Jousmaki, V., Hari, R., 2000. Cognitive response profile of the human fusiform face area as determined by MEG. Cereb. Cortex 10, 69–81.
Hamm, A.O., Cuthbert, B.N., Globisch, J., Vaitl, D., 1997. Fear and the startle reflex: blink modulation and autonomic response patterns in animal and mutilation fearful subjects. Psychophysiology 34, 97–107.
Iidaka, T., Omori, M., Murata, T., Kosaka, H., Yonekura, Y., Okada, T., Sadato, N., 2001. Neural interaction of the amygdala with the prefrontal and temporal cortices in the processing of facial expressions as revealed by fMRI. J. Cogn. Neurosci. 15, 1035–1047.
Irwin, W., Davidson, R.J., Lowe, M.J., Mock, B.J., Sorenson, J.A., Turski, P.A., 1996. Human amygdala activation detected with echo-planar functional magnetic resonance imaging. NeuroReport 7, 1765–1769.
Kolb, B., Taylor, L., 1981. Affective behavior in patients with localized cortical excisions: role of lesion site and side. Science 214, 89–91.
Lane, R.D., Reiman, E.M., Bradley, M.M., Lang, P.J., Ahern, G.L., Davidson, R.J., Schwartz, G.E., 1997a. Neuroanatomical correlates of pleasant and unpleasant emotion. Neuropsychologia 35, 1437–1444.
Lane, R.D., Reiman, E.M., Ahern, G.L., Schwartz, G.E., Davidson, R.J., 1997b. Neuroanatomical correlates of happiness, sadness, and disgust. Am. J. Psychiatry 154, 926–933.
Lang, P.J., Bradley, M.M., Cuthbert, B.N., 1998a. Emotion, motivation and anxiety: brain mechanisms and psychophysiology. Biol. Psychiatry 44, 1248–1263.
Lang, P.J., Bradley, M.M., Fitzsimmons, J.R., Cuthbert, B.N., Scott, J.D., Moulder, B., Nangia, V., 1998b. Emotional arousal and activation of the visual cortex: an fMRI analysis. Psychophysiology 35, 199–210.
Lang, P.J., et al., 1997. Motivated attention: affect, activation and action, in: Lang, P.J., et al. (Eds.), Attention and Orienting: Sensory and Motivational Processes. Erlbaum, Hillsdale, NJ, pp. 97–135.
Morris, J.S., Friston, K.J., Büchel, C., Frith, C.D., Young, A.W., Calder, A.J., Dolan, R.J., 1998. A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121, 47–57.
Nakamura, K., Kawashima, R., Sato, N., Nakamura, A., Sugiura, M., Kato, T., Hatano, K., Ito, K., Fukuda, H., Schormann, T., Zilles, K., 2000. Functional delineation of the human occipito-temporal areas related to face and scene processing. A PET study. Brain 123, 1903–1912.
Paradiso, S., Johnson, D.L., Andreasen, N.C., O'Leary, D.S., Watkins, G.L., Ponto, L.L., Hichwa, R.D., 1999. Cerebral blood flow changes associated with attribution of emotional valence to pleasant, unpleasant, and neutral visual stimuli in a PET study of normal subjects. Am. J. Psychiatry 156, 1618–1629.
Phan, K.L., Wager, T., Taylor, S.F., Liberzon, I., 2002. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. NeuroImage 16, 331–348.
Phillips, M.L., Young, A.W., Senior, C., Brammer, M., Andrew, C., Calder, A.J., Bullmore, E.T., Perrett, D.I., Rowland, D., Williams, S.C., Gray, J.A., David, A.S., 1997. A specific neural substrate for perceiving facial expressions of disgust. Nature 389, 495–498.
Pöppel, E., 1997. A hierarchical model of temporal perception. Trends Cogn. Sci. 1, 56–61.
Raichle, M.E., MacLeod, A.M., Snyder, A.Z., Powers, W.J., Gusnard, D.A., Shulman, G.L., 2001. A default mode of brain function. Proc. Natl. Acad. Sci. USA 98, 676–682.
Rauch, S.L., Shin, L.M., Dougherty, D.D., Alpert, N.M., Orr, S.P., Lasko, M., Macklin, M.L., Fischman, A.J., Pitman, R.K., 1999. Neural activation during sexual and competitive arousal in healthy men. Psychiatry Res. 30, 1–10.
Reiman, E.M., Lane, R.D., Ahern, G.L., Schwartz, G.E., Davidson, R.J., Friston, K.J., Yun, L.S., Chen, K., 1997. Neuroanatomical correlates of externally and internally generated human emotion. Am. J. Psychiatry 154, 918–925.
Rolls, E.T., 1990. A theory of emotion and its application to understand the neural basis of emotion. Cogn. Emot. 4, 161–190.


Rolls, E.T., 2000a. Précis of The Brain and Emotion. Behav. Brain Sci. 23, 177–191.
Rolls, E.T., 2000b. Memory systems in the brain. Annu. Rev. Psychol. 51, 599–630.
Simpson, J.R., Ongur, D., Akbudak, E., Conturo, T.E., Ollinger, J.M., Snyder, A.Z., Gusnard, D.A., Raichle, M.E., 2000. The emotional modulation of cognitive processing: an fMRI study. J. Cogn. Neurosci. 12 (Suppl. 2), 157–170.
Simpson, J.R., Snyder, A.Z., Gusnard, D.A., Raichle, M.E., 2001a. Emotion-induced changes in human medial prefrontal cortex: I. During cognitive task performance. Proc. Natl. Acad. Sci. USA 98, 683–687.
Simpson, J.R., Drevets, W.C., Snyder, A.Z., Gusnard, D.A., Raichle, M.E., 2001b. Emotion-induced changes in human medial prefrontal cortex: II. During anticipatory anxiety. Proc. Natl. Acad. Sci. USA 98, 688–693.
Talairach, J., Tournoux, P., 1988. A Co-planar Stereotaxic Atlas of the Human Brain. Thieme, Stuttgart.
Taylor, S.F., Liberzon, I., Koeppe, R.A., 2000. The effect of graded aversive stimuli on limbic and visual activation. Neuropsychologia 38, 1415–1425.

Vuilleumier, P., Armony, J.L., Driver, J., Dolan, R.J., 2001. Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron 30, 829–841.
Watson, C.C., Newport, D., Casey, M.E., 1996. A single scatter simulation technique for scatter correction in 3D PET, in: Grangeat, P., Amans, J.L. (Eds.), Three-Dimensional Image Reconstruction in Radiology and Nuclear Medicine. Kluwer Academic, Dordrecht, pp. 255–268.
Whalen, P.J., Rauch, S.L., Etcoff, N.L., McInerney, S.C., Lee, M.B., Jenike, M.A., 1998. Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. J. Neurosci. 18, 411–418.
Woods, R.P., Cherry, S.R., Mazziotta, J.C., 1992. Rapid automated algorithm for aligning and reslicing PET images. J. Comput. Assist. Tomogr. 16, 620–633.
Worsley, K.J., Marrett, S., Neelin, P., Vandal, A.C., Friston, K.J., Evans, A.C., 1996. A unified statistical approach for determining significant signals in images of cerebral activation. Hum. Brain Mapp. 4, 58–73.