Neuroscience Letters 333 (2002) 13–16 www.elsevier.com/locate/neulet
Gender differences in neural correlates of recognition of happy and sad faces in humans assessed by functional magnetic resonance imaging Tatia M.C. Lee a,b,*, Ho-Ling Liu c,d, Rumjahn Hoosain e, Wan-Ting Liao d, Chien-Te Wu d, Kenneth S.L. Yuen a, Chetwyn C.H. Chan f, Peter T. Fox g, Jia-Hong Gao g a
Neuropsychology Laboratory, Department of Psychology, The University of Hong Kong, Pokfulam Road, Hong Kong, Hong Kong b Sau Po Center on Ageing, The University of Hong Kong, Hong Kong, Hong Kong c Department of Medical Technology, Chang Gung University, Tao-Yuan, Taiwan d MRI Center, Chang Gung Memorial Hospital, Tao-Yuan, Taiwan e Cognitive Psychology Laboratory, Department of Psychology, The University of Hong Kong, Hong Kong, Hong Kong f Department of Rehabilitation Sciences, The Hong Kong Polytechnic University, Hong Kong, Hong Kong g Research Imaging Center, University of Texas Health Science Center, San Antonio, TX, USA

Received 19 June 2002; received in revised form 20 August 2002; accepted 21 August 2002
Abstract

To examine the effect of gender on the volume and pattern of brain activation during the viewing of alternating sets of faces depicting happy or sad expressions, 24 volunteers, 12 men and 12 women, participated in this functional magnetic resonance imaging study. The experimental stimuli were 12 photographs of Japanese adults selected from Matsumoto and Ekman's Pictures of Facial Affect. Four of these pictures depicted happy facial emotions, four sad, and four neutral. Half of the photographs were of men and the other half were of women. Consistent with previous findings, distinct sets of neural correlates for processing happy and sad facial emotions were noted. Furthermore, it was observed that male and female subjects used a rather different set of neural correlates when processing faces showing either happy or sad expressions. This was more noticeable when they were processing faces portraying sad emotions than happy emotions. Our findings provide some preliminary support for the speculation that the two genders may be associated with different areas of brain activation during emotion recognition of happy or sad facial expressions. This suggests that the generalizability of findings regarding the neural correlates of facial emotion recognition should take the gender of the subjects into account. © 2002 Elsevier Science Ireland Ltd. All rights reserved.

Keywords: Facial emotion recognition; Happy; Sad; Gender; Emotion
It was once speculated that right-hemisphere specialization accounted for the processing of emotional information (e.g. [13]). Subsequent clinical, electroencephalographic, and imaging data, however, have suggested that the recognition and processing of positive and negative emotions may be lateralized over both hemispheres (e.g. [3,5]). Furthermore, Northoff et al. [15] studied functional dissociation between medial and lateral prefrontal cortical activation during emotional processing in five men and five women and observed dissociable neural circuits for processing negative and positive emotions. Blair et al. [1] used positron emission tomography (PET) to study dissociable
* Corresponding author. Tel.: +852-2857-8394; fax: +852-2540-8920. E-mail address: [email protected] (T.M.C. Lee).
neural responses to facial expressions of sadness and anger and suggested the existence of dissociable, but interlocking, systems for the processing of distinct categories of negative facial expressions. Phillips et al. [16] recruited seven men and one woman to study happy and sad facial expression perception using functional magnetic resonance imaging (fMRI). They observed a signal increase predominantly in the left anterior cingulate gyrus, bilateral posterior cingulate gyri, the medial frontal cortex, and the right supramarginal gyrus, brain regions previously implicated in visuospatial and emotion processing tasks. However, no brain regions showed increased signal intensity during the presentation of sad facial expressions. Therefore, the existing evidence seems to suggest that there are certain distinct sets of neural correlates for processing different categories of emotion. Nonetheless, reaching a
0304-3940/02/$ - see front matter © 2002 Elsevier Science Ireland Ltd. All rights reserved. PII: S0304-3940(02)00965-5
Table 1
Regions of activation and laterality indexes when the male and female subjects were viewing photographs portraying happy or sad facial expressions a

Happy, male:     BA (L) 7,13,40          BA (R) 3,4,6           LI  0.3788
Happy, female:   BA (L) 6,7,9,13,40,Th   BA (R) 7,18,19,22,39   LI  0.2122
Sad, male:       BA (L) 10               BA (R) 10,38,Len       LI -0.4186
Sad, female:     BA (L) 40,Len           BA (R) 17              LI  0.2293

a Happy, faces portraying happy emotions; Sad, faces portraying sad emotions; L, left; R, right; BA, Brodmann's area (regions of activation, P < 0.05, corrected); Th, thalamus; Len, lentiform nucleus; LI, laterality index, expressed as (L - R) volume of activation divided by (L + R) volume of activation.
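The laterality index in Table 1 is a simple normalized difference of hemispheric activation volumes. As a minimal sketch (not the authors' analysis code; the volumes below are hypothetical illustrative numbers):

```python
def laterality_index(vol_left: float, vol_right: float) -> float:
    """(L - R) / (L + R) volume of activation.

    Ranges from +1 (entirely left-lateralized) to -1 (entirely
    right-lateralized); 0 indicates equal volumes in both hemispheres.
    """
    total = vol_left + vol_right
    if total == 0:
        raise ValueError("no suprathreshold activation in either hemisphere")
    return (vol_left - vol_right) / total

# Hypothetical example: 60 units of left-hemisphere volume vs. 40 on the
# right yields a mildly left-lateralized index of 0.2.
li = laterality_index(60.0, 40.0)
```

A positive index thus corresponds to relatively greater left-hemisphere activation, matching the sign convention of Table 1.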
conclusion about the laterality of positive and negative emotions has remained elusive, and contradictory findings have been reported (e.g. [3,5,16]), which may be due to methodological differences across studies [2]. One significant discrepancy is the differing gender composition of the samples used in these studies, which may contribute substantially to their divergent findings.

Women are more prone to affective disorders, which suggests that emotional processing in men and women may differ. Wild et al. [19] reported that female subjects were more susceptible than male subjects to emotion contagion. These observations also suggest that men and women process emotions differently. Kesler-West et al. [7] used fMRI to investigate explicit processing of facial emotions, including happiness and sadness. They observed that men showed greater left hemispheric activation when viewing sad than happy faces; no such difference between the emotions existed in either hemisphere among women. Lane et al. [10] used PET to study the neural correlates of pleasant and unpleasant emotions and observed both common and unique components of the neural networks mediating these emotions in healthy women. We therefore speculated that gender may correlate with neural activity during emotion recognition, and used fMRI to examine the effect of gender on the volume and areas of brain activation during the viewing of alternating sets of faces showing happy and sad emotions.

For the fMRI study, 24 volunteers, 12 men and 12 women (age range: 20-26 years), with academic qualifications up to postgraduate level, were monitored using a block fMRI design while viewing all three types of facial emotion: happy, sad, and neutral. Informed consent was obtained from all subjects after the nature of the study was explained.
Twelve photographs of Japanese adults with high validity (as demonstrated by the agreement levels obtained in a previous study by Matsumoto and Ekman [11]) were selected from Matsumoto and Ekman's [11] Pictures of Facial Affect. Four of these pictures depicted happy facial emotions, four sad, and four neutral. Half of the photographs were of men and the other half were of women. In the experimental conditions, these photos were arranged in pairs: one showed the happy or sad facial
emotion and the other portrayed the neutral facial emotion. The subject was to indicate, by pressing the response buttons, which of the pair of photos portrayed the target emotion (happy or sad) as instructed. In the control condition, pairs of photos of neutral facial expressions were presented, and the subjects were simply required to look at them. We did not require the subjects to press any button during the control condition in order to minimize mental fatigue.

The experiment was performed on a 1.5 T Magnetom Vision MRI scanner (Siemens, Erlangen, Germany) at the Chang Gung Memorial Hospital. The stimuli were shown through a goggle display system (Resonance Technology Inc., CA, USA). Prior to MR imaging, the subject was visually familiarized with the procedures and the experimental conditions to minimize anxiety and enhance task performance. Following this familiarization, the subject lay supine on the scanning table and was fitted with plastic ear-canal molds. The subject's head was immobilized by a tightly fitting, thermally molded, plastic facial mask that extended from the hairline to the chin [6].

A single-shot T2*-weighted gradient echo planar imaging sequence was used for the fMRI scans: slice thickness = 5 mm, in-plane resolution = 3.3 x 3.3 mm, and TR/TE/flip angle = 3000 ms/60 ms/90 deg. The field of view was 211 x 211 mm, and the acquisition matrix was 64 x 64. Twenty-four contiguous axial slices were acquired to cover the whole brain. The anatomical MRI was acquired using a T1-weighted, three-dimensional, gradient-echo pulse sequence, which provided high-resolution (1 x 1 x 1 mm3) images of the entire brain. There were two experimental conditions, happy or sad facial expression, and each block represented one of the two conditions.
In each block, the subject was first presented with the instruction for 2 s, then a visual fixation cross for 1 s; each pair of photos was then shown for 3 s, followed by a fixation cross for 1 s. With six pairs of photos per block (three of men and three of women), the total duration per condition was 27 s (2 + 1 + 6 x (3 + 1)). Each experimental condition was repeated three times, and the order of presentation was counter-balanced.

We used Matlab (The MathWorks, Inc., Natick, MA) and
Fig. 1. Functional maps: normalized activation brain maps averaged across subjects demonstrating the statistically significant activations (P < 0.05, corrected) when the male (n = 12) and female (n = 12) subjects were viewing photographs portraying happy or sad facial expressions. Planes are axial sections, labeled with the height (mm) relative to the bicommissural line. L, left hemisphere; R, right hemisphere. (a) Happy facial expressions; (b) sad facial expressions.
in-house software for image data processing [20]. Each subject's raw data were spatially smoothed by convolution with a three-dimensional, two-voxel (6.6 mm) full width at half maximum (FWHM) Gaussian kernel, and motion was corrected with a six-parameter, rigid-body algorithm using MEDx (Sensor Systems, Inc., Sterling, VA). Skull stripping of the three-dimensional T1-weighted MR images was done using Alice (Perceptive Systems, Inc., Boulder, CO) and MEDx. These images were then spatially normalized to the Talairach brain atlas using the Convex Hull algorithm [8]. The subjects in this study did not commit any errors during the experimental tasks.

The volume and pattern of activations were studied to understand the effect of gender on the processing of happy and sad facial emotions. The laterality indexes (Table 1) indicate that relatively more left hemisphere activation was associated with viewing pictures portraying happy facial expressions in both male and female subjects. However, when viewing faces depicting sad emotions, more left hemisphere activation was observed in the female subjects and more right hemisphere activation in the male subjects. Our findings suggest that the laterality of facial emotion processing is gender and emotion specific, which may explain the contradictory findings of laterality models of emotion processing reported in the literature. Nagae and Moscovitch [14] reported that explicit memory for emotional words depended more on the right hemisphere. Whether this phenomenon is also gender and emotion specific needs to be verified in future research.
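As an aside on the preprocessing described earlier: a Gaussian kernel's FWHM and its standard deviation are related by the fixed identity FWHM = 2 sqrt(2 ln 2) sigma (about 2.355 sigma), so the 6.6 mm FWHM kernel on 3.3 mm in-plane voxels corresponds to a sigma of roughly 0.85 voxels. A minimal sketch of this standard conversion (a hypothetical helper, not the authors' pipeline):

```python
import math

# Standard Gaussian identity: FWHM = 2 * sqrt(2 * ln 2) * sigma (~2.3548 * sigma).
FWHM_FACTOR = 2.0 * math.sqrt(2.0 * math.log(2.0))

def fwhm_to_sigma_voxels(fwhm_mm: float, voxel_mm: float) -> float:
    """Convert a smoothing-kernel FWHM in mm to a Gaussian sigma in voxel units."""
    return (fwhm_mm / voxel_mm) / FWHM_FACTOR

# The study's 6.6 mm FWHM kernel on 3.3 mm in-plane voxels:
sigma = fwhm_to_sigma_voxels(6.6, 3.3)  # roughly 0.85 voxels
```

Sigma in voxel units is the form most smoothing routines expect, which is why the conversion matters when reimplementing a kernel specified, as here, by its FWHM in mm.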
Regarding the pattern of activation (Table 1 and Fig. 1), when the happy faces were being viewed, bilateral frontal and left parietal activation was observed in both male and female subjects. However, the female subjects showed left thalamic and right occipital and temporal activation that was not observed in the male subjects. When the sad faces were being viewed, the patterns of activation of the male and female subjects were very different. The male subjects demonstrated bilateral frontal, right temporal, and right lentiform activation, whereas the female subjects showed left parietal, left lentiform, and right occipital activation. It seemed that viewing faces depicting sad emotions did not bring about significant frontal activity in the female subjects.

Regarding the regions of activation noted in this study, we observed activation of right Brodmann's area (BA) 38 in the male subjects when viewing sad faces, which was consistent with the report of Blair et al. [1], who used male subjects in their study. This may relate to the cueing of affect-laden autobiographical material to facilitate recall of negative emotions observed in some behavioral studies (e.g. [12]). Interestingly, this mechanism of emotional cueing was not observed when happy faces were being viewed, which provides further evidence for the speculation of dissociable neural mechanisms for processing happy and sad facial emotions. Furthermore, activation of left BA 40 in the female subjects when viewing both happy and sad faces was consistent with the report of Phillips et al. [16], who used both male and female subjects in their study. The activation of the left supramarginal gyrus
led to the speculation that this part of the brain plays a role in assisting the right hemisphere during demanding visuospatial tasks [17]. The activation of left BA 9 and the thalamus in female subjects, reported in some previous studies [10], was also observed in this study. However, some regions of activation reported previously (e.g. [1,9,16]) were not observed this time. This discrepancy is likely related to methodological differences, since we used prototypes rather than morphed pictures of varying degrees of emotional expression. For example, we did not observe activation of the cingulate region (e.g. [1]) when the subjects were viewing sad faces, which may be due to the lesser attentional demand of the experimental task used in our study. Similarly, the absence of significant amygdala activation (e.g. [1,18]) in our study may also relate to the different experimental stimuli used.

This study is one of very few to have examined gender differences in the processing of happy and sad facial emotions. Consistent with previous findings, distinct sets of neural correlates for processing happy and sad facial emotions were noted. Furthermore, male and female subjects used rather different sets of neural correlates when processing faces showing happy or sad expressions. This was more noticeable when faces portraying sad emotions were being processed (Table 1 and Fig. 1). Only when the male and female subjects were viewing and processing faces depicting happy expressions were some common regions of activation (namely BA 7, 13, and 40 in the left hemisphere) observed. Canli et al. [4] reported that men and women differ in the neural networks engaged during emotional experience and memory encoding. Whether men and women use different cognitive strategies for judging emotional expressions is a topic worthy of future research.

[1] Blair, R.J.R., Morris, J.S., Frith, C.D., Perrett, D.I. and Dolan, R.J., Dissociable neural responses to facial expressions of sadness and anger, Brain, 122 (1999) 883-893.
[2] Canli, T., Hemispheric asymmetry in the experience of emotion: a perspective from functional imaging, Neuroscientist, 5 (1999) 201-207.
[3] Canli, T., Desmond, J.E., Zhao, Z., Glover, G. and Gabrieli, J.D.E., Hemispheric asymmetry for emotional stimuli detected with fMRI, NeuroReport, 9 (1998) 3233-3239.
[4] Canli, T., Desmond, J.E., Zhao, Z. and Gabrieli, J.D.E., Sex differences in the neural basis of emotional memories, Proc. Natl. Acad. Sci. USA, 99 (2002) 10789-10794.
[5] Davidson, R.J., Cerebral asymmetry, emotion, and affective style, In R.J. Davidson and K. Hugdahl (Eds.), Brain Asymmetry, MIT Press, Cambridge, MA, 1995, pp. 361-387.
[6] Fox, P.T., Perlmutter, J.S. and Raichle, M.E., A stereotactic method of anatomical localization for positron emission tomography, J. Comput. Assist. Tomogr., 9 (1985) 141-153.
[7] Kesler-West, M.L., Andersen, A.H., Smith, C.D., Avison, M.J., Davis, C.E., Kryscio, R.J. and Blonder, L.X., Neural substrates of facial emotion processing using fMRI, Cogn. Brain Res., 11 (2001) 213-226.
[8] Lancaster, J.L., Fox, P.T., Downs, H., Nickerson, D.S., Hander, T.A., Mallah, M.E., Kochunov, P.V. and Zamarripa, F., Global spatial normalization of human brain using convex hulls, J. Nucl. Med., 40 (1999) 942-955.
[9] Lane, R.D., Fink, G.R., Chau, P.M. and Dolan, R.J., Neural activation during selective attention to subjective emotional responses, NeuroReport, 8 (1997) 3969-3972.
[10] Lane, R.D., Reiman, E.M., Bradley, M.M., Lang, P.J., Ahern, G.L., Davidson, R.J. and Schwartz, G.E., Neuroanatomical correlates of pleasant and unpleasant emotion, Neuropsychologia, 35 (1997) 1437-1444.
[11] Matsumoto, D. and Ekman, P., Japanese and Caucasian Facial Expressions of Emotion (JACFEE) and Neutral Faces (JACNeuF), [CD-ROM], Intercultural and Emotion Research Laboratory, Department of Psychology, San Francisco State University, San Francisco, CA, 1988.
[12] Mineka, S. and Cook, M., Mechanisms involved in the observational conditioning of fear, J. Exp. Psychol. Gen., 122 (1993) 22-38.
[13] Morris, R.D. and Hopkins, W.D., Perception of human chimeric faces by chimpanzees: evidence for a right hemisphere advantage, Brain Cogn., 21 (1993) 111-122.
[14] Nagae, S. and Moscovitch, M., Cerebral hemisphere differences in memory of emotion and non-emotion words in normal individuals, Neuropsychologia, 40 (2002) 1601-1607.
[15] Northoff, G., Richter, A., Gessner, M., Schlagenhauf, F., Fell, J., Baumgart, F., Kaulisch, T., Kotter, R., Stephan, K., Leschinger, A., Hagner, T., Bargel, B., Witzel, T., Hinrichs, H., Bogerts, B., Scheich, H. and Heinze, H.J., Functional dissociation between medial and lateral prefrontal cortical spatiotemporal activation in negative and positive emotions: a combined fMRI/MEG study, Cereb. Cortex, 10 (2000) 93-107.
[16] Phillips, M.L., Bullmore, E.T., Howard, R., Woodruff, P.W.R., Wright, I.C., Williams, S.C.R., Simmons, A., Andrew, C., Brammer, M. and David, A.S., Investigation of facial recognition memory and happy and sad facial expression perception: an fMRI study, Psychiatry Res., 83 (1998) 127-138.
[17] Smith, E.E., Jonides, J. and Koeppe, R.A., Dissociating verbal and spatial working memory using PET, Cereb. Cortex, 6 (1996) 11-20.
[18] Whalen, P.J., Rauch, S.L., Etcoff, N.L., McInerney, S.C., Lee, M.B. and Jenike, M.A., Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge, J. Neurosci., 18 (1998) 411-418.
[19] Wild, B., Erb, M. and Bartels, M., Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: quality, quantity, time course and gender differences, Psychiatry Res., 102 (2001) 109-124.
[20] Xiong, J., Gao, J.H., Lancaster, J.L. and Fox, P.T., Clustered pixels analysis for functional MRI activation studies of the human brain, Hum. Brain Mapp., 3 (1995) 209-223.