A continuous emotional task activates the left amygdala in healthy volunteers: 18FDG PET study





Psychiatry Research: Neuroimaging 171 (2009) 199–206

Emilio Fernandez-Egea a,⁎, Eduard Parellada a,d, Francisco Lomeña b,d, Carles Falcon c,d,e, Javier Pavia b,d,e, Anna Mane a, Gisela Sugranyes a, Manuel Valdes a,d, Miguel Bernardo a,d

a Hospital Clinic Schizophrenia Program (PEC), Department of Psychiatry, Institute of Neuroscience, Hospital Clinic, Barcelona, Spain
b Nuclear Medicine, Institut de Diagnostic per Imatge, Hospital Clinic, Barcelona, Spain
c Unitat de Biofísica i Bioengineria, Facultat de Medicina, Universitat de Barcelona, Barcelona, Spain
d Institut d'Investigacions Biomèdiques Augusti Pi i Sunyer (IDIBAPS), Barcelona, Spain
e Centro de Investigación Biomédica en Red en Bioingenieria, Biomateriales y Nanomedicina (CIBER-BBN), Spain

Received 14 May 2007; received in revised form 17 December 2007; accepted 12 January 2008

Abstract

Human amygdalar activation has been reported during facial emotion recognition (FER) studies, mostly using fast temporal resolution techniques (fMRI, H2 15O PET or MEG). The 18FDG PET technique has never previously been applied to FER studies. We decided to test whether amygdala response during FER tasks could be assessed with this technique. The study was conducted in 10 healthy right-handed volunteers who underwent two scans on different days in random order. The content of the tasks was either emotional (ET) or neutral (CT), and each task lasted 17 ½ min. Three SPM2 analyses were completed. The first, an ET–CT contrast, showed left amygdalar activation. The second ruled out an order effect as a confounding factor. Finally, the whole-brain contrast showed activation of emotion recognition-related areas. Response times and errors indicated high accuracy in both tasks. We discuss the results, the role of habituation phenomena, and the possibility of applying this technique to samples of patients with psychiatric disorders. In conclusion, our study reveals left amygdalar activation during FER tasks, assessed with FDG PET, as well as activation of other major emotion recognition-related brain areas.
© 2008 Elsevier Ireland Ltd. All rights reserved.

Keywords: Positron emission tomography; Nuclear medicine; Limbic system; Hemispheric specialization

⁎ Corresponding author. Servei de Psiquiatría (G096), Institut Clínic de Neurociències, Hospital Clínic, C/Villarroel, 170, 08036 Barcelona, Spain. Tel.: +34 93 227 55 47; fax: +34 93 227 55 48. E-mail addresses: [email protected], [email protected] (E. Fernandez-Egea).

1. Introduction

The functional neuroimaging era has seen the progressive inclusion of different activation paradigms in the emotional perception circuit (Reiman et al., 1997; Zald and Pardo, 1997; Buchanan et al., 2000; Zalla et al., 2000), and has helped to define the brain structures that play a role in the neurobiology of emotion. These brain areas include the amygdala, hippocampus, insula, anterior cingulate cortex, ventral striatum and orbitofrontal cortex (Dolan, 2002; Gur et al., 2002b; Phillips et al., 2003a; Calder and Young, 2005), as well as other areas that play lesser roles.



A summary of the published activation paradigms reveals a general consensus that facial emotion recognition (FER) tasks act as the most powerful trigger for activation of the emotion network (Hariri et al., 2002). Some studies have also focused on which brain structure shows the greatest activation during FER paradigms, and the amygdala has been identified (Gur et al., 2002b; Williams et al., 2004). Indeed, during recent years, several studies have limited their analysis of FER effects to the amygdala (Fitzgerald et al., 2006), although reliance on a region of interest (ROI) approach in neuroimaging studies remains controversial (Friston et al., 2006; Saxe et al., 2006).

Researchers have summarized the functions of the human amygdala in emotion and vigilance (Davis and Whalen, 2001), on the basis of its role in the automatic evaluation of danger (Zald and Pardo, 1997), emotion recognition (Adolphs et al., 1994) and novelty detection (Wright et al., 2003). A lateralized specialization has also been suggested, in which novelty awareness would be entrusted to the right side and emotion recognition to the left (Tulving et al., 1994; Martin, 1999). Specificity of the amygdala for fear recognition was initially suggested (Adolphs et al., 1994), although current research supports reactivity of the amygdala, as well as the above-mentioned limbic and extra-limbic structures, in response to multiple expressions of facial affect (Gur et al., 2002b; Phan et al., 2002; Yang et al., 2002; Fitzgerald et al., 2006). As part of this emotional-vigilance network, the amygdala has extensive bidirectional connections with the other limbic and extralimbic structures (Minzenberg et al., 2007).

Despite the wealth of research published in this area, there are no studies reporting amygdalar activation with low temporal resolution techniques. All FER studies with functional brain imaging have been performed with high temporal resolution techniques, such as fMRI (Gur et al., 2002c; Abel et al., 2003; Takahashi et al., 2004), MEG (Streit et al., 2003) or H2 15O positron emission tomography (PET) (Phan et al., 2002), which allow the detection of rapid changes in amygdala activity. The other major functional neuroimaging technique, [18F]fluorodeoxyglucose (FDG) PET, has not been used for FER studies so far. The pharmacokinetics of FDG distinguish it from faster temporal resolution brain imaging techniques such as fMRI, MEG or H2 15O PET. FDG has a roughly 30-min uptake period and emits continuously for almost 120 min after administration. Metabolism of the tracer stops early in the glycolytic pathway, so the tracer remains essentially trapped in areas of active metabolism; this allows evaluation of the cumulative activation of brain areas over a period of almost 30 min, yielding a single image per scan. The FDG technique may therefore be more useful than high temporal resolution techniques for emotion tasks that require practice, since the judgment of emotions has an emotional concomitant that builds over minutes. The signal would represent sustained brain activation and would be directly proportional to brain work. Some reports in non-emotional tasks suggest that the amygdala can be assessed by FDG PET (Grant et al., 1996; London et al., 1996; Bonson et al., 2002; Rilling et al., 2004) and that the amygdala may respond in a sustained manner (Zald, 2003). Thus, the specific aim of this study is to assess whether, in a sample of healthy volunteers and using FDG PET technology, it is feasible to study amygdalar activation during a continuous FER task.

2. Methods

2.1. Subjects

Ten right-handed young men (age range 23–31), with no current or past history of psychiatric (including substance abuse), neurological or major medical conditions, participated in the study. Subjects were evaluated with the Spanish versions of the SCID-I for DSM-IV psychiatric disorders and the Calgary Depression Scale (Sarro et al., 2004). All volunteers provided written consent as approved by the local Institutional Review Board.

2.2. Facial stimuli

Face pictures were selected from a set of 175 black-and-white pictures of amateur actors in an evoked-emotion performance. These pictures were created by the Brain Behavior Laboratory (University of Pennsylvania) and have been validated for this type of study (Gur et al., 2002a). Selected pictures included male and female faces with neutral, sad or happy expressions, all with the same size, luminosity and visual characteristics.

2.3. Procedure

Participants were seated comfortably 1 m in front of a 15-inch laptop screen (Samsung X15plus). SuperLab Pro® (Cedrus, version 2.0.4) displayed the pictures and collected the behavioral data (responses and response times, in ms). Subjects held an optical two-button mouse (Logitec®) with both hands, using either the left or the right thumb to answer.

2.3.1. Tasks

Participants performed two different tasks on two different days. Each task consisted of rating 300 pictures (each displayed for 3.5 s) during a total of 1050 s of continuous task.


Subjects were instructed to be sure about their responses before answering as fast as possible. The Emotional Task (ET) consisted of three series of 50 happy and 50 sad faces (men and women in equal number, randomly displayed). The left button indicated "sadness" and the right button "happiness". For the Control Task (CT), a gender-discrimination task was chosen, in which neutral-expression pictures of 50 men and 50 women were displayed three times in exactly the same order as in the ET. The right button indicated "man" and the left button "woman". Task order was randomized and balanced, with five subjects starting with the ET (Fig. 1). Before performing the tasks, subjects were trained on a similar 10-face task so that they could avoid procedural errors. After finishing the task, subjects rested for 10 min before the scan was performed.
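The trial structure described above is simple to express programmatically. The following Python sketch is not the original SuperLab Pro script; the function names (e.g. build_emotional_task) are hypothetical, and it only illustrates the 300-trial, 3.5-s-per-picture design, the button mappings, and the assumption that the CT re-presents neutral versions of the same actors in the ET order.

```python
# Illustrative sketch of the task structure (not the original SuperLab script):
# 300 trials of 3.5 s each (1050 s total); the ET mixes happy/sad faces and the
# CT reuses the same picture order with neutral expressions.
import random

TRIAL_DURATION_S = 3.5
N_SERIES = 3

def build_emotional_task(seed=0):
    """Three series of 50 happy + 50 sad faces (25 male / 25 female each), shuffled."""
    rng = random.Random(seed)
    trials = []
    for _series in range(N_SERIES):
        block = [("happy", sex, i) for sex in ("male", "female") for i in range(25)]
        block += [("sad", sex, i) for sex in ("male", "female") for i in range(25)]
        rng.shuffle(block)
        trials.extend(block)
    return trials  # 300 (expression, sex, picture_id) tuples

def build_control_task(emotional_trials):
    """Neutral faces of the same actors, shown in exactly the same order as the ET."""
    return [("neutral", sex, pic) for _expr, sex, pic in emotional_trials]

# Response coding: ET -> left button = "sadness", right = "happiness";
# CT -> left button = "woman", right = "man".
ET_BUTTONS = {"left": "sadness", "right": "happiness"}
CT_BUTTONS = {"left": "woman", "right": "man"}

et = build_emotional_task()
ct = build_control_task(et)
assert len(et) == len(ct) == 300
print("task duration (s):", len(et) * TRIAL_DURATION_S)  # 1050 s = 17.5 min
```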


2.3.2. Radiotracer administration and PET imaging

Before performing the task, capillary blood glucose was measured in order to exclude hyperglycemia. A peripheral intravenous line was used to administer 8–10 mCi (296–370 MBq) of the FDG radiotracer, 60 s after starting the task. Images were acquired 30 min after radiotracer administration on a Siemens Biograph ECAT BGO PET/CT (without pico3D). A standard 11-min "HEAD/BRAIN" routine was performed (1 min of transmission and 10 min of emission). Thirty-five attenuation-corrected tomographic brain sections (oblique, sagittal and coronal) were obtained (2.47 mm slice thickness). Reconstruction was performed with the OSEM algorithm (16 subsets and 6 iterations), with a 128 × 128 × 64 matrix and 2.6 mm³ voxel size.
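For reference, the acquisition and reconstruction parameters above can be collected into a plain configuration structure. The sketch below is only a summary and a unit-conversion check (1 mCi = 37 MBq); it is not an interface to the scanner or reconstruction software, and the field names are ours.

```python
# Summary of the acquisition protocol described above, as a plain config dict.
MBQ_PER_MCI = 37.0

protocol = {
    "tracer": "18F-FDG",
    "dose_mCi": (8, 10),                        # injected 60 s after task onset
    "uptake_min": 30,                           # task + rest before scanning
    "routine": "HEAD/BRAIN",                    # 1 min transmission + 10 min emission
    "n_slices": 35,
    "slice_thickness_mm": 2.47,
    "reconstruction": {"algorithm": "OSEM", "subsets": 16, "iterations": 6},
    "matrix": (128, 128, 64),
    "voxel_size_mm3": 2.6,
}

dose_MBq = tuple(d * MBQ_PER_MCI for d in protocol["dose_mCi"])
print(dose_MBq)  # (296.0, 370.0), matching the 296-370 MBq range in the text
```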

Fig. 1. The final sample of 10 volunteers was divided into two groups in order to control for an order effect and for the role of novelty detection in amygdalar activation. As the diagram shows, five performed the first scan (T1) with the emotional task (ET) and the other five with the control task (CT). During the second scan (T2) the alternate task was performed. The ET displayed equally balanced faces of men and women with either sad or happy expressions, while the CT showed men and women with expressionless faces.


2.4. Image analysis

MRIcro® v1.39 was used to reconstruct the 35 DICOM images per subject per scan. Postprocessing of the reconstructed images was performed using SPM2 tools (Wellcome Department of Cognitive Neurology, London; http://www.fil.ion.ucl.ac.uk/spm), running under MATLAB® 7.0.1. First, each image was manually reoriented to the SPM PET template. Then, each pair of ET–CT scans was realigned and spatially normalized to the Montreal Neurological Institute (MNI) template implemented in SPM2. Finally, images were smoothed with a Gaussian kernel of 5 mm full-width at half-maximum (FWHM); 5 mm was selected because of the small size of the amygdala. Each pre-normalized and normalized image was also visually inspected and compared with the SPM PET template.

Given the design of this study, with two conditions per subject and one scan per condition, statistical analysis was carried out using a paired t-test. Standard proportional scaling of each image to 100 and a relative global threshold of 0.5 were applied, without sphericity correction. Left and right amygdala masks from the WFU PickAtlas (Maldjian et al., 2003) were selected as ROIs. The initial hypothesis included two a priori contrasts; Table 1 describes the contrasts performed. To assess whether the amygdala selectively responds to emotions, an ET minus CT contrast was carried out, focusing on specific activation related to the emotional content of the pictures. Maps of statistical significance within the ROIs were created for this contrast, with a significance threshold of P < 0.05 and without fixing a minimum number of voxels per cluster. For completeness, we evaluated the whole brain under the same conditions (P < 0.005), but with a minimum activation threshold of 10 voxels, so as to avoid type 1 errors.

Table 1
Summary of the three contrasts performed, with the brain areas involved, the P level of significance applied and the main results

Contrast   Region                 P significance level   Results
ET–CT      ROI amygdala (L–R)     P < 0.005              Left amygdalar activation
ET–CT      Whole brain            P < 0.005              Table 2
T1–T2      ROI amygdala (L–R)     P < 0.1                No activation

The third contrast assessed activation due to an order effect. In this case, an order effect could have been due not only to the task content itself, but also to other circumstances such as the radiotracer injection, the PET machine or even the new environment. Five volunteers followed an ET–CT order, and five volunteers a CT–ET order. Then, using the ROI approach, the first performance of either task (T1 in Fig. 1) was contrasted with the second performance (T2). In order to guarantee that novelty did not act as a confounding factor, a trend-level significance threshold (P < 0.2, no voxel threshold) was used, which showed no amygdalar activation. This less restrictive threshold than 0.05 was chosen in order to avoid type 2 errors: if an activation due to novelty detection had been missed because of limited statistical power, this trend-level threshold would indicate whether the lack of differential activation could be resolved with a larger sample.
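The voxelwise model described above (proportional scaling of each image to a grand mean of 100, a paired t-test across the 10 subjects, and restriction to amygdala ROI masks) was implemented in SPM2. The sketch below is only an illustrative numpy/scipy approximation of that model, with hypothetical array names and synthetic data; it is not the SPM2 analysis itself and omits details such as spatial normalization and one-sided contrast testing.

```python
# Minimal numpy/scipy approximation of the paired ET-CT test within an ROI.
import numpy as np
from scipy import stats

def proportional_scaling(img, target=100.0, rel_threshold=0.5):
    """Scale one image so that its global mean (computed over voxels above a
    relative threshold of the whole-image mean) equals `target`."""
    global_mean = img[img > rel_threshold * img.mean()].mean()
    return img * (target / global_mean)

def paired_ttest_in_roi(et_imgs, ct_imgs, roi_mask):
    """et_imgs, ct_imgs: arrays of shape (n_subjects, x, y, z); roi_mask: boolean (x, y, z).
    Returns voxelwise t and (two-sided) P values inside the ROI."""
    et = np.stack([proportional_scaling(i) for i in et_imgs])
    ct = np.stack([proportional_scaling(i) for i in ct_imgs])
    return stats.ttest_rel(et[:, roi_mask], ct[:, roi_mask], axis=0)

# Synthetic example: 10 subjects, one ET and one CT image each, on a small grid.
rng = np.random.default_rng(0)
shape = (16, 16, 16)
et_imgs = rng.normal(100, 10, (10, *shape))
ct_imgs = rng.normal(100, 10, (10, *shape))
roi = np.zeros(shape, dtype=bool)
roi[6:9, 6:9, 6:9] = True                      # stand-in for an amygdala mask
t_map, p_map = paired_ttest_in_roi(et_imgs, ct_imgs, roi)
print(t_map.shape, int((p_map < 0.05).sum()))  # ROI voxels surviving P < 0.05
```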

2.5. Statistical analysis

Descriptive statistics, Student's t-tests, ANOVA and χ2 tests for response time and accuracy were computed using SPSS v12.0.

3. Results

The study was carried out with a final sample of 10 right-handed men. There was 1 omission, leaving 5999 valid responses. No unusual circumstances appeared likely to have influenced the performance of the tasks. The average time between the two scans was 36.20 days (S.D. 28.90; range 7–96). All subjects rated 0 on the Calgary Depression Scale.

3.1. Behavioral results

First, we assessed whether both tasks had been correctly performed and whether we could be confident about the validity of the results. Accuracy was very high for both the emotional and the control task, showing that volunteers had been attentive: the ET reached 97.8% and the CT 98.7%. This difference was not statistically significant (χ2 = 5.293, P = 0.056), indicating that the tasks had been carried out correctly. Next, we evaluated response time (RT) in milliseconds (ms) for both tasks. We excluded RTs below 100 ms, assuming that such answers had been given too rapidly; in total, 3 (2 CT and 1 ET) of the 5999 valid answers were excluded. RT differed between the two tasks: average RT was 903.44 ms (S.D. 403.72) for the ET and 665.50 ms (S.D. 238.37) for the CT. This difference was statistically significant (t = 31.022, P < 0.001). According to our data, RT for the ET was 237.93 ms (95% CI 222.89–252.97 ms) longer than RT for the CT.


To assess participants' efforts, we used the efficiency index (Gur et al., 2002b), which provides a measure of performance that balances "power" with "speed" and is defined as the number of correct answers divided by log(RT). The difference in efficiency (correct responses per second) between the ET (0.99; S.D. 0.04) and the CT (1.05; S.D. 0.02) was statistically significant (U = 14.00, P = 0.005), suggesting greater effort during the ET.
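For clarity, the behavioral measures can be sketched as follows. This is an illustrative Python fragment with synthetic data, not the SPSS analysis: it applies the 100-ms cutoff, computes accuracy and mean RT, and evaluates the efficiency index literally as correct answers divided by log(RT). The absolute values reported in the text depend on the normalization conventions of Gur et al. (2002b), so the sketch illustrates the computation rather than reproducing those numbers.

```python
# Behavioral summary per task: accuracy, mean RT, and a literal "correct / log(RT)" index.
import numpy as np
from scipy import stats

MIN_RT_MS = 100.0

def behavioral_summary(correct, rt_ms):
    """correct: boolean array (one entry per trial); rt_ms: response times in ms."""
    valid = rt_ms >= MIN_RT_MS                       # drop anticipatory responses (< 100 ms)
    correct, rt_ms = correct[valid], rt_ms[valid]
    accuracy_pct = 100.0 * correct.mean()
    efficiency = correct.sum() / np.log(rt_ms.mean())  # literal reading of the definition above
    return accuracy_pct, rt_ms.mean(), efficiency

# Synthetic single-subject example: 300 ET trials and 300 CT trials.
rng = np.random.default_rng(1)
et_correct = rng.random(300) < 0.978
ct_correct = rng.random(300) < 0.987
et_rt = rng.normal(903, 100, 300)
ct_rt = rng.normal(665, 80, 300)

print("ET:", behavioral_summary(et_correct, et_rt))
print("CT:", behavioral_summary(ct_correct, ct_rt))
# Pooled comparison of valid RTs between tasks (the paper reports t = 31.022 over all trials):
print(stats.ttest_ind(et_rt[et_rt >= MIN_RT_MS], ct_rt[ct_rt >= MIN_RT_MS]))
```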



3.2. Maps of statistical significance

Three different contrasts were performed: two a priori hypothesis-driven contrasts focused on the amygdalar ROI and one post-hoc analysis of the whole brain. Table 1 summarizes these contrasts and the main findings. The first a priori contrast (ET–CT; ROI approach, P < 0.05) indicates that, relative to the control task, emotionally relevant stimuli significantly activate the left amygdala (76 voxels activated at [−18, −8, −16], t = 5.75, P < 0.001). The right amygdala showed a single activated voxel ([22, −6, −14], t = 3.29, P = 0.005). These results suggest emotion-specific reactivity of the left amygdala. Since this was a hypothesis-driven study using ROIs, other brain structures that were highly activated during the ET could have been missed. To explore this possibility, we performed a post-hoc analysis of the whole brain, changing the thresholds to P < 0.005 and a minimum of 10 activated voxels in order to avoid type 1 errors. Table 2 presents the results of this contrast, which shows activation of commonly cited emotion-related areas and, again, suggests a predominant role of the left amygdala.

However, it is widely accepted that the amygdala also responds to novelty or to the order of tasks. Therefore, in order to confirm the emotion-specific reactivity reported above, we carried out a further contrast, which allowed us to rule out an order effect, rather than emotional content, as responsible for the activation. We contrasted the first versus the second performance (T1–T2 in Fig. 1) of either task, independently of whether it was ET or CT, and found no significant activation of the amygdala, despite the use of a more sensitive trend level of significance (P < 0.2) and no threshold of minimum activated voxels. Thus, we conclude that the amygdalar activation shown previously is exclusively emotion-related.

Table 2
Brain areas activated during the ET–CT contrast

Region                          Coordinates      Cluster size   Z score
Left insula                     −26, 10, −16     40             4.14
Left amygdala                   −18, −8, −16     30             3.68
Left parahippocampal gyrus      −14, 0, −28      55             3.41
Right temporal pole, superior   38, 12, −26      23             4.05
Right fusiform gyrus            48, −24, −26     19             4.07
Right mid cingulate             18, −6, 46       13             3.18

Montreal Neurological Institute coordinates (x, y, z), cluster sizes and Z scores for peak activations are shown for each corresponding area.
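Table 2 reports peak Z scores, whereas the ROI results above are given as t statistics. The usual probability-matching conversion between the two (for the paired design here, df = 9) can be sketched as follows; the printed values are illustrative and are not claimed to reproduce the table entries, which come from a different (whole-brain) contrast.

```python
# Convert a t statistic to the Z score with the same one-tailed P value.
from scipy import stats

def t_to_z(t_value, df):
    """Probability-matching t-to-Z conversion (one-tailed)."""
    return stats.norm.isf(stats.t.sf(t_value, df))

print(round(t_to_z(5.75, 9), 2))   # peak t reported for the left amygdala ROI
print(round(t_to_z(3.29, 9), 2))   # peak t reported for the right amygdala voxel
```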

4. Discussion

The main finding of this study is the activation of the left amygdala during emotional stimuli recognition, assessed by FDG PET. This confirms our initial hypothesis that the FDG PET technique can evaluate amygdalar activation during continuous emotion recognition tasks. The study design allowed us to rule out the order of task presentation as a cause of this activation, as the order of task performance did not influence the results (T1 minus T2 contrast). However, amygdalar activation alone is not sufficient to explain the phenomenon of emotion recognition, which needs to take into consideration other components of the cortical integrative network. Therefore, the fact that we also detected activation of other emotion recognition-related brain areas acts as a further validation of our results. Activation of the left insula, left parahippocampal gyrus, right fusiform gyrus, right superior temporal cortex and cingulate gyrus has been repeatedly reported in neuroimaging and FER studies (Dolan, 2002; Phillips et al., 2003a; Calder and Young, 2005).

Behavioral data indicate that the CT requires less effort than the ET, even though accuracy was similar in both tasks, because of the greater time and effort required to evaluate emotions (measured as RT and efficiency index) compared with neutral expressions. In fact, increases in the range of 240 ms in emotional RTs are congruent with MEG studies, which have reported a 180 ms delay in facial recognition relative to other recognition tasks (Liu et al., 2002). Therefore, it is likely that emotion recognition requires more time, and effort, to process information than sex recognition (Whalen et al., 2001). According to these behavioral and neuroimaging results, the amygdalar activation reported can be mainly attributed to its function in emotion recognition.

The role of habituation phenomena in our results requires further explanation. Amygdalar habituation has been reported in neuroimaging studies after repetitive exposure to complex visual scenes (Fischer et al., 2000) and to neutral (Breiter et al., 1996) or emotional (Wright et al., 2003) facial expressions.


However, two previous FDG PET studies reported amygdalar activation while subjects were exposed to complex visual stimuli (Grant et al., 1996; London et al., 1996; Bonson et al., 2002), and a non-human primate study also reported left amygdalar hyperactivation (Rilling et al., 2004). Nevertheless, our results of amygdala activation are consistent with most of the accumulated background data on habituation (Breiter et al., 1996; Wedig et al., 2005). Although there is no consensus as to which side of the amygdala is responsible for facial emotion recognition (Baas et al., 2004), in the case of habituation only the right amygdala has been repeatedly linked to this phenomenon in young healthy volunteers (Wright et al., 2001; Wedig et al., 2005). It has been suggested that this hemispheric specialization could be due to different rates of habituation of the left and right amygdala (Wright et al., 2001) or to right-sided hemispheric structures focusing on novelty and left-sided structures on emotion recognition (Tulving et al., 1994). Our results support both of these complementary hypotheses, and suggest a slower decrease of left amygdalar activation during FER: the left amygdala would be entrusted with emotion recognition and would have a slower habituation rate.

Current research suggests a sex-related hemispheric lateralization of amygdalar function for memory-related emotional material (Mackiewicz et al., 2006; Takahashi et al., 2006), even in FDG PET studies (Cahill et al., 2001). In contrast to our results, men tend to show greater activation of the right amygdala whereas women tend to show greater activation of the left amygdala when carrying out memory encoding tasks with emotional material. One possible explanation for our divergent results could be the different tasks that we used, both in the task itself (recognition rather than memory) and in its content (facial versus non-facial pictures). Indeed, other FDG PET studies in non-human primates observed only left amygdalar activation (Rilling et al., 2004). In any case, none of these explanations is completely satisfactory and this particular point will remain unresolved until a specific research project focused on these sex differences is carried out.

Sample size could be a limitation of the present study. We therefore selected a highly homogeneous group of healthy, young, right-handed men and designed a hypothesis-driven study with only two ROIs – the left and right amygdala – rather than the whole brain, in order to avoid multiple comparisons and to reduce type 1 errors. Moreover, task and study design were carefully chosen, and the recommendations for facial emotion recognition studies (Edwards et al., 2002), which include low cognitive demand, two conditions per task and limited time of exposure, were all followed.

We also decided to display emotional or neutral faces rather than other amygdalar activation paradigms, since there is consensus that facial emotion is the greatest trigger of amygdala activation (Hariri et al., 2002). Expressionless faces were selected for the CT in order to avoid "unconscious" evaluation of emotion that could activate the amygdala (Sheline et al., 2001). Both tasks were highly comparable, except for the emotional content. On the other hand, the highly restrictive inclusion criteria could limit the generalization of the results, as only right-handed males were included. We think that further studies with larger and more heterogeneous samples will be necessary to confirm our results. The content of the emotional task, including only sad and happy faces, would also limit the brain areas activated; different areas are activated when angry, sad or fearful faces are displayed. However, the amygdala is always activated, and it, rather than other brain areas, was the focus of the study.

What this study really indicates is that the left amygdala can be activated for a longer period of time if the stimuli are relevant enough, as in the case of emotion recognition. In this sense, the efficiency index reflects the greater degree of effort during the ET. Moreover, the activation is sufficiently powerful to be measured with FDG. One benefit of FDG over faster temporal resolution techniques would be to offer a way to study trait characteristics in mental disorders. Most current work on emotion in these patient groups involves direct comparison between different tasks of recognition or evocation of particular emotions (Phillips et al., 2003b; Silver et al., 2002). Several major mental disorders involve complex emotional disturbances (Whittaker et al., 2001), which could explain why the results of facial emotion recognition studies still remain unclear. For instance, comparative neuroimaging between schizophrenic patients and healthy controls has reported amygdalar hypoactivation (Gur et al., 2002c), "normo"-activation (Taylor et al., 2002) or even hyperactivation (Kosaka et al., 2002). It has recently been suggested that a lack of normal habituation (Holt et al., 2005; Holt et al., 2006) – reported as hyperactivation – could explain these contradictory results in schizophrenic patients. In this sense, FDG PET, which assesses cumulative activation, could help to elucidate the role of amygdalar activation in schizophrenic patients. A similar reasoning could be applied to social phobia, depression or bipolar disorder, among other neuropsychiatric disorders.

A collateral finding of the present study is that, to our knowledge, it is the first to use this face dataset in a Spanish sample. The high rates of correct attribution of emotion (97.8%) and gender (98.7%) make the dataset applicable to the Spanish population.


Acknowledgements The authors thank Rosa Aragones for her help, Alejandra Bruna for assisting with the text and Drs. R.C. and R.E. Gur from Brain Behavior Laboratory for allowing us to use their pictures. The study was supported by “Premi Fi de Residencia 2003” from the Hospital Clinic (Barcelona) and, in part, by the Fondo de Investigaciones Sanitarias (FIS G03/185) and the Spanish Ministry of Health, Instituto de Salud Carlos III, Red de Enfermedades Mentales RD06/0011/006 (REMTAP Network). References Abel, K.M., Allin, M.P., Kucharska-Pietura, K., David, A., Andrew, C., Williams, S., Brammer, M.J., Phillips, M.L., 2003. Ketamine alters neural processing of facial emotion recognition in healthy men: an fMRI study. Neuroreport 14, 387–391. Adolphs, R., Tranel, D., Damasio, H., Damasio, A., 1994. Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature 372, 669–672. Baas, D., Aleman, A., Kahn, R.S., 2004. Lateralization of amygdala activation: a systematic review of functional neuroimaging studies. Brain Research Brain Research Reviews 45, 96–103. Bonson, K.R., Grant, S.J., Contoreggi, C.S., Links, J.M., Metcalfe, J., Weyl, H.L., Kurian, V., Ernst, M., London, E.D., 2002. Neural systems and cue-induced cocaine craving. Neuropsychopharmacology. 26, 376–386. Breiter, H.C., Etcoff, N.L., Whalen, P.J., Kennedy, W.A., Rauch, S.L., Buckner, R.L., Strauss, M.M., Hyman, S.E., Rosen, B.R., 1996. Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17, 875–887. Buchanan, T.W., Lutz, K., Mirzazade, S., Specht, K., Shah, N.J., Zilles, K., Jancke, L., 2000. Recognition of emotional prosody and verbal components of spoken language: an fMRI study. Brain Research Cognitive Brain Research 9, 227–238. Cahill, L., Haier, R.J., White, N.S., Fallon, J., Kilpatrick, L., Lawrence, C., Potkin, S.G., Alkire, M.T., 2001. Sex-related difference in amygdala activity during emotionally influenced memory storage. Neurobiology of Learning and Memory 75, 1–9. Calder, A.J., Young, A.W., 2005. Understanding the recognition of facial identity and facial expression. Nature Reviews Neuroscience 6, 641–651. Davis, M., Whalen, P.J., 2001. The amygdala: vigilance and emotion. Molecular Psychiatry 6, 13–34. Dolan, R.J., 2002. Emotion, cognition, and behavior. Science 298, 1191–1194. Edwards, J., Jackson, H.J., Pattison, P.E., 2002. Emotion recognition via facial expression and affective prosody in schizophrenia: a methodological review. Clinical.Psychology Review 22, 789–832. Fischer, H., Furmark, T., Wik, G., Fredrikson, M., 2000. Brain representation of habituation to repeated complex visual stimulation studied with PET. Neuroreport 11, 123–126. Fitzgerald, D.A., Angstadt, M., Jelsone, L.M., Nathan, P.J., Phan, K.L., 2006. Beyond threat: amygdala reactivity across multiple expressions of facial affect. Neuroimage 30, 1441–1448. Friston, K.J., Rotshtein, P., Geng, J.J., Sterzer, P., Henson, R.N., 2006. A critique of functional localisers. Neuroimage 30, 1077–1087.


Grant, S., London, E.D., Newlin, D.B., Villemagne, V.L., Liu, X., Contoreggi, C., Phillips, R.L., Kimes, A.S., Margolin, A., 1996. Activation of memory circuits during cue-elicited cocaine craving. Proceedings of the National Academy of Sciences of the United States of America 93, 12040–12045. Gur, R.C., Sara, R., Hagendoorn, M., Marom, O., Hughett, P., Macy, L., Turner, T., Bajcsy, R., Posner, A., Gur, R.E., 2002a. A method for obtaining 3-dimensional facial expressions and its standardization for use in neurocognitive studies. Journal of Neuroscience Methods 115, 137–143. Gur, R.C., Schroeder, L., Turner, T., McGrath, C., Chan, R.M., Turetsky, B.I., Alsop, D., Maldjian, J., Gur, R.E., 2002b. Brain activation during facial emotion processing. Neuroimage 16, 651–662. Gur, R.E., McGrath, C., Chan, R.M., Schroeder, L., Turner, T., Turetsky, B.I., Kohler, C., Alsop, D., Maldjian, J., Ragland, J.D., Gur, R.C., 2002c. An fMRI study of facial emotion processing in patients with schizophrenia. American Journal of Psychiatry 159, 1992–1999. Hariri, A.R., Tessitore, A., Mattay, V.S., Fera, F., Weinberger, D.R., 2002. The amygdala response to emotional stimuli: a comparison of faces and scenes. Neuroimage 17, 317–323. Holt, D.J., Weiss, A.P., Rauch, S.L., Wright, C.I., Zalesak, M., Goff, D.C., Ditman, T., Welsh, R.C., Heckers, S., 2005. Sustained activation of the hippocampus in response to fearful faces in schizophrenia. Biological Psychiatry 57, 1011–1019. Holt, D.J., Titone, D., Long, L.S., Goff, D.C., Cather, C., Rauch, S.L., Judge, A., Kuperberg, G.R., 2006. The misattribution of salience in delusional patients with schizophrenia. Schizophrenia Research 83, 247–256. Kosaka, H., Omori, M., Murata, T., Iidaka, T., Yamada, H., Okada, T., Takahashi, T., Sadato, N., Itoh, H., Yonekura, Y., Wada, Y., 2002. Differential amygdala response during facial recognition in patients with schizophrenia: an fMRI study. Schizophrenia Research 57, 87–95. Liu, J., Harris, A., Kanwisher, N., 2002. Stages of processing in face perception: an MEG study. Nature Neuroscience 5, 910–916. London, E.D., Stapleton, J.M., Phillips, R.L., Grant, S.J., Villemagne, V.L., Liu, X., Soria, R., 1996. PET studies of cerebral glucose metabolism: acute effects of cocaine and long-term deficits in brains of drug abusers. NIDA Research Monographs 163, 146–158. Mackiewicz, K.L., Sarinopoulos, I., Cleven, K.L., Nitschke, J.B., 2006. The effect of anticipation and the specificity of sex differences for amygdale and hippocampus function in emotional memory. Proceedings of the National Academy of Sciences of the United States of America 103, 14,200–14,205. Maldjian, J.A., Laurienti, P.J., Kraft, R.A., Burdette, J.H., 2003. An automated method for neuroanatomic and cytoarchitectonic atlasbased interrogation of fMRI data sets. Neuroimage 19, 1233–1239. Martin, A., 1999. Automatic activation of the medial temporal lobe during encoding: lateralized influences of meaning and novelty. Hippocampus 9, 62–70. Minzenberg, M.J., Fan, J., New, A.S., Tang, C.Y., Siever, L.J., 2007. Fronto-limbic dysfunction in response to facial emotion in borderline personality disorder: an event-related fMRI study. Psychiatry Research: Neuroimaging 155, 231–243. Phan, K.L., Wager, T., Taylor, S.F., Liberzon, I., 2002. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. Neuroimage 16, 331–348. Phillips, M.L., Drevets, W.C., Rauch, S.L., Lane, R., 2003a. 
Neurobiology of emotion perception I: the neural basis of normal emotion perception. Biological Psychiatry 54, 504–514.


Phillips, M.L., Drevets, W.C., Rauch, S.L., Lane, R., 2003b. Neurobiology of emotion perception II: implications for major psychiatric disorders. Biological Psychiatry 54, 515–528. Reiman, E.M., Lane, R.D., Ahern, G.L., Schwartz, G.E., Davidson, R.J., Friston, K.J., Yun, L.S., Chen, K., 1997. Neuroanatomical correlates of externally and internally generated human emotion. American Journal of Psychiatry 154, 918–925. Rilling, J.K., Winslow, J.T., Kilts, C.D., 2004. The neural correlates of mate competition in dominant male rhesus macaques. Biological Psychiatry 56, 364–375. Sarro, S., Duenas, R.M., Ramirez, N., Arranz, B., Martinez, R., Sanchez, J.M., Gonzalez, J.M., Salo, L., Miralles, L., San, L., 2004. Cross-cultural adaptation and validation of the Spanish version of the Calgary depression scale for schizophrenia. Schizophrenia Research 68, 349–356. Saxe, R., Brett, M., Kanwisher, N., 2006. Divide and conquer: a defense of functional localizers. Neuroimage 30, 1088–1096. Sheline, Y.I., Barch, D.M., Donnelly, J.M., Ollinger, J.M., Snyder, A.Z., Mintun, M.A., 2001. Increased amygdala response to masked emotional faces in depressed subjects resolves with antidepressant treatment: an fMRI study. Biological Psychiatry 50, 651–658. Silver, H., Shlomo, N., Turner, T., Gur, R.C., 2002. Perception of happy and sad facial expressions in chronic schizophrenia: evidence for two evaluative systems. Schizophrenia Research 55, 171–177. Streit, M., Dammers, J., Simsek-Kraues, S., Brinkmeyer, J., Wolwer, W., Ioannides, A., 2003. Time course of regional brain activations during facial emotion recognition in humans. Neuroscience Letters 342, 101–104. Takahashi, H., Koeda, M., Oda, K., Matsuda, T., Matsushima, E., Matsuura, M., Asai, K., Okubo, Y., 2004. An fMRI study of differential neural response to affective pictures in schizophrenia. Neuroimage 22, 1247–1254. Takahashi, H., Matsuura, M., Yahata, N., Koeda, M., Suhara, T., Okubo, Y., 2006. Men and women show distinct brain activations during imagery of sexual and emotional infidelity. Neuroimage 32, 1299–1307. Taylor, S.F., Liberzon, I., Decker, L.R., Koeppe, R.A., 2002. A functional anatomic study of emotion in schizophrenia. Schizophrenia Research 58, 159–172. Tulving, E., Kapur, S., Craik, F.I., Moscovitch, M., Houle, S., 1994. Hemispheric encoding/retrieval asymmetry in episodic memory:

positron emission tomography findings. Proceedings of the National Academy of Sciences of the United States of America 91, 2016–2020. Wedig, M.M., Rauch, S.L., Albert, M.S., Wright, C.I., 2005. Differential amygdala habituation to neutral faces in young and elderly adults. Neuroscience Letters 385, 114–119. Whalen, P.J., Shin, L.M., McInerney, S.C., Fischer, H., Wright, C.I., Rauch, S.L., 2001. A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion 1, 70–83. Whittaker, J.F., Deakin, J.F., Tomenson, B., 2001. Face processing in schizophrenia: defining the deficit. Psychololical Medicine 31, 499–507. Williams, M.A., Morris, A.P., McGlone, F., Abbott, D.F., Mattingley, J.B., 2004. Amygdala responses to fearful and happy facial expressions under conditions of binocular suppression. Journal of Neuroscience 24, 2898–2904. Wright, C.I., Fischer, H., Whalen, P.J., McInerney, S.C., Shin, L.M., Rauch, S.L., 2001. Differential prefrontal cortex and amygdala habituation to repeatedly presented emotional stimuli. Neuroreport 12, 379–383. Wright, C.I., Martis, B., Schwartz, C.E., Shin, L.M., Fischer, H.H., McMullin, K., Rauch, S.L., 2003. Novelty responses and differential effects of order in the amygdala, substantia innominata, and inferior temporal cortex. Neuroimage 18, 660–669. Yang, T.T., Menon, V., Eliez, S., Blasey, C., White, C.D., Reid, A.J., Gotlib, I.H., Reiss, A.L., 2002. Amygdalar activation associated with positive and negative facial expressions. Neuroreport 13, 1737–1741. Zald, D.H., Prado, J.V., 1997. Emotion, olfaction, and the human amygdala: amygdala activation during aversive olfactory stimulation. Proceedings of the National Academy of Sciences of the United States of America 94, 4119–4124. Zald, D.H., 2003. The human amygdala and the emotional evaluation of sensory stimuli. Brain Research Reviews 41, 88–123. Zalla, T., Koechlin, E., Pietrini, P., Basso, G., Aquino, P., Sirigu, A., Grafman, J., 2000. Differential amygdala responses to winning and losing: a functional magnetic resonance imaging study in humans. European Journal of Neuroscience 12, 1764–1770.