Journal of Clinical Neuroscience 17 (2010) 311–314
Clinical Study
Effect of image analysis software on neurofunctional activation during processing of emotional human faces

P. Fusar-Poli a,b,*, S. Bhattacharyya a, P. Allen a, J.A. Crippa a,c, S. Borgwardt a,d, R. Martin-Santos a, M. Seal a, C. O'Carroll a, Z. Atakan a, A.W. Zuardi c, P. McGuire a
a Neuroimaging Section, Division of Psychological Medicine, PO67, Institute of Psychiatry, King's College London, De Crespigny Park 103, Denmark Hill, London SE5 8AF, UK
b Section of Psychiatry, Department of Health Sciences, University of Pavia, Pavia, Italy
c Department of Neurology, Psychiatry and Medical Psychology, Faculty of Medicine of Ribeirão Preto, University of Sao Paulo, Sao Paulo, Brazil
d Psychiatric Outpatient Department, University Hospital Basel, Basel, Switzerland
Article info

Article history:
Received 30 March 2009
Accepted 25 June 2009

Keywords:
fMRI
Image analysis
Software
SPM
XBAM
Meta-analysis
Abstract

Functional brain imaging techniques such as functional MRI (fMRI), which allow the in vivo investigation of the human brain, have been increasingly employed to address the neurophysiological substrates of emotional processing. Despite the growing number of fMRI studies in the field, taken separately these individual imaging studies show contrasting findings and are unable to definitively characterize the neural networks underlying each specific emotional condition. The different imaging packages, as well as the statistical approaches used for image processing and analysis, probably contribute to this heterogeneity of findings. In particular, it is unclear to what extent the observed neurofunctional response of the brain cortex during emotional processing depends on the fMRI package used in the analysis. In this pilot study, we performed a double analysis of an fMRI dataset using emotional faces. The Statistical Parametric Mapping (SPM) version 2.6 (Wellcome Department of Cognitive Neurology, London, UK) and XBAM version 3.4 (Brain Imaging Analysis Unit, Institute of Psychiatry, King's College London, UK) programs, which use parametric and non-parametric analysis, respectively, were used to assess our results. Both packages revealed that processing of emotional faces was associated with increased activation in the brain's visual areas (occipital, fusiform and lingual gyri), the cerebellum, the parietal cortex, the cingulate cortex (anterior and posterior cingulate), and the dorsolateral and ventrolateral prefrontal cortex. However, the blood oxygenation level-dependent (BOLD) response in the temporal regions, insula and putamen was evident in the XBAM analysis but not in the SPM analysis. Overall, the SPM and XBAM analyses revealed comparable whole-group brain responses.
Further studies are needed to explore the between-group compatibility of the different imaging packages in other cognitive and emotional processing domains. © 2009 Elsevier Ltd. All rights reserved.
* Corresponding author. Tel.: +44 77 8666 6570; fax: +44 20 7848 0976. E-mail address: [email protected] (P. Fusar-Poli).

1. Introduction

Facial expressions are powerful non-verbal displays of emotion that provide valence information to others, and are vital in the complex social world.1 Recognition of facial expressions helps us to detect the emotional state of another person and provides cues on how to respond during social interactions.2,3 Given their crucial role in social functioning, over the past two decades neuroscientists have shown much interest in understanding the neural mechanisms that support face perception.4 In particular, functional brain imaging techniques such as functional MRI (fMRI), which allow the in vivo investigation of the human brain, have been increasingly employed to examine the neurophysiological substrate of emotional processing. As the signals of the basic human emotions are universal, fMRI studies that explore the neural substrates of emotion recognition no longer rely on the vague subjective measures that plagued earlier research in the field, and many fMRI studies employing facial stimuli have recently been published.5 However, despite the growing literature, the neural networks underlying the different basic emotions are not fully clarified.6 Methodological factors may account for the considerable heterogeneity in findings across the neuroimaging studies that have examined the neural correlates of emotion recognition. These factors include differences in task design, lack of power due to small sample sizes, differences in the sociodemographic profiles of the samples, and confounding effects of medication or illness chronicity. Another important cause of heterogeneity in the results, which is seldom acknowledged, stems from the use of different methods of image analysis. Despite the increasing number of fMRI image analysis
software programs available, the comparability of results derived using different analytical methods has not been properly tested. To our knowledge, no study has addressed the impact of different statistical approaches (i.e. parametric versus non-parametric) on imaging results across studies. This has made it particularly difficult to interpret the differences in functional activation patterns between studies employing emotional facial stimuli. Analysis of the consistency and comparability of the results obtained using different image analysis programs on the same set of neuroimaging data is a crucial prerequisite for accurate localization of brain functions.6 Without reliable neurophysiological maps that are consistent irrespective of the image analysis software used to obtain them, it is difficult to ascertain definitively which of the alterations in brain activation observed in a clinical population (e.g. depressed or psychotic subjects) reflect a neurobiological abnormality that is a key feature of the disease, as opposed to methodological issues related to the specific analysis software used. To address this issue, we undertook a pilot study to investigate the extent to which the results of neuroimaging studies are influenced by the image analysis software used. Using two different image analysis software programs, we examined the same fMRI dataset, in which the brain activation response to emotional faces had been acquired. We selected two widely used packages, Statistical Parametric Mapping (SPM) version 2.6 (Wellcome Department of Cognitive Neurology, London, UK) and XBAM version 3.4 (Brain Imaging Analysis Unit, Institute of Psychiatry, King's College London, UK), which use parametric and non-parametric analysis, respectively, to analyze neuroimaging data.
We tested the hypothesis that the image analysis software would have a significant effect on the results obtained from the analysis of fMRI data acquired during the processing of emotional faces.
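Neither package's actual pipeline is reproduced here, but the difference between the two inferential philosophies under comparison (normal-theory parametric inference versus permutation-based non-parametric inference) can be sketched at a single voxel. This is a toy illustration with simulated effect sizes; XBAM's actual resampling is wavelet-based, and sign-flipping is used here only as a simple stand-in for a permutation scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: one voxel's contrast estimate for 15 subjects (the sample
# size of the present study); values are simulated, not real data.
effects = rng.normal(loc=0.5, scale=1.0, size=15)

# Parametric (SPM-style) inference: one-sample t statistic, with the
# null distribution taken from normal theory.
t_obs = effects.mean() / (effects.std(ddof=1) / np.sqrt(effects.size))

# Non-parametric (XBAM-style) inference: permutations build an
# empirical null distribution, with no normality assumption. Here the
# permutation scheme is simple sign-flipping of each subject's effect.
null_t = np.empty(5000)
for i in range(null_t.size):
    flipped = effects * rng.choice([-1.0, 1.0], size=effects.size)
    null_t[i] = flipped.mean() / (flipped.std(ddof=1) / np.sqrt(effects.size))
p_perm = np.mean(np.abs(null_t) >= abs(t_obs))

print(f"t = {t_obs:.2f}, two-sided permutation p = {p_perm:.4f}")
```

Sign-flipping is a valid permutation scheme under the null hypothesis that each subject's effect is symmetrically distributed about zero; when the data are approximately normal the two approaches agree closely, and they diverge mainly in small or outlier-contaminated samples.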
2. Patients and methods

2.1. Participants

Fifteen healthy, native English-speaking, right-handed males (mean age 26.67 years, standard deviation [SD] 5.7 years, age range 18–35 years) who had no personal or familial psychiatric illness and no alcohol or other significant drug abuse or dependence participated in this study. The mean intelligence quotient (IQ), measured using the National Adult Reading Test, was 98.67 (SD 7.0).7 Participants were asked to abstain from any prescription medication or recreational drug use for the duration of the study, and not to consume alcohol or caffeine during the 24 hours and 12 hours, respectively, before each study day.

2.2. fMRI paradigm

The participants underwent one 6-minute event-related fMRI experiment in which they were presented with 10 different facial identities, each expressing either a neutral expression or different intensities of fear.8 There were 30 different facial stimuli in total, each of which was presented twice for 2 s each time; participants therefore viewed 60 facial stimuli in total. The order of facial identities and expression types was pseudorandomized such that the same identity or facial expression type was never presented twice in succession. During the interstimulus interval, whose duration varied from 3 s to 8 s according to a Poisson distribution (mean 5.9 s), individuals viewed a fixation cross.9

2.3. Image acquisition

Images were acquired using a 1.5 Tesla Signa system (GE Healthcare, Waukesha, WI, USA) at the Maudsley Hospital in London. T2-weighted scans were acquired with a repetition time (TR) of 2 s, an echo time (TE) of 40 ms, and a flip angle of 90° in 16 axial planes (7 mm thick), parallel to the anterior commissure (AC)–posterior commissure (PC) line. To facilitate anatomical localization of activation, a high-resolution inversion recovery image dataset was also acquired, with 3 mm contiguous slices and an in-plane resolution of 3 mm (TR 16,000 ms, inversion time [TI] 180 ms, TE 80 ms). For the purpose of this study, the images were analyzed twice using different image analysis software (SPM and XBAM).

2.4. fMRI analysis using statistical parametric mapping

The first analysis was performed using SPM software running under the Matrix Laboratory (MATLAB) 6.4 environment (MathWorks, Cambridge, UK). All volumes were realigned to the first volume, corrected for motion artefact, mean-adjusted by proportional scaling, normalized into standard stereotactic space (template provided by the Montreal Neurological Institute) and smoothed using a 6 mm full width at half maximum (FWHM) Gaussian kernel. The time series were high-pass filtered (filter width 128 s) to eliminate low-frequency components and adjusted for systematic differences across trials. The onset times (in seconds) of each trial of neutral, mildly fearful and prototypically fearful faces were convolved with a canonical hemodynamic response function. To minimize the potential confounding effects of between-condition variation in task performance, the blood oxygenation level-dependent (BOLD) response of each participant was modelled using only trials associated with correct responses. Each task condition (neutral and fearful) was then contrasted against the baseline condition (cross fixation). The whole-brain voxel threshold was set at p = 0.001, uncorrected, with an extent threshold of 20 voxels.10 Regional activation results are reported at a cluster threshold of p < 0.05, corrected.

2.5. fMRI analysis using XBAM software

The second analysis was performed using XBAM. The data were first realigned and then smoothed using a Gaussian filter (FWHM 7.2 mm).11 Responses to the experimental paradigms were detected by convolving each component of the design with each of two gamma variate functions (peak responses at 4 s and 8 s, respectively). The best fit between the weighted sum of these convolutions and the time series at each voxel was calculated using the constrained BOLD effect model.12 A goodness-of-fit statistic, the sum of squares (SSQ) ratio (the SSQ of deviations from the mean image intensity over the whole time series divided by the SSQ of deviations due to the residuals), was then computed at each voxel. The data were then permuted by a wavelet-based method to calculate the null distribution of SSQ ratios under the assumption of no experimentally determined response.13 This null distribution was used to derive the critical SSQ ratio value and thereby threshold the neurofunctional maps at a type I error rate of < 1 expected false positive cluster per brain. The detection of activated voxels was then extended from the voxel to the cluster level.14 To minimize the potential confounding effects of between-condition variation in task performance, the BOLD response data of each participant were modelled using only trials associated with correct responses. In addition to the SSQ ratio, the size of the BOLD response to each experimental condition was calculated for each individual at each voxel as a percentage of the mean resting image intensity level: the difference between the maximum and minimum values of the fitted model for each condition was expressed as a percentage of the mean image intensity level over the whole time series. The SSQ ratio maps for each individual were transformed into the standard space of Talairach and Tournoux using a two-stage warping procedure.15,16 Group activation maps were computed by determining the median SSQ ratio at each voxel (across all participants) in the observed and permuted data maps. The distribution of median SSQ ratios from the permuted data was used to derive the null distribution and the critical SSQ ratio with which to threshold group activation maps. The experimental conditions (neutral, fearful) were contrasted against the baseline condition (fixation cross). The threshold for cluster-level analysis was chosen to give < 1 false activated cluster per brain (cluster p < 0.05).

3. Results

3.1. XBAM software

During emotional processing there was increased activation in a wide neural network: the occipital lobe, including the middle occipital gyrus, fusiform gyrus, lingual gyrus and cuneus; the parietal lobe, including the post-central gyrus, precuneus and supramarginal gyrus; the temporal lobe, including the middle and inferior temporal gyri; the frontal lobe, including the middle/medial, inferior and superior frontal gyri and the pre-central gyrus; and sublobar regions such as the insula and putamen (cluster p < 0.05, voxel p = 0.01), as well as the anterior and posterior cingulate and the cerebellum bilaterally (Supplementary Fig. 1).

3.2. SPM software

The SPM analysis showed that processing of emotional faces was associated with activation in the left cuneus and lingual gyrus, the right superior occipital gyrus, the fusiform gyrus and cerebellum bilaterally, the anterior and posterior cingulate cortex, the left inferior and superior parietal lobules, and the right middle frontal, right inferior frontal and left superior frontal gyri (Supplementary Fig. 2).

4. Discussion

The principal aim of our study was to test the comparability of results derived using different image analysis packages while examining functional brain activations during emotional processing.
To address this hypothesis, we analysed the same fMRI dataset twice, using both the SPM and XBAM programs. The SPM approach (http://www.fil.ion.ucl.ac.uk/spm) is based on parametric statistical models and uses the general linear model to describe the data in terms of experimental effects, confounding effects and residual variability. XBAM is an in-house program (http://www.brainmap.co.uk/xbam.htm) that employs median rather than mean statistics to control outlier effects, and permutation rather than normal theory-based inference.17 Despite these considerable differences in statistical approach, both packages are widely used to analyze fMRI data.9 Overall, we found an overlapping pattern of brain response across the two software packages. Both analyses revealed that during the processing of emotional faces, as compared to the baseline stimulus, there was increased activation in the visual areas (occipital, fusiform and lingual gyri), in the cerebellum, in the parietal cortex, in the cingulate cortex (anterior and posterior cingulate), and in some regions of the dorsolateral and ventrolateral prefrontal cortex. Areas within the visual cortex are generally engaged in early perceptual processing of facial stimuli, which may be independent of emotional valence.18 The cerebellum is of particular interest in the field of affective neuroscience, as it is strongly connected with the reticular system (arousal), the cortical
association areas (cognitive processing of emotions), and the limbic structures (emotional experience and expression), such as the amygdala, the hippocampus and the septal nuclei.19 Previous studies have indicated that cerebellar lesions result in flattening or blunting of emotions, and cerebellar activation has been observed in response to different emotions.20–22 The engagement of this area is in line with evidence that the cerebellum has a general role in emotional processing.23 The frontal cortex, in turn, participates not only in general emotional processing, but also in the conscious experience of emotion, the inhibition of excessive emotion and the monitoring of one's own emotional state to make relevant decisions.24,25 The anterior cingulate, in particular, seems specifically involved in arousal to an emotive visual stimulus.5 This region has a key role in complex aspects of emotional processing, such as social interaction, by virtue of its connections with discrete parts of the temporal lobe and the subcortical structures that control autonomic responses.26 Overall, previous research has confirmed that the neural network described above is the most likely to be activated during the processing of an emotional human face.
These findings confirm that recognizing emotion from facial expressions draws on diverse psychological processes implemented in a large array of neural structures.27 Although the exact functional interplay between these brain areas is unclear, some authors suggest that early perceptual processing of faces draws on occipital and temporal cortices that construct detailed representations from the configuration of facial features.18 Subsequent recognition requires a set of structures, including the limbic and orbitofrontal cortices, that link perceptual representations of the face to the generation of knowledge about the emotion signaled.18 Conversely, the temporal regions, insula and putamen were activated in the XBAM analysis only. Interestingly, significant linear increases in response to increasing intensities of fear have been shown in the mesial temporal region, anterior insula and putamen.28 In particular, behavioral and neurobiological accounts suggest that the insula is relevant to neurobiological models of negative emotions, as insulae in primates contain neurons that respond to pleasant and unpleasant tastes.5,29 Some authors have speculated that whereas limbic regions are particularly involved in the emotional response to exteroceptive sensory stimuli, the mesial temporal lobe and the insular cortex are preferentially involved in the emotional response to potentially distressing stimuli, interoceptive sensory stimuli and body sensations.5 There are various ways in which the differences between the SPM and XBAM findings can be explained. For example, non-parametric approaches are less sensitive to outliers, particularly when applied to small sample sizes.17 It is therefore possible that the differential sensitivity of these packages to activations in the temporal and insular regions was driven by outliers.
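This outlier argument is easy to demonstrate numerically. The following toy sketch uses hypothetical per-subject effect sizes (not data from this study) to show how a single aberrant subject pulls a mean-based group statistic while leaving a median-based one, of the kind XBAM employs, essentially unchanged:

```python
import numpy as np

# Hypothetical per-subject BOLD effect sizes at one voxel: 14 subjects
# cluster near zero, and one outlier shows a large spurious response.
effects = np.array([0.1, -0.2, 0.0, 0.15, -0.1, 0.05, 0.2, -0.05,
                    0.1, 0.0, -0.15, 0.1, 0.05, -0.1, 3.0])

# A mean-based (parametric) group statistic is pulled toward the outlier...
print(f"mean   = {np.mean(effects):.3f}")    # prints mean   = 0.210
# ...whereas the median barely moves from the bulk of the sample.
print(f"median = {np.median(effects):.3f}")  # prints median = 0.050
```

With only 15 subjects, a single deviant time series can therefore push a voxel past a mean-based threshold while leaving a median-based statistic unaffected, which is one plausible route to the temporal and insular discrepancies observed here.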
According to some authors, mixed-effects analyses using permutation-based, cluster-level inference may be more valid for small samples than analyses involving simple random effects and voxel-level inference.30 Overall, despite their different statistical approaches, XBAM and SPM appear to be powerful tools for uncovering the underlying neurophysiological responses. Comparability between the two packages is high in the cerebellum and in visual, parietal and prefrontal areas, and low in temporal and insular regions. To our knowledge, this is the first study to provide evidence that, for certain brain areas, these two packages offer comparable results during tasks that elicit emotional processing. This is of great interest, as BOLD responses from fMRI studies employing different statistical approaches are not a priori comparable. Most research questions in affective
neuroscience can no longer be addressed by the analysis of single experiments alone, but necessitate the consolidation of results across different studies (which typically have small sample sizes and low statistical power).6 Recent advances in meta-analytical fMRI methods allow researchers to pool fMRI studies utilizing different statistical analyses to conduct powerful computations in large voxel-based databases.31 However, the heterogeneity of the analytical methods employed across studies is a major limitation for fMRI meta-analyses. Our results show, for the first time, that the comparison of findings across fMRI studies employing emotional faces is not straightforward, and should take into consideration the different statistical approaches as a confounding variable. We also hope to encourage affective neuroscience research towards a better identification of neurobiological endophenotypes of major psychiatric diseases. Facial emotional stimuli may serve as a valid tool, tapping into the neural networks implicated in emotional processing under different psychiatric conditions.32 For instance, depressed subjects show a state-related bias toward negative emotional cues, and away from positive emotional cues.33 Consistent with these observations, there are data to suggest an analogous, state-related recognition bias for negative emotions in mania.34 In schizophrenia, specific alterations of emotional face processing include a bias toward threat-related emotional material, which may be accorded increased significance by delusion-prone individuals; it is possible that this bias is involved in the formation of delusional beliefs.1,35,36 Future fMRI meta-analyses need to control their results for the imaging method employed in order to depict true neurofunctional maps of psychiatric disorders. The limitations of our study should be acknowledged.
Although we have shown that SPM and XBAM can reveal comparable brain maps of emotional processing at the level of a single-group analysis, this may not apply to other cognitive or emotional paradigms or to between-group contrasts.

5. Conclusions

The comparison of findings across fMRI studies employing emotional faces is not straightforward, and should take into consideration the problem of different statistical approaches as a confounding variable. Future studies exploring the comparability of different image analysis packages are necessary to reduce the sources of heterogeneity of fMRI studies, and to sustain ongoing research in basic and clinical neurosciences.

Appendix A. Supplementary data

Supplementary data associated with this article can be found in the online version, at doi:10.1016/j.jocn.2009.06.027.

References

1. Phillips ML, David AS. Facial processing in schizophrenia and delusional misidentification: cognitive neuropsychiatric approaches. Schizophr Res 1995;17:109–14.
2. Frank MG, Stennett J. The forced-choice paradigm and the perception of facial expressions of emotion. J Pers Soc Psychol 2001;80:75–85.
3. Grossmann T, Johnson MH. The development of the social brain in human infancy. Eur J Neurosci 2007;25:909–19.
4. Peelen MV, Downing PE. The neural basis of visual body perception. Nat Rev Neurosci 2007;8:636–48.
5. Husted DS, Shapira NA, Goodman WK. The neurocircuitry of obsessive–compulsive disorder and disgust. Prog Neuropsychopharmacol Biol Psychiatry 2006;30:389–99.
6. Neumann J, von Cramon DY, Lohmann G. Model-based clustering of meta-analytic functional imaging data. Hum Brain Mapp 2008;29:177–92.
7. Willshire D, Kinsella G, Prior M. Estimating WAIS-R IQ from the National Adult Reading Test: a cross-validation. J Clin Exp Neuropsychol 1991;13:204–16.
8. Young A, Perret D, Calder A, et al. Facial expressions of emotion: stimuli and test (FEEST). Edmunds, Suffolk, England: Thames Valley Test Company; 2002.
9. Surguladze S, Brammer MJ, Keedwell P, et al. A differential pattern of neural response toward sad versus happy facial expressions in major depressive disorder. Biol Psychiatry 2005;57:201–9.
10. Crippa JA, Zuardi AW, Garrido GE, et al. Effects of cannabidiol (CBD) on regional cerebral blood flow. Neuropsychopharmacology 2004;29:417–26.
11. Bullmore ET, Brammer MJ, Rabe-Hesketh S, et al. Methods for diagnosis and treatment of stimulus-correlated motion in generic brain activation studies using fMRI. Hum Brain Mapp 1999;7:38–48.
12. Friman O, Borga P, Lundberg P, et al. Adaptive analysis of fMRI data. Neuroimage 2003;19:837–45.
13. Phillips ML, Medford N, Young AW, et al. Time courses of left and right amygdalar responses to fearful facial expressions. Hum Brain Mapp 2001;12:193–202.
14. Bullmore ET, Suckling J, Overmayer S, et al. Global, voxel and cluster tests, by theory and permutation, for a difference between two groups of structural MR images of the brain. IEEE Trans Med Imaging 1999;18:32–42.
15. Talairach J, Tournoux P. Co-planar stereotaxic atlas of the human brain. New York: Thieme Publishing Group; 1988.
16. Phillips ML, Young AW, Senior C, et al. A specific neural substrate for perceiving facial expressions of disgust. Nature 1997;389:495–8.
17. Brammer MJ, Bullmore ET, Simmons A, et al. Generic brain activation mapping in functional magnetic resonance imaging: a nonparametric approach. Magn Reson Imaging 1997;15:763–70.
18. Adolphs R. Neural systems for recognizing emotion. Curr Opin Neurobiol 2002;12:169–77.
19. Baillieux H, De Smet HJ, Paquier PF, et al. Cerebellar neurocognition: insights into the bottom of the brain. Clin Neurol Neurosurg 2008;110:763–73.
20. Schmahmann J. The role of cerebellum in affect and psychosis. J Neurolinguistics 2000;13:189–214.
21. Reiman E, Lane R, Ahern G, et al. Neuroanatomical correlates of externally and internally generated human emotion. Am J Psychiatry 1997;154:918–25.
22. Sacchetti B, Baldi E, Lorenzini CA, et al. Cerebellar role in fear-conditioning consolidation. Proc Natl Acad Sci USA 2002;99:8406–11.
23. Turner BM, Paradiso S, Marvel CL, et al. The cerebellum and emotional experience. Neuropsychologia 2007;45:1331–41.
24. Phan KL, Wager T, Taylor SF, et al. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. Neuroimage 2002;16:331–48.
25. Scheuerecker J, Frodl T, Koutsouleris N, et al. Cerebral differences in explicit and implicit emotional processing – an fMRI study. Neuropsychobiology 2007;56:32–9.
26. Rudebeck PH, Bannerman DM, Rushworth MF. The contribution of distinct subregions of the ventromedial frontal cortex to emotion, social behavior, and decision making. Cogn Affect Behav Neurosci 2008;8:485–97.
27. Vuilleumier P, Pourtois G. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia 2007;45:174–94.
28. Surguladze SA, Brammer MJ, Young AW, et al. A preferential increase in the extrastriate response to signals of danger. Neuroimage 2003;19:1317–28.
29. Calder AJ, Keane J, Manes F, et al. Impaired recognition and experience of disgust following brain injury. Nat Neurosci 2000;3:1077–8.
30. Thirion B, Pinel P, Meriaux S, et al. Analysis of a large fMRI cohort: statistical and methodological issues for group analysis. Neuroimage 2007;35:105–20.
31. Laird AR, Fox PM, Price CJ, et al. ALE meta-analysis: controlling the false discovery rate and performing statistical contrasts. Hum Brain Mapp 2005;25:155–64.
32. Drevets WC, Price JL, Furey ML. Brain structural and functional abnormalities in mood disorders: implications for neurocircuitry models of depression. Brain Struct Funct 2008;213:93–118.
33. Leppanen JM. Emotional information processing in mood disorders: a review of behavioral and neuroimaging findings. Curr Opin Psychiatry 2006;19:34–9.
34. Lennox BR, Jacob R, Calder AJ, et al. Behavioural and neurocognitive responses to sad facial affect are attenuated in patients with mania. Psychol Med 2004;34:795–802.
35. Green MJ, Williams LM, Davidson DJ. Processing of threat-related affect is delayed in delusion-prone individuals. Br J Clin Psychol 2001;40:157–65.
36. Kline JS, Smith JE, Ellis HC. Paranoid and nonparanoid schizophrenic processing of facially displayed affect. J Psychiatr Res 1992;26:169–82.