Neuropsychologia 132 (2019) 107141
Cerebellar contribution to vocal emotion decoding: Insights from stroke and neuroimaging
Marine Thomasson a,b, Arnaud Saj c,d, Damien Benis a,b, Didier Grandjean b, Frédéric Assal c,e, Julie Péron a,b,c,*
a Clinical and Experimental Neuropsychology Laboratory, Department of Psychology and Educational Sciences, University of Geneva, Switzerland
b Neuroscience of Emotion and Affective Dynamics Laboratory, Department of Psychology and Swiss Center for Affective Sciences, University of Geneva, Switzerland
c Cognitive Neurology Unit, Department of Neurology, University Hospitals of Geneva, Geneva, Switzerland
d Department of Psychology, University of Montréal, Montréal, QC, Canada
e Faculty of Medicine, University of Geneva, Switzerland
ARTICLE INFO

Keywords: Emotional prosody; Cerebellum; Stroke; Voxel-based lesion-symptom mapping

ABSTRACT
While the role of the cerebellum in emotion recognition has been explored with facial expressions, its involvement in the auditory modality (i.e., emotional prosody) remains to be demonstrated. The present study investigated the recognition of emotional prosody in 15 patients with chronic cerebellar ischaemic stroke and 15 matched healthy controls, using a validated task, as well as clinical, motor, neuropsychological, and psychiatric assessments. We explored the cerebellar lesion-behaviour relationship using voxel-based lesion-symptom mapping. Results showed a significant difference between the stroke and healthy control groups, with patients giving erroneous ratings on the Surprise scale when they listened to fearful stimuli. Moreover, voxel-based lesion-symptom mapping revealed that these emotional misattributions correlated with lesions in right Lobules VIIb, VIIIa,b and IX. Interestingly, the posterior cerebellum has previously been found to be involved in affective processing, and Lobule VIIb in rhythm discrimination. These results point to the cerebellum's functional involvement in vocal emotion decoding.
1. Introduction While the cerebral cortex, which represents more than 80% of the brain’s mass, contains only 20% of its total number of neurons, the cerebellum holds more than 68 billion neurons (∼70%) (Diedrichsen et al., 2009; Manto, 2010; Schmahmann et al., 1999; Voogd and Glickstein, 1998). Paradoxically, human neuroscience research has largely neglected the cerebellum in favour of the cortex (Berridge, 2003). As a consequence, extremely little is known about the precise role of the cerebrum parvum in human behaviour, especially non-motor functions such as affective abilities, even though researchers first tentatively posited that the cerebellum plays a major role in human emotion back in the 1970s (Schmahmann and Sherman, 1998). Emergent research in this field is yielding some very interesting results,
albeit embryonic (e.g., Adamaszek et al., 2014; Ferrucci et al., 2012; Heilman et al., 2014; Schraa-Tam et al., 2012; Turner et al., 2007). These studies point to the cerebellum's functional involvement in human emotion, notably its recognition. The activity of the posterior cerebellum (vermis and predominantly right hemisphere), as well as the fastigial nuclei, is of particular interest. Moreover, a number of anatomical and neuroimaging studies point to the cerebellum's functional integration in the neural network involved in emotion processing (i.e., orbitofrontal, cingulate, temporal and insular cortices, amygdala and basal ganglia; Anand et al., 1959; Bostan et al., 2010, 2013; Bostan and Strick, 2010; Schmahmann and Pandya, 1989, 1990, 1991, 1992, 1993, 1995, 1997; Vilensky and van Hoesen, 1981). Emotional prosody (i.e., vocal expression of emotion) refers to the
Abbreviations: AES, Apathy Evaluation Scale; AIC, Akaike information criterion; BDI-II, Beck Depression Inventory; BIC, Bayesian information criterion; DWI, diffusion-weighted imaging; FAB, Frontal Assessment Battery; FLAIR, fluid-attenuated inversion recovery; HC, healthy controls; MDRS, Mattis Dementia Rating Scale; MOCA, Montreal Cognitive Assessment; PEGA, Montreal-Toulouse auditory agnosia battery; PWI, perfusion-weighted imaging; SARA, Scale for the Assessment and Rating of Ataxia; TAS-20, Toronto Alexithymia Scale; VLSM, voxel-based lesion-symptom mapping ∗ Corresponding author. Clinical and Experimental Neuropsychology Laboratory, Faculté de Psychologie et des Sciences de l’Education, Université de Genève, 40 bd du Pont d’Arve, 1205, Geneva, Switzerland. E-mail address:
[email protected] (J. Péron). https://doi.org/10.1016/j.neuropsychologia.2019.107141 Received 28 December 2018; Received in revised form 3 July 2019; Accepted 8 July 2019 Available online 12 July 2019 0028-3932/ © 2019 Elsevier Ltd. All rights reserved.
segmental and suprasegmental changes that occur in the course of a spoken utterance, affecting various physical properties of the sound, such as its amplitude, timing, and fundamental frequency (F0). For example, happiness is typically characterized by a rapid speech rate, high intensity, and high F0 mean and variability, making vocalizations sound both melodic and energetic. By contrast, sad vocalizations are characterized by a slow speech rate, low intensity, and low F0 mean and variability. Emotional prosody recognition has been shown to correlate with perceived modulations of these different acoustic features. The perception and decoding of vocal emotions involve a distributed neural network (for reviews, see Frühholz and Grandjean, 2012; Schirmer and Kotz, 2006), including a prefrontal-(auditory) temporal network, in addition to the amygdala (Frühholz et al., 2012; Grandjean et al., 2005; Sander et al., 2005) and basal ganglia, notably the striatum (e.g., Kotz et al., 2003) and subthalamic nucleus (for a review, see Péron et al., 2013). The question of hemispheric specialization is still subject to debate (for a review, see Stirnimann et al., 2018), and current models (Schirmer and Kotz, 2006; Wildgruber et al., 2009) assume that emotional prosody recognition is a multistep process mediated by bilateral mechanisms, at both the cerebral and subcortical levels.

Over the past few years, the results of neuroanatomical, neuroimaging, and clinical studies have indicated that the cerebellum's contribution to vocal emotion decoding has an anatomical substrate. Neuroanatomical animal studies have uncovered bidirectional pathways connecting the cerebellum to important parts of the cerebral cortex, as well as to subcortical structures involved in emotional prosody processing. These studies have shown strong prefrontal-cerebellar connectivity, in which the prefrontal cortex connects to the (contralateral) cerebellum via the pontine nuclei. In the other direction, the cerebellum sends projections to the (contralateral) prefrontal cortex via the dentate nucleus and thalamus (Schmahmann and Pandya, 1995, 1997). More specifically, Lobule VII (largely hemispheric Crus II, but also vermis) has been shown both to send and to receive projections from the prefrontal cortex (Brodmann area 46), forming a closed-loop circuit (Bostan et al., 2013). Connectivity between the cerebellum and subcortical structures involved in emotion has also been demonstrated, notably with the basal ganglia (Bostan et al., 2010; Bostan and Strick, 2010) and possibly also with the amygdaloid nuclei (Anand et al., 1959). Using retrograde transneuronal transport of a rabies virus in cebus monkeys to determine the origin of multisynaptic inputs to the injection sites, Bostan et al. (2010) recently found that the subthalamic nucleus has a substantial disynaptic projection to the cerebellar cortex. They also identified connections from the cerebellum to the basal ganglia, showing that the dentate nucleus has a disynaptic projection to the striatum. Finally, it is noteworthy that reciprocal connections link the cerebellum to brainstem areas containing neurotransmitters involved in affective regulation, including serotonin, norepinephrine, and dopamine (Dempesy et al., 1983). Taken together, these neuroanatomical results point to the existence of substantial two-way communication pathways between the cerebellum and structures known to be involved in the recognition of emotions, constituting an anatomical substrate for the cerebellum's contribution to affective processing.

To our knowledge, no neuroimaging study has ever specifically and directly addressed the question of the cerebellum's functional involvement in human vocal emotion perception. Although numerous studies exploring the neural substrates of vocal emotion perception have reported peaks of activation in the cerebellum, these results have often been dismissed as incidental. A meta-analysis of more than 100 fMRI studies (total sample: 1600 healthy participants) of the cerebral basis of emotion recognition (which therefore did not specifically address the question of the cerebellum's functional specialization in vocal emotion) reported that significant cerebellar activity is observable across all emotional conditions (Fusar-Poli et al., 2009). Nevertheless, its functional role in this process remains largely overlooked. When Stoodley and Schmahmann (2009) carried out an initial meta-analysis to specifically address the question of the cerebellum's functional involvement in affective processes, they concluded that emotional processes induce posterior vermal and fastigial nucleus activity. While interesting, this was only a preliminary overview, for whereas the above-cited literature review (Fusar-Poli et al., 2009) included 100 studies reporting cerebellar activation (Wager et al., 2008), Stoodley and Schmahmann (2009) identified just nine – all in the visual modality. Moreover, their meta-analysis considered affective processes that resulted from viewing and processing IAPS pictures, thus restricting these processes to their subjective feeling component. Using this material also brought methodological limitations, as spatial frequencies (i.e., parameter reflecting how rapidly a property changes in space) have been shown to differ significantly between emotional and neutral images (Delplanque, N'Diaye, Scherer and Grandjean, 2007). This could induce a perceptual bias, especially when studying the physiological components of emotion processing.

Historically, the first clues that led researchers to suspect cerebellar involvement in human emotion came from clinical observations. In human adults, cerebellar cognitive affective syndrome (Schmahmann and Sherman, 1998) is now a recognized clinical entity associated with blunting of affect. This syndrome has been attributed to damage to the posterior vermis, which reduces the cerebellum's contribution to perisylvian cortical areas via its outflow to the ventral-tier thalamic nuclei (Stoodley and Schmahmann, 2010). Since the seminal work of Schmahmann (1998), a number of clinical observations have shown that cerebellar lesions cause affective disorders such as pathological laughter and crying (Parvizi et al., 2007), and that cerebellar dysfunction is accompanied by various mental and affective disorders such as autism, schizophrenia and depression (for a review, see Villanueva, 2012).

To the best of our knowledge, only one group study has so far specifically addressed the question of the cerebellum's involvement in vocal emotion processing by studying patients with cerebellar disorders (Adamaszek et al., 2014). These authors explored the vocal modality in 15 patients with focal cerebellar lesions due to ischaemic stroke. They reported impaired emotion recognition, and indicated that these deficits were correlated neither with clinical and demographic features, nor with the patients' cognitive and psychological profiles. In contrast to these findings, a case report that examined the production and recognition of vocal emotions in a patient with idiopathic cerebellar degeneration (Heilman et al., 2014) failed to find any deficit in the recognition of emotions, whereas production was severely impaired. This discrepancy needs further investigation, but may be partially explained by the different numbers of participants included in each study (group study vs. case report), as well as plasticity/compensation mechanisms that may obviously differ between chronic degenerative disease and ischaemic stroke.

Finally, interesting evidence related to the probable role of the cerebellum in vocal emotion processing comes from vocal learning studies in humans and birds, which have demonstrated that the cerebellum plays an important role in this developmental process (Hull, 2018). As far as humans are concerned, this region is highly active during speech, and speech acquisition is delayed in children with cerebellar dysfunction (Ziegler and Ackermann, 2017). Moreover, disruption of cerebellar output in early life seems to be sufficient to impact the expression of cognitive and social abilities in adulthood (Badura et al., 2018). As far as birds are concerned, Pidoux et al. (2018) recently showed that the cerebellum influences the basal ganglia-thalamo-cortical loop, a necessary region for song learning and plasticity in birds, through its subcortical connection to the song-related basal ganglia nucleus. These authors suggested that the songbird model could be used to find out more about the interactions between the cerebellum and basal ganglia, and thus to improve understanding of pathologies caused by abnormal interactions between these regions, such as Parkinson's disease.

Taken together, these neuroanatomical, neuroimaging, and clinical results seem to indicate the functional involvement of the cerebellum in vocal emotion perception. However, in addition to discrepancies in
results, a number of important methodological issues dramatically limit the inferences that can be drawn from these studies. These limitations mainly concern the design of the tasks (both emotional and control) and the choice of stimuli. First, the number of stimuli presented to the patients (especially in Adamaszek et al., 2014; Heilman et al., 2014) was critically low, and probably not sufficient to obtain the variance needed to guarantee the correct use of the statistical tests chosen by the researchers. Second, these auditory stimuli consisted of sentences with semantic content, which are known to induce bias when studying emotional prosody. It is preferable to use pseudosentences, precisely to avoid a possible confound with semantic content (Juslin and Scherer, 2005). Third, the tasks that were used (e.g., discrimination or categorization) are far less sensitive to emotional effects than visual (continuous) analogue scales, chiefly because they can induce ceiling effects and/or categorization biases (Scherer and Ekman, 2008). Fourth and last, none of the studies controlled for basic sensory processing in these patients. The absence of a basic auditory impairment is a prerequisite for investigating the recognition of vocal emotions.

In this context, the aim of the present study was to study the cerebellum's involvement in the recognition of emotional prosody. To this end, we assessed the vocal emotion recognition performances of 15 patients with focal cerebellar lesions due to ischaemic stroke, comparing them with a group of 15 matched healthy control participants (HC). We used a previously validated methodology that has been shown to be sensitive enough to detect even slight emotional impairments (Péron et al., 2010, 2011, 2014). This features visual (continuous) analogue scales, which do not induce categorization biases, contrary to categorization and forced-choice tasks (naming of emotional faces and emotional prosody).

Based on the only clinical group study to have explored the recognition of vocal expressions in patients with cerebellar disorders (Adamaszek et al., 2014), we predicted that patients would exhibit a deficit in the recognition of vocal expressions (for all emotions, but not for neutral), compared with the HC group. Regarding anatomical-functional predictions, no previous neuroimaging study had explored the role of the cerebellum in the recognition of vocal emotions. However, based on the earlier meta-analysis (Stoodley and Schmahmann, 2009), as well as on neuroanatomical animal studies (Schmahmann and Pandya, 1995, 1997), we postulated that posterior (vermal and right hemispheric) cerebellar lesions would best explain the severity of these emotional disturbances, such that the greater the confusion in the emotion ratings, the more extensive the lesions located in these regions.

2. Participants and methods

2.1. Participants (Table 1)

One group of 15 patients with focal cerebellar lesions due to ischaemic stroke and one group of 15 HC took part in the study. The number of participants in each group was determined by a power analysis based on a pilot study with eight participants, using the simR package. This power analysis was performed on the generalized linear mixed model that was subsequently used in the statistical analysis. The estimated power crossed the 80% threshold for each main effect and covariate, as well as for the three-way interaction of interest, for 30 participants (i.e., 15 in each group). The clinical group was recruited at the Neurology Department of the University Hospitals of Geneva, and included nine men and six women with first-ever ischaemic stroke (> 3 months prior to enrolment, corresponding to chronic stroke). Their mean age was 63.5 years (SE = 9.61, range = 50–77) and they were all French speakers. According to the criteria of the Edinburgh Handedness Inventory (Oldfield, 1971), 13 were right-handed and two were left-handed. Their mean education level was 15.8 years (SE = 5.06, range = 9–22). Eight patients had right cerebellar hemisphere damage, while seven had left

Table 1
Clinical, demographic, and neuropsychological data for the group of patients with cerebellar stroke.
[The per-patient values could not be reliably reconstructed from the source layout. Columns: Patient; Sex; Age; Handedness; Education (years); Side of lesion; SARA; MOCA; FAB; Cat. fluency; Act. fluency; BDI-II; TAS-20; AES; Lesion volume (voxels); Lesion location.]
Note. Dashes indicate missing values. Act. fluency: action verb fluency task; AES: Apathy Evaluation Scale; BDI-II: Beck Depression Inventory; Cat. fluency: categorical fluency task; FAB: Frontal Assessment Battery; MOCA: Montreal Cognitive Assessment; PEGA: Montreal-Toulouse auditory agnosia battery; SARA: Scale for the Assessment and Rating of Ataxia; TAS-20: Toronto Alexithymia Scale. # scores below the clinical threshold.
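The simulation-based power analysis described in the Participants section was run in R with the simR package on the full GLMM. As a rough illustration of the logic only, the sketch below collapses the design to a two-sample comparison of participant mean ratings; the effect size, random-intercept SD, trial noise, and trial count are illustrative assumptions, not the study's estimates.

```python
import numpy as np
from scipy import stats

def simulated_power(n_per_group, effect=8.0, sd_subject=6.0, sd_noise=12.0,
                    n_trials=12, n_sims=500, alpha=0.05, seed=0):
    """Estimate power for a between-group difference in mean ratings.

    Simplified stand-in for a simR-style GLMM power simulation: each
    simulated participant contributes n_trials ratings around a random
    participant intercept, and groups are compared on participant means.
    """
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        # random participant intercepts for the two groups
        b_pat = rng.normal(0.0, sd_subject, n_per_group)
        b_hc = rng.normal(0.0, sd_subject, n_per_group)
        # trial-level ratings, averaged per participant
        pat = (effect + b_pat[:, None]
               + rng.normal(0.0, sd_noise, (n_per_group, n_trials))).mean(axis=1)
        hc = (b_hc[:, None]
              + rng.normal(0.0, sd_noise, (n_per_group, n_trials))).mean(axis=1)
        if stats.ttest_ind(pat, hc).pvalue < alpha:
            hits += 1
    return hits / n_sims

# power rises with group size; the study targeted the 80% threshold
print(simulated_power(15))
```

Under these assumed parameters, 15 participants per group sits near the 80% threshold, which is the kind of criterion the authors report crossing with 30 participants in total.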
Table 2
Mean ratings and 95% confidence intervals (CI95%) on the emotion scales in the emotional prosody recognition task for the stroke and HC groups. Rows indicate the emotion expressed by the stimulus; cells give Mean (CI95%).

Stroke (n = 15)
Stimulus    Happiness scale     Fear scale          Sadness scale       Anger scale         Neutral scale       Surprise scale
Anger       2.57 (−1.7; 6.8)    9.33 (5.1; 13.6)    3.65 (−0.6; 7.9)    54.61 (50.3; 58.9)  4.75 (0.5; 9.0)     13.97 (9.7; 18.2)
Fear        1.20 (−3.0; 5.5)    43.72 (39.5; 48.0)  15.23 (11.0; 19.5)  13.46 (9.2; 17.7)   3.97 (−0.3; 8.2)    21.52* (17.3; 25.8)
Happiness   30.80 (26.5; 35.1)  12.75 (8.5; 17.0)   12.51 (8.2; 16.8)   11.32 (8.5; 17.0)   3.66 (−0.6; 7.9)    20.11 (15.8; 24.4)
Neutral     8.43 (4.2; 12.7)    3.18 (−1.1; 7.4)    4.93 (0.7; 9.2)     2.88 (−2.7; 5.8)    37.64 (33.4; 41.9)  14.66 (10.4; 18.9)
Sadness     1.64 (−2.6; 5.9)    14.68 (10.4; 18.9)  41.82 (37.6; 46.1)  3.16 (−1.2; 7.3)    21.18 (16.6; 25.4)  5.21 (0.9; 9.8)

HC (n = 15)
Stimulus    Happiness scale     Fear scale          Sadness scale       Anger scale         Neutral scale       Surprise scale
Anger       3.00 (−1.2; 7.2)    9.39 (5.1; 13.6)    4.45 (0.2; 8.7)     54.52 (50.3; 58.8)  5.64 (1.4; 9.9)     13.57 (9.3; 17.8)
Fear        2.97 (−1.3; 7.2)    50.72 (46.5; 55.0)  20.71 (16.4; 25.0)  14.04 (9.8; 18.3)   4.28 (0.0; 8.5)     11.17 (6.9; 15.4)
Happiness   39.13 (34.9; 43.4)  7.43 (3.2; 11.7)    10.78 (6.5; 15.0)   11.32 (7.1; 15.6)   3.50 (−0.8; 7.8)    22.04 (17.8; 26.3)
Neutral     14.29 (10.0; 18.5)  1.91 (−2.3; 6.2)    4.74 (0.5; 9.0)     2.88 (−1.4; 7.1)    39.62 (35.4; 43.9)  16.20 (11.9; 20.4)
Sadness     1.71 (−2.5; 5.2)    13.91 (9.7; 18.2)   46.96 (42.7; 50.5)  3.17 (−1.1; 7.4)    21.16 (16.9; 25.4)  4.03 (−0.2; 8.3)
* Significant if p value below .05 (FDR corrected) in comparison with HC group.
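The false discovery rate correction mentioned in this footnote (and applied to the between-group contrasts in the Statistical analysis section) is the Benjamini-Hochberg procedure. The authors worked in R; the following Python sketch is purely illustrative, with made-up raw p values:

```python
import numpy as np

def fdr_bh(pvals):
    """Benjamini-Hochberg adjusted p values (the 'FDR correction')."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    # scale the k-th smallest p value by m/k
    ranked = p[order] * m / np.arange(1, m + 1)
    # enforce monotonicity from the largest p value downwards
    adj = np.minimum.accumulate(ranked[::-1])[::-1]
    out = np.empty(m)
    out[order] = np.clip(adj, 0.0, 1.0)
    return out

# toy example: raw p values for several group contrasts
raw = [0.004, 0.03, 0.02, 0.40, 0.55]
print(fdr_bh(raw))
```

A contrast is reported as significant when its adjusted value stays below .05, as for the Surprise-scale rating of fearful stimuli marked * in Table 2.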
et al., 1994) was used to assess alexithymia, and the Apathy Evaluation Scale (AES, Marin et al., 1991) to assess the potential presence of apathetic symptoms, a frequent finding in patients with cerebellar stroke (Villanueva, 2012).
hemispheric stroke. Exclusion criteria were 1) brainstem or occipital lesion (factor influencing clinical signs), 2) one or more other brain lesions, 3) diffuse and extensive white-matter disease, 4) other degenerative or inflammatory brain disease, 5) confusion or dementia, 6) major psychiatric disease, 7) the wearing of hearing aids or a history of tinnitus or a hearing impairment, as attested by the Montreal-Toulouse auditory agnosia battery (PEGA) (Agniel et al., 1992) (mean total score = 28.8, SE = 1.4, range = 27–30), 8) age below 18 years, and 9) major language comprehension deficits precluding reliable testing. All the tasks described below were designed to be highly feasible even for patients in clinical settings. The HC group consisted of eight men and seven women who were recruited from the general population and were given no reward for their participation. They had no history of neurological disorders, head trauma, anoxia, stroke or major cognitive deterioration, as attested by their score on the Mattis Dementia Rating Scale (MDRS, Mattis, 1988) (mean score = 141.7, SE = 1.7, range = 144-139). They were all French speakers with a mean age of 55.1 years (SE = 2.5, range = 30–65). According to the Edinburgh Handedness Inventory criteria (Oldfield, 1971), 12 HC participants were right-handed and three were left-handed. Their mean education level was 14.3 years (SE = 1.10, range = 11–19). As with the patient group, none of the HC wore hearing aids or had a history of tinnitus or a hearing impairment, as attested by the PEGA (mean score = 29.8, SE = 0.4, range = 29–30). The two groups were comparable for sex (z = −0.34, p = .73), age (z = 1.6, p = .10), education level (z = 0.90, p = .33), and handedness (z = −0.45, p = .65).
3.2. Vocal emotion recognition We used a validated emotional prosody recognition task that has proved to be relevant for studying the emotional effects of disorders such as major depression (Péron et al., 2011) or Parkinson’s disease (Péron et al., 2010) in clinical populations. A set of 60 vocal stimuli were extracted from a validated database (Banse and Scherer, 1996) consisting of meaningless speech (pseudowords), obtained by concatenating different syllables found in Indo-European languages so that they would be perceived as natural utterances, with emotional intonations common to different cultures but no semantic content. The vocal stimuli did not differ significantly in duration, mean acoustic energy expended, or standard deviation of the mean energy of the sounds (Péron et al., 2010). We selected utterances produced by 12 different actors (six women and six men), each expressed with five different prosodies (anger, fear, happiness, neutral, and sadness), resulting in 60 stimuli. Actor sex and identity were counterbalanced across the experimental conditions. This paradigm enabled us to collect dependent variables with visual continuous measures that did not induce categorization biases (Scherer and Ekman, 2008), and had the advantage of yielding information about possible confusions or misattributions. All stimuli were presented bilaterally through stereo headphones using an Authorware program. Participants were required to listen to each stimulus, after which they were asked to rate its emotional content on a set of visual analogue scales displayed simultaneously on a computer screen. More specifically, they were instructed to judge the extent to which different emotions were each expressed, by moving a cursor along a visual analogue scale ranging from No emotion expressed to Emotion expressed with exceptional intensity. 
Participants rated each stimulus on six scales: one scale for each emotion presented (anger, fear, happiness, and sadness) and one for the neutral utterance, plus a scale to rate the surprise emotion. We included a surprise scale to see whether the fear emotion expressed by the human voice can be confused with surprise, as is the case with facial expressions (Ekman, 2003; Scherer and Ellgring, 2007). No time limit was imposed on participants, who could replay each stimulus up to 6 times, but could not go back to modify their answers. An example of the computer interface used for the emotional prosody recognition task is provided in Appendix 1. The entire protocol was completed within a single session lasting approximately 90 min. To avoid a lengthier session, the two questionnaires (BDI-II and TAS-20) were handed out at the end of the
3. Methods 3.1. Clinical examination (Table 1) Prior to the emotional prosody assessment, all the patients were scored by a board certified neurologist (F.A.) on the Scale for the Assessment and Rating of Ataxia (SARA, Schmitz-Hubsch et al., 2006), which allows for a semi-quantitative assessment of cerebellar ataxia. A short neuropsychological and psychiatric battery was also administered to patients by either board certified neuropsychologists (J.P. and A.S.) or a trainee neuropsychologist (M.T.) supervised by the board certified neuropsychologists. This battery included the Montreal Cognitive Assessment (MOCA, Nasreddine and Patel, 2016) and a series of tests assessing executive functions: the Frontal Assessment Battery (FAB, Dubois et al., 2000), categorical and literal fluency tasks (Cardebat et al., 1990), and an action verb fluency task (Woods et al., 2005). Depression was assessed on the Beck Depression Inventory (BDI-II, Steer et al., 2001). The Toronto Alexithymia Scale (TAS-20, Bagby 4
Neuropsychologia 132 (2019) 107141
M. Thomasson, et al.
multiple regression VLSM. More specifically, by exploiting the natural variability in the presence versus absence of a given combination of symptoms in the same patients, then mapping the presence/absence of a lesion in a given brain area as a function of these symptoms, our approach combining principal component analysis and voxel-wise mapping allowed us to pinpoint the specific neural substrates associated with emotional deficits after brain lesions. The left and right patients were included in the same analysis. We did not flip the left lesions to the right in this analysis, in order to retain the ability to discern potential lateralization effects. Areas (minimum cluster size > 5 voxels) showing significant correlations with behavioural scores were identified using the FDR-corrected threshold of p = .001. The resulting statistics were mapped onto Montreal Neurological Institute standardized brain templates and colour coded.
session, together with a stamped addressed envelope, so that they could be completed by patients at home. 3.3. Standard protocol approvals, registration, and patient consent Written informed consent was obtained from each participant, the study met the ethical standards of the responsible committee on human experimentation, and was conducted in accordance with the Declaration of Helsinki. 3.4. Neuroradiological assessment All stroke patients underwent a detailed magnetic resonance imaging (MRI) protocol during their regular clinical work-up, including: standard T1-weighted and T2-weighted fluid-attenuated inversion-recovery (FLAIR) scans, diffusion-weighted imaging (DWI), plus perfusion-weighted imaging (PWI) whenever possible. These MRI image sequences are routinely performed for stroke assessment. FLAIR images provide high resolution and high sensitivity for acute cerebral infarcts, while DWI and PWI are more sensitive for the detection of early ischemic signs, with DWI showing high accuracy in predicting final lesion size, and PWI showing regions with acute hypoperfusion that do not necessarily become infarcted (Hillis et al., 2002).
4. Results 4.1. Clinical examination (Table 1) As shown in Table 1, none of the 15 patients displayed signs of ataxia (SARA scores between 0 and 7.5, 4 missing values). Concerning executive functioning, nine patients had abnormal FAB scores (< 16, 4 missing values), and nine had abnormal categorical fluency scores (z score < −1.6). Moreover, three patients had abnormal action verb fluency scores (z score < −1.6). As far as mood is concerned, none of the patients displayed signs of apathy (AES score < 18, 4 missing values). However, clinical examination showed that four of them had moderate depression (BDI-II score: 19–29, 1 missing value), and three had alexythimia (TAS-20 score > 61, 5 missing values).
3.5. Statistical analysis 3.5.1. Behavioural data For the vocal emotion recognition data, we performed two levels of analysis based on the same statistical model. First, we compared the performances of the patient versus HC groups. To do so, we used a generalized linear mixed model with emotion (5 levels) and scale (6 levels) as the within-participants variables, group (HC and stroke) as the between-participants variable, and participant as the random factor. Second, we ran contrasts between the groups for each prosodic category and each scale, based on the GLMM model using the phia package in R. This type of statistical model allows for the control of random effects such as inter-individual variability, in addition to fixed effects. Each p value yielded by the contrasts was false discovery rate (FDR) corrected. Moreover, we looked for correlations between the clinical and emotional data of the patient group using Spearman’s rank test. To avoid Type-I errors, we only included emotional variables that differed significantly between HC and patients.
4.2. Vocal emotion recognition (Table 2 and Suppl. Table) Interestingly, using a generalized linear mixed model (performed with emotion and scale as within-participants variables and group as a between-participants variable), analysis revealed a significant Group x Emotion x Scale interaction (F (20, 10712) = 2.69, p < .001), showing that the stroke and HC groups had different patterns of responses for different emotions. Other main and interaction effects were as follows: Emotion (F (4, 10712) = 11.10, p < .001), Scale (F (5, 10712) = 23.60, p < .001), Group (F (1, 26) = 0.09, p = .76), Group x Scale (F (5, 10712) = 2.89, p = .01), Emotion x Scale (F (20, 10712) = 234.57, p < .001), and Group x Emotion (F (4, 10712) = 0.32, p = .86). As some participants were left-handed, handedness has been added in the GLMM analysis as a factor. Its effect was not significant (F (1, 23) = 0.20, p = .66). Contrasts (performed between the groups for each prosodic category and each scale with FDR-corrected p value) revealed that the Group x Emotion x Scale interaction effect was driven by the fact that, compared with HC, patients provided higher ratings on the Surprise scale when they listened to fearful stimuli, (p = .04 corrected). The emotional prosody recognition data for each patient are provided in the Supplementary Table.
3.5.2. Cerebellar lesion-behaviour relationship

We used a procedure previously described for the analysis of brain lesion-behaviour relationships (Saj et al., 2012; Verdon et al., 2010). To allow for clinical flexibility, MRI scans were performed at any time during the same week as the experimental task, and their timing was carefully recorded and taken into account in subsequent analyses (for a similar protocol, see Saj et al., 2012; Verdon et al., 2010). Crucially, all the neuropsychological assessments took place in the same period, allowing for reliable lesion-symptom mapping. We selected the brain scans that showed the greatest extent of lesions in each patient. The location and extent of the brain damage were delineated in each patient on the basis of the MRI scans: for each patient, a trained neurologist (blinded to the patients' performances) used MRIcro software (www.mricro.com) to delineate the extent and location of the brain damage on standardized brain templates. All the 3D lesion maps were entered into a MATLAB voxel-based lesion-symptom mapping (VLSM) analysis (Bates et al., 2003), together with the behavioural scores of interest (i.e., ratings that differed significantly between HC and patients). With this quantitative VLSM approach, numerical indices of performance were regressed in a voxel-by-voxel whole-brain SPM analysis, as in previous work (Saj et al., 2012). Correlations with individual performance were explored using a
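The voxel-wise logic of VLSM can be illustrated with a toy sketch (hypothetical data and a plain Welch t statistic; the study itself used the MATLAB implementation of Bates et al., 2003 within an SPM analysis): at each voxel, patients with and without a lesion there are compared on the behavioural score.

```python
import math

def t_statistic(a, b):
    """Welch two-sample t statistic comparing two lists of scores."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def vlsm(lesion_maps, scores):
    """For each voxel, compare scores of lesioned vs. spared patients.

    lesion_maps: one binary list per patient (1 = voxel lesioned).
    scores: one behavioural score per patient (e.g., Surprise ratings
    for fearful stimuli). Returns one t value per voxel, or None where
    a group is too small to test.
    """
    n_vox = len(lesion_maps[0])
    t_map = []
    for v in range(n_vox):
        lesioned = [s for m, s in zip(lesion_maps, scores) if m[v] == 1]
        spared = [s for m, s in zip(lesion_maps, scores) if m[v] == 0]
        if len(lesioned) < 2 or len(spared) < 2:
            t_map.append(None)  # voxel untestable in this sample
        else:
            t_map.append(t_statistic(lesioned, spared))
    return t_map
```

In the real analysis, the resulting statistical map is then thresholded with a correction for the multiple comparisons performed across voxels.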
4.3. Behavioural correlations

As mentioned in the Statistical analysis section, in order to avoid Type-I errors, we only included emotional variables that differed significantly between HC and patients (i.e., ratings on the Surprise scale when patients listened to fearful stimuli). We only found a significant correlation between patients' categorical fluency scores and their erroneous ratings on the Surprise scale for fearful prosody (r = 0.54, p = .04). All other correlations (between ratings on the Surprise scale for fearful prosody and SARA, MOCA, FAB, Act. Fluency, BDI-II, TAS-20 and AES scores) were nonsignificant (p > .05).
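Spearman's rank test, used for these correlations, amounts to a Pearson correlation computed on ranks, with tied observations receiving their average rank. A minimal illustrative sketch (not the analysis code):

```python
def _ranks(values):
    """Assign 1-based ranks, sharing the average rank across ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend the group while successive sorted values are equal.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because it operates on ranks, the coefficient captures any monotonic association and is robust to the non-Gaussian rating distributions discussed below.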
Given the links between neuropsychiatric disorders and the cerebellum (Villanueva, 2012), we assessed depression, apathy and alexithymia in our patient group. Using diagnostic tools (psychiatric or neuropsychological) in so-called normal populations can be seen as methodologically and ethically questionable, which is why psychiatric traits were not assessed in HC. While patients did not display any signs of apathy, some did exhibit minor-to-moderate depression and alexithymia. We also observed executive functioning disorders. These results are not surprising, and are consistent with the well-known cerebellar cognitive and affective syndrome (Bodranghien et al., 2016; Pleger and Timmann, 2018). A scale was recently developed to assess this syndrome (Hoche et al., 2018), but we unfortunately collected our data before its publication. While subsequent analyses indicated that there was no significant correlation between mood variables (depression and alexithymia) and patients' emotional prosody recognition performances, we did find a significant correlation between categorical fluency and emotional recognition performances. We therefore ran a further analysis, in which the AIC and BIC indicated that the model fitted better when the categorical fluency scores were excluded. Thus, the patients' dysexecutive functioning did not explain their judgments in the emotional prosody recognition task. Finally, the absence of ataxia in our patients also fits previous findings, as our patients did not have any anterior lobe (Lobule VI) lesions (Stoodley et al., 2016). As far as the emotional material and task are concerned, we used a validated emotional prosody recognition task (Péron et al., 2010), with controlled semantic content, relevant acoustic parameters, and speakers' gender (see the Material and Methods section for a full description). Finally, we examined participants' emotional ratings using a statistical method that took account of the specific distribution of the data.
By taking this pattern into account, we were able to interpret the estimated effects for what they were, and not as artefacts arising from a misspecification of the actual structure of the data or a violation of the assumptions of the Gaussian distribution.
Fig. 1. VLSM results. Misattributions of surprise to fear stimuli correlated with lesions in right Lobules VIIb, VIII and IX (p < .001, FDR corrected, peak coordinates: x = 19, y = 70, z = −48). The colours indicate t-test values, ranging from black (non-significant) to white (maximum significance). The lesion-symptom maps were projected onto the ch2bet template of MRIcron®. (For interpretation of the references to colour in this figure legend, the reader is referred to the Web version of this article.)
We therefore calculated the Akaike information criterion (AIC; Akaike, 1974) and the Bayesian information criterion (BIC) to see whether the model containing the categorical fluency scores was of better quality than the model that did not contain them. Based on in-sample fit, the AIC estimates how well a model will predict future values. The BIC is another criterion for model selection that evaluates the trade-off between model fit and model complexity. The lower the AIC or BIC value and the higher the R2, the better the fit (Mohammed et al., 2015). Without the categorical fluency scores, the AIC was 49005.3 and the BIC was 49427.3 (R2 = 0.27). With the categorical fluency scores, the AIC was 49008.0 and the BIC was 49436.6 (R2 = 0.27). Accordingly, the model was better when the categorical fluency scores were not included. The patients' dysexecutive functioning therefore did not explain their judgments in the emotional prosody recognition task. As shown in Fig. 1, the VLSM analysis revealed that the significant emotional misattributions on the Surprise scale for fearful stimuli correlated with lesions in right Lobules VIIb, VIIIa,b and IX.
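Under their standard definitions (AIC = 2k − 2 ln L; BIC = k ln n − 2 ln L, for k parameters, n observations, and maximized likelihood L), the model-comparison logic can be sketched as follows (the numbers are illustrative, not the study's):

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * log_likelihood

# Hypothetical comparison: adding one predictor must raise the
# log-likelihood enough to offset the complexity penalty.
base = aic(log_likelihood=-100.0, k=3)
extended = aic(log_likelihood=-99.5, k=4)
assert base < extended  # lower AIC: prefer the simpler model
```

This is the same reasoning as in the text: the small likelihood gain from adding categorical fluency does not compensate for the extra parameter, so both criteria favour the simpler model.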
5.2. Functional role of the cerebellum in human vocal emotion decoding

The present study highlighted two patterns of results that warrant discussion. First, we observed that the patients with cerebellar stroke displayed a deficit in the recognition of emotional prosody stimuli. This impairment was specific to fear, with a significant increase in misattributions (on the Surprise scale) when they listened to fearful stimuli. This result is partially congruent with the only previous group study to have investigated vocal emotion decoding in a similar clinical population (i.e., patients with focal cerebellar lesions due to ischaemic stroke; Adamaszek et al., 2014), as we found a disturbance in the recognition of one type of emotional prosody (fearful prosody). However, this result is not congruent with a case study of a single patient with cerebellar ataxia (Heilman et al., 2014). This is not surprising, given the obvious differences between the two methodologies, notably regarding statistical power. In Adamaszek et al.'s (2014) group study, the effects were observed for all the emotions they tested (i.e., happiness, sadness, anger, and fear), whereas in the present study, recognition was only disturbed for fearful voices. Our finding is partially consistent with previous clinical and neuroimaging observations (in the facial modality) suggesting cerebellar involvement in negative emotions, which led to the conclusion that cerebellar structures are involved in recognizing and forwarding information about negative facial emotions to prefrontal and temporal areas (Ferrucci et al., 2012; Turner et al., 2007). More precisely, the deficit for fear is often reported in the literature as being related to the cerebellum's crucial role in preparing motor and behavioural responses to emotional cues, thus facilitating adaptive behaviour in specific social situations (Adamaszek et al., 2014; Sacchetti et al., 2002).
Our finding is also partially congruent with a functional transcranial Doppler sonographic study, in which increased mean blood flow velocity in the right middle cerebral arteries was observed in HC during negative emotion processing, but not in cerebellar patients (Lupo et al., 2015).
5. Discussion

The aim of the present study was to explore the recognition of emotional prosody in patients with chronic cerebellar stroke. To achieve this, we compared the vocal emotion recognition ratings given by the stroke and HC groups, using a validated emotional prosody recognition paradigm together with lesion-behaviour relationship analyses. Contrary to our hypotheses, patients did not exhibit a deficit in the recognition of all vocal emotions (no deficit was expected for the neutral prosody) compared with the HC group. Instead, we observed a significant difference between the stroke and HC groups according to scales and emotions, specifically driven by erroneous ratings on the Surprise scale when patients listened to fearful stimuli (Table 2). Moreover, the VLSM analysis revealed that these emotional misattributions correlated with lesions in the right Lobules VIIb, VIIIa,b and IX (Fig. 1).

5.1. Control tasks

The patient and HC groups were randomly selected and matched for age, education level and sex, in order to avoid specific biases. All the participants were deemed to have no difficulty discriminating simple sounds that varied in intensity, pitch and timbre, as attested by their performances on the PEGA (Agniel et al., 1992).
Pitch discrimination seems to be altered in patients with global cerebellar degeneration (Parsons et al., 2009) and, as we have already seen, Lobule VIIb is involved in rhythm discrimination (Konoike et al., 2012; O'Reilly et al., 2008). A lesion in this area could therefore disrupt the processing of spectral aspects, which is essential for the recognition of fear prosody. Finally, the present results, while underlining the role of the cerebellum in human emotion, also raise the question of the cerebellum's functional specialization in affective processes (i.e., its differential role in comparison with the cerebral cortex, amygdala and basal ganglia). Current theories all suggest that the cerebellum plays the same computational role for cognitive functions as it does for motor ones. Emotional processes are not always tackled by theoreticians; when they are, the same principle is applied: an identical computational role, but one that is transcribed according to the function (motor, cognitive, affective). This computational role varies according to the authors, but there is general agreement that the cerebellum forms an internal model for the coordination of movement and thought (for reviews, see Caligiore et al., 2017; Koziol et al., 2014). Accordingly, and in order to generate adaptive behaviour, the cerebellum plays a role in error correction by generating the internal models used to make predictions. For example, the perception of speech and singing may be partly facilitated by internal simulation of vocal tract articulation, in order to predict and constrain the selection and acquisition of informative features of auditory stimulation (Callan et al., 2007).
How this proposition translates into emotion processes remains to be conceptualized and operationalized, but we could infer that this function of internal model production is damaged in patients with cerebellar stroke and leads to the overproduction of errors. Moreover, based on the concept of forward models and prediction (Raymond and Medina, 2018; Sokolov et al., 2017), we can assume that there is an absence of updating in cortical associative areas, owing to a deficit in the prediction of changes in perceptual or mental states. However, this proposition does not explain why errors in our study were only made on the Surprise scale for fearful stimuli. That being said, as predictive modelling and error-based learning may be essential for social cognition, more studies are needed to investigate the cerebellum’s contribution to social predictions. Interestingly, it has also been suggested that the cerebellum regulates timing accuracy and temporal dynamics (Schmahmann, 2010). Evidence supporting cerebellar involvement in these processes comes from patients with cerebellar lesions. Typically, these patients display impaired judgements of the duration of auditory stimuli and the velocity of moving visual stimuli, and may exhibit severe distortions in duration-discrimination tasks (Ivry and Diener, 1991; Ivry and Keele, 1989). These results have been confirmed by functional imaging studies in healthy participants, suggesting that the cerebellum plays a critical role in the representation of temporal information (e.g., Spencer et al., 2007). This hypothesis has been investigated in speech, where results point to the cerebellum’s involvement in the temporal organization of the sound structure of verbal utterances. 
Taken together, these hypotheses raise the question of the cerebellum’s involvement in the processing of temporal dynamic information during emotional processes – that is, the extent to which cerebellar functional involvement in temporal processes can account for the emotional disorders observed in clinical populations and for cerebellar activation during emotional tasks.
Second, we observed that these vocal emotion misattributions correlated with lesions in the following structures: Lobule VIIb, Lobules VIIIa and b, and Lobule IX in the right cerebellar hemisphere. These results are partly congruent with our operational hypotheses. As expected, we found that lesions of the posterior lobules in the right side of the cerebellum correlated with the emotional prosody deficit in the patients with cerebellar stroke. This finding partially agrees with Schraa-Tam et al.'s (2012) fMRI results in HC, showing activation of the posterior cerebellum (i.e., Crus II, hemispheric Lobules VI and VIIa, and vermal Lobules VIII and IX) during the processing of emotional facial expressions. Interestingly, Lobule IX, which was activated during the presentation of negative stimuli in Schraa-Tam et al.'s (2012) study, has been linked to panic disorder in a patient with a specific lesion in this area (Schmahmann et al., 2007). Our results regarding right Lobule VIIb are particularly interesting, given our use of emotional material characterized by the modulation of the sound's physical properties (e.g., amplitude, timing, and F0), as previous studies have reported an association between Lobule VII and rhythm discrimination (Konoike et al., 2012; O'Reilly et al., 2008). Contrary to our expectations, we did not find any significant correlations in the vermis or Crus I and II. To explain this discrepancy, it is important to note that our operational hypotheses were based solely on neuroanatomical animal studies and a meta-analysis (Stoodley and Schmahmann, 2009), because to the best of our knowledge, no neuroimaging study had previously explored the cerebellum's functional specialization in vocal emotion decoding.
The meta-analysis had concluded that the lateral posterior cerebellum should be viewed as being predominantly involved in cognitive processes, whereas the vermis may contribute to affective processing itself. While interesting, it provided only a preliminary overview, and had several limitations inherent to the emotional material used in the studies it included, as explained in detail in the Introduction. As a consequence, even though the vermis may contribute to subjective feeling, it may not necessarily be involved in all the affective components of emotions, especially their motor components (e.g., in the recognition of vocal emotions). Over the coming years, more fine-grained functional mapping of the so-called affective cerebellum may well emerge. The apparent right hemispheric specialization of the cerebellum observed in the present study, together with the crossed cerebro-cerebellar connectivity, warrants discussion. As briefly mentioned in the Introduction, models describing the neural network underpinning emotional prosody are still under debate, especially regarding the question of hemispheric specialization. Current models (Schirmer and Kotz, 2006; Wildgruber et al., 2009) assume that emotional prosody recognition is a multistep process mediated by bilateral mechanisms. Different variables, such as temporal resolution (Boemio et al., 2005), linguistic or paralinguistic aspects of speech (Schirmer and Kotz, 2006), attentional focus, and the nature of vocalizations (Frühholz and Grandjean, 2013), would modulate hemispheric specialization at different cognitive stages of the process. In this context, the present results indicating right cerebellar specialization need to be explored further, and more studies manipulating the variables cited above are required to address this question. Moreover, we included five left-handed participants, and we know that manual laterality can influence language processing.
Our statistical analysis indicated that handedness did not have a significant influence on participants' emotional prosody ratings. However, the literature shows that left-handed individuals display a relative lack of lateralization in motor function and language that is associated with a lack of lateralization in the processing of affective responses (Costanzo et al., 2015). In our study, language had only a limited influence on emotional prosody processing, as our stimuli were pseudosentences. Future studies could further reduce this impact by using onomatopoeias pronounced with emotional prosody (Belin et al., 2008). Moreover, it would be interesting to include surprise prosody, as Belin's team did, in order to study the proportion of variance explained by shared acoustic features, given that fearful and surprised vocalizations are known to have common acoustic features (e.g., very short duration and very high median F0).
6. Conclusion

Patients with cerebellar ischaemic stroke displayed a vocal emotion recognition deficit characterized by misattributions on the Surprise scale when they listened to fearful stimuli. Moreover, this deficit was correlated with lesions in right Lobules VIIb, VIIIa and b, and IX. These results point to the functional involvement of the posterior cerebellum in vocal emotion decoding. Further studies are needed to better understand emotional and neurobehavioural changes in these patients, and to explore the specific role of the cerebellum in emotion recognition, as well as its potential hemispheric specialization.

Disclosure

The authors report no conflicts of interest.

Acknowledgements

The study was carried out at the Neurology Department of the University Hospitals of Geneva (Prof. Kleinschmidt). The project was funded by Swiss National Foundation grant no. 105314_182221 (PI: Dr Julie Péron). The funders had no role in data collection, discussion of content, preparation of the manuscript, or decision to publish. We would like to thank the patients and healthy controls for contributing their time to this study. We are also grateful to Elizabeth Wiles-Portier for revising the English style.

CRediT authorship contribution statement

Marine Thomasson: Formal analysis, Investigation, Writing - original draft, Visualization. Arnaud Saj: Conceptualization, Methodology, Software, Writing - review & editing, Visualization, Funding acquisition. Damien Benis: Software, Validation, Writing - review & editing, Visualization. Didier Grandjean: Methodology, Validation, Data curation, Writing - review & editing. Frédéric Assal: Resources, Writing - review & editing, Project administration. Julie Péron: Conceptualization, Methodology, Software, Writing - original draft, Visualization, Supervision, Funding acquisition.

Appendix. Computer interface for the original emotional prosody recognition paradigm

Note. Joie = Happiness, Peur = Fear, Tristesse = Sadness, Colère = Anger, Neutre = Neutral, Surprise = Surprise, Pas d'émotion exprimée = No expressed emotion, Intensité émotionnelle que l'on rencontre assez fréquemment = Emotional intensity we quite often encounter, Intensité émotionnelle exceptionnelle = Exceptional emotional intensity.

Appendix A. Supplementary data

Supplementary data to this article can be found online at https://doi.org/10.1016/j.neuropsychologia.2019.107141.
References

Adamaszek, M., D'Agata, F., Kirkby, K.C., Trenner, M.U., Sehm, B., Steele, C.J., Berneiser, J., Strecker, K., 2014. Impairment of emotional facial expression and prosody discrimination due to ischemic cerebellar lesions. Cerebellum 13, 338–345.
Agniel, A., Joanette, Y., Doyon, B., Duchein, C., 1992. Protocole d'évaluation des gnosies visuelles et auditives. OrthoEditions, Isbergues.
Akaike, H., 1974. A new look at the statistical model identification. IEEE Trans. Autom. Control 19, 716–723.
Anand, B.K., Malhotra, C.L., Singh, B., Dua, S., 1959. Cerebellar projections to limbic system. J. Neurophysiol. 22, 451–457.
Badura, A., Verpeut, J.L., Metzger, J.W., Pereira, T.D., Pisano, T.J., Deverett, B., Bakshinskaya, D.E., Wang, S.S., 2018. Normal cognitive and social development require posterior cerebellar activity. Elife 7.
Bagby, R.M., Parker, J.D., Taylor, G.J., 1994. The twenty-item Toronto Alexithymia Scale–I. Item selection and cross-validation of the factor structure. J. Psychosom. Res. 38, 23–32.
Banse, R., Scherer, K.R., 1996. Acoustic profiles in vocal emotion expression. J. Personal. Soc. Psychol. 70, 614–636.
Bates, E., Wilson, S.M., Saygin, A.P., Dick, F., Sereno, M.I., Knight, R.T., Dronkers, N.F., 2003. Voxel-based lesion-symptom mapping. Nat. Neurosci. 6, 448–450.
Belin, P., Fillion-Bilodeau, S., Gosselin, F., 2008. The Montreal Affective Voices: a validated set of nonverbal affect bursts for research on auditory affective processing. Behav. Res. Methods 40, 531–539.
Berridge, K.C., 2003. Comparing the emotional brains of humans and other animals. In: Davidson, R.J., Scherer, K., Goldsmith, H.H. (Eds.), Handbook of Affective Sciences. Oxford University Press, Oxford, pp. 25–51.
Bodranghien, F., Bastian, A., Casali, C., Hallett, M., Louis, E.D., Manto, M., Marien, P., Nowak, D.A., Schmahmann, J.D., Serrao, M., Steiner, K.M., Strupp, M., Tilikete, C., Timmann, D., van Dun, K., 2016. Consensus paper: revisiting the symptoms and signs of cerebellar syndrome. Cerebellum 15, 369–391.
Boemio, A., Fromm, S., Braun, A., Poeppel, D., 2005. Hierarchical and asymmetric temporal sensitivity in human auditory cortices. Nat. Neurosci. 8, 389–395.
Bostan, A.C., Dum, R.P., Strick, P.L., 2010. The basal ganglia communicate with the cerebellum. Proc. Natl. Acad. Sci. U. S. A. 107, 8452–8456.
Bostan, A.C., Dum, R.P., Strick, P.L., 2013. Cerebellar networks with the cerebral cortex and basal ganglia. Trends Cogn. Sci. 17 (5), 241–254. https://doi.org/10.1016/j.tics.2013.03.003.
Bostan, A.C., Strick, P.L., 2010. The cerebellum and basal ganglia are interconnected. Neuropsychol. Rev. 20, 261–270.
Caligiore, D., Pezzulo, G., Baldassarre, G., Bostan, A.C., Strick, P.L., Doya, K., Helmich, R.C., Dirkx, M., Houk, J., Jorntell, H., Lago-Rodriguez, A., Galea, J.M., Miall, R.C., Popa, T., Kishore, A., Verschure, P.F., Zucca, R., Herreros, I., 2017. Consensus paper: towards a systems-level view of cerebellar function: the interplay between cerebellum, basal ganglia, and cortex. Cerebellum 16, 203–229.
Callan, D.E., Kawato, M., Parsons, L., Turner, R., 2007. Speech and song: the role of the cerebellum. Cerebellum 6, 321–327.
Cardebat, D., Doyon, B., Puel, M., Goulet, P., Joanette, Y., 1990. Formal and semantic lexical evocation in normal subjects. Performance and dynamics of production as a function of sex, age and educational level. Acta Neurol. Belg. 90, 207–217.
Costanzo, E.Y., Villarreal, M., Drucaroff, L.J., Ortiz-Villafane, M., Castro, M.N., Goldschmidt, M., Wainsztein, A.E., Ladron-de-Guevara, M.S., Romero, C., Brusco, L.I., Camprodon, J.A., Nemeroff, C., Guinjoan, S.M., 2015. Hemispheric specialization in affective responses, cerebral dominance for language, and handedness: lateralization of emotion, language, and dexterity. Behav. Brain Res. 288, 11–19.
Delplanque, S., N'Diaye, K., Scherer, K.R., Grandjean, D., 2007. Spatial frequencies or emotional effects? A systematic measure of spatial frequencies for IAPS pictures by a discrete wavelet analysis. J. Neurosci. Methods 165, 144–150.
Dempesy, C.W., Tootle, D.M., Fontana, C.J., Fitzjarrell, A.T., Garey, R.E., Heath, R.G., 1983. Stimulation of the paleocerebellar cortex of the cat: increased rate of synthesis and release of catecholamines at limbic sites. Biol. Psychiatry 18, 127–132.
Diedrichsen, J., Balsters, J.H., Flavell, J., Cussans, E., Ramnani, N., 2009. A probabilistic MR atlas of the human cerebellum. Neuroimage 46, 39–46.
Dubois, B., Slachevsky, A., Litvan, I., Pillon, B., 2000. The FAB: a frontal assessment battery at bedside. Neurology 55, 1621–1626.
Ekman, P., 2003. Darwin, deception, and facial expression. Ann. N. Y. Acad. Sci. 1000, 205–221.
Ferrucci, R., Giannicola, G., Rosa, M., Fumagalli, M., Boggio, P.S., Hallett, M., Zago, S., Priori, A., 2012. Cerebellum and processing of negative facial emotions: cerebellar transcranial DC stimulation specifically enhances the emotional recognition of facial anger and sadness. Cognit. Emot. 26, 786–799.
Frühholz, S., Ceravolo, L., Grandjean, D., 2012. Specific brain networks during explicit and implicit decoding of emotional prosody. Cerebr. Cortex 22, 1107–1117.
Frühholz, S., Grandjean, D., 2012. Towards a fronto-temporal neural network for the decoding of angry vocal expressions. Neuroimage 62, 1658–1666.
Frühholz, S., Grandjean, D., 2013. Processing of emotional vocalizations in bilateral inferior frontal cortex. Neurosci. Biobehav. Rev. 37, 2847–2855.
Fusar-Poli, P., Placentino, A., Carletti, F., Landi, P., Allen, P., Surguladze, S., Benedetti, F., Abbamonte, M., Gasparotti, R., Barale, F., Perez, J., McGuire, P., Politi, P., 2009. Functional atlas of emotional faces processing: a voxel-based meta-analysis of 105 functional magnetic resonance imaging studies. J. Psychiatry Neurosci. 34, 418–432.
Grandjean, D., Sander, D., Pourtois, G., Schwartz, S., Seghier, M.L., Scherer, K.R., Vuilleumier, P., 2005. The voices of wrath: brain responses to angry prosody in meaningless speech. Nat. Neurosci. 8, 145–146.
Heilman, K.M., Leon, S.A., Burtis, D.B., Ashizawa, T., Subramony, S.H., 2014. Affective communication deficits associated with cerebellar degeneration. Neurocase 20, 18–26.
Hillis, A.E., Wityk, R.J., Barker, P.B., Beauchamp, N.J., Gailloud, P., Murphy, K., Cooper, O., Metter, E.J., 2002. Subcortical aphasia and neglect in acute stroke: the role of cortical hypoperfusion. Brain 125, 1094–1104.
Hoche, F., Guell, X., Vangel, M.G., Sherman, J.C., Schmahmann, J.D., 2018. The cerebellar cognitive affective/Schmahmann syndrome scale. Brain 141, 248–270.
Hull, C., 2018. The cerebellum influences vocal timing. Elife 7.
Ivry, R.B., Diener, H.C., 1991. Impaired velocity perception in patients with lesions of the cerebellum. J. Cogn. Neurosci. 3, 355–366.
Ivry, R.B., Keele, S.W., 1989. Timing functions of the cerebellum. J. Cogn. Neurosci. 1, 136–152.
Juslin, P.N., Scherer, K., 2005. Vocal expression of affect. In: Harrigan, J., Rosenthal, R., Scherer, K. (Eds.), The New Handbook of Methods in Nonverbal Behavior Research. Oxford University Press, Oxford.
Konoike, N., Kotozaki, Y., Miyachi, S., Miyauchi, C.M., Yomogida, Y., Akimoto, Y., Kuraoka, K., Sugiura, M., Kawashima, R., Nakamura, K., 2012. Rhythm information represented in the fronto-parieto-cerebellar motor system. Neuroimage 63, 328–338.
Kotz, S.A., Meyer, M., Alter, K., Besson, M., von Cramon, D.Y., Friederici, A.D., 2003. On the lateralization of emotional prosody: an event-related functional MR investigation. Brain Lang. 86, 366–376.
Koziol, L.F., Budding, D., Andreasen, N., D'Arrigo, S., Bulgheroni, S., Imamizu, H., Ito, M., Manto, M., Marvel, C., Parker, K., Pezzulo, G., Ramnani, N., Riva, D., Schmahmann, J., Vandervert, L., Yamazaki, T., 2014. Consensus paper: the cerebellum's role in movement and cognition. Cerebellum 13, 151–177.
Lupo, M., Troisi, E., Chiricozzi, F.R., Clausi, S., Molinari, M., Leggio, M., 2015. Inability to process negative emotions in cerebellar damage: a functional transcranial Doppler sonographic study. Cerebellum 14, 663–669.
Manto, M., 2010. Physiology of the cerebellum. In: Manto, M. (Ed.), Cerebellar Disorders: A Practical Approach to Diagnosis and Management. Cambridge University Press, Cambridge.
Marin, R.S., Biedrzycki, R.C., Firinciogullari, S., 1991. Reliability and validity of the apathy evaluation scale. Psychiatry Res. 38, 143–162.
Mattis, S., 1988. Dementia Rating Scale. Psychological Assessment Resources Inc., Odessa, FL.
Mohammed, A.A., Naugler, C., Far, B.H., 2015. Emerging business intelligence framework for a clinical laboratory through big data analytics. In: Emerging Trends in Computational Biology, Bioinformatics, and Systems Biology: Algorithms and Software Tools. Elsevier/Morgan Kaufmann, New York, pp. 577–602.
Nasreddine, Z.S., Patel, B.B., 2016. Validation of Montreal Cognitive Assessment, MoCA, alternate French versions. Can. J. Neurol. Sci. 43, 665–671.
O'Reilly, J.X., Mesulam, M.M., Nobre, A.C., 2008. The cerebellum predicts the timing of perceptual events. J. Neurosci. 28, 2252–2260.
Oldfield, R.C., 1971. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia 9, 97–113.
Parsons, L.M., Petacchi, A., Schmahmann, J.D., Bower, J.M., 2009. Pitch discrimination in cerebellar patients: evidence for a sensory deficit. Brain Res. 1303, 84–96.
Parvizi, J., Joseph, J., Press, D.Z., Schmahmann, J.D., 2007. Pathological laughter and crying in patients with multiple system atrophy-cerebellar type. Mov. Disord. 22, 798–803.
Péron, J., El Tamer, S., Grandjean, D., Leray, E., Travers, D., Drapier, D., Vérin, M., Millet, B., 2011. Major depressive disorder skews the recognition of emotional prosody. Prog. Neuro-Psychopharmacol. Biol. Psychiatry 35, 987–996.
Péron, J., Frühholz, S., Vérin, M., Grandjean, D., 2013. Subthalamic nucleus: a key structure for emotional component synchronization in humans. Neurosci. Biobehav. Rev. 37, 358–373.
Péron, J., Grandjean, D., Drapier, S., Vérin, M., 2014. Effect of dopamine therapy on nonverbal affect burst recognition in Parkinson's disease. PLoS One 9, e90092.
Péron, J., Grandjean, D., Le Jeune, F., Sauleau, P., Haegelen, C., Drapier, D., Rouaud, T., Drapier, S., Vérin, M., 2010. Recognition of emotional prosody is altered after subthalamic nucleus deep brain stimulation in Parkinson's disease. Neuropsychologia 48, 1053–1062.
Pidoux, L., Le Blanc, P., Levenes, C., Leblois, A., 2018. A subcortical circuit linking the cerebellum to the basal ganglia engaged in vocal learning. Elife 7.
Pleger, B., Timmann, D., 2018. The role of the human cerebellum in linguistic prediction, word generation and verbal working memory: evidence from brain imaging, non-invasive cerebellar stimulation and lesion studies. Neuropsychologia 115, 204–210.
Raymond, J.L., Medina, J.F., 2018. Computational principles of supervised learning in the cerebellum. Annu. Rev. Neurosci. 41, 233–253.
Sacchetti, B., Baldi, E., Lorenzini, C.A., Bucherelli, C., 2002. Cerebellar role in fear-conditioning consolidation. Proc. Natl. Acad. Sci. U. S. A. 99, 8406–8411.
Saj, A., Verdon, V., Vocat, R., Vuilleumier, P., 2012. 'The anatomy underlying acute versus chronic spatial neglect' also depends on clinical tests. Brain 135, e207; author reply e208.
Sander, D., Grandjean, D., Pourtois, G., Schwartz, S., Seghier, M.L., Scherer, K.R., Vuilleumier, P., 2005. Emotion and attention interactions in social cognition: brain regions involved in processing anger prosody. Neuroimage 28, 848–858.
Scherer, K.R., Ekman, P., 2008. Methodological issues in studying nonverbal behavior. In: Harrigan, J., Rosenthal, R., Scherer, K. (Eds.), The New Handbook of Methods in Nonverbal Behavior Research. Oxford University Press, Oxford, pp. 471–512.
Scherer, K.R., Ellgring, H., 2007. Multimodal expression of emotion: affect programs or componential appraisal patterns? Emotion 7, 158–171.
Schirmer, A., Kotz, S.A., 2006. Beyond the right hemisphere: brain mechanisms mediating vocal emotional processing. Trends Cogn. Sci. 10, 24–30.
Schmahmann, J.D., 1998. Dysmetria of thought: clinical consequences of cerebellar dysfunction on cognition and affect. Trends Cogn. Sci. 2, 362–371.
Schmahmann, J.D., 2010. The role of the cerebellum in cognition and emotion: personal reflections since 1982 on the dysmetria of thought hypothesis, and its historical evolution from theory to therapy. Neuropsychol. Rev. 20, 236–260.
Schmahmann, J.D., Doyon, J., McDonald, D., Holmes, C., Lavoie, K., Hurwitz, A.S., Kabani, N., Toga, A., Evans, A., Petrides, M., 1999. Three-dimensional MRI atlas of the human cerebellum in proportional stereotaxic space. Neuroimage 10, 233–260.
Schmahmann, J.D., Pandya, D.N., 1989. Anatomical investigation of projections to the basis pontis from posterior parietal association cortices in rhesus monkey. J. Comp. Neurol. 289, 53–73.
Schmahmann, J.D., Pandya, D.N., 1990. Anatomical investigation of projections from thalamus to posterior parietal cortex in the rhesus monkey: a WGA-HRP and fluorescent tracer study. J. Comp. Neurol. 295, 299–326.
Schmahmann, J.D., Pandya, D.N., 1991. Projections to the basis pontis from the superior temporal sulcus and superior temporal region in the rhesus monkey. J. Comp. Neurol. 308, 224–248.
Schmahmann, J.D., Pandya, D.N., 1992. Course of the fiber pathways to pons from parasensory association areas in the rhesus monkey. J. Comp. Neurol. 326, 159–179.
Schmahmann, J.D., Pandya, D.N., 1993. Prelunate, occipitotemporal, and parahippocampal projections to the basis pontis in rhesus monkey. J. Comp. Neurol. 337, 94–112.
Schmahmann, J.D., Pandya, D.N., 1995. Prefrontal cortex projections to the basilar pons in rhesus monkey: implications for the cerebellar contribution to higher function. Neurosci. Lett. 199, 175–178.
Schmahmann, J.D., Pandya, D.N., 1997. Anatomic organization of the basilar pontine projections from prefrontal cortices in rhesus monkey. J. Neurosci. 17, 438–458.
Schmahmann, J.D., Sherman, J.C., 1998. The cerebellar cognitive affective syndrome. Brain 121 (Pt 4), 561–579.
Schmahmann, J.D., Weilburg, J.B., Sherman, J.C., 2007. The neuropsychiatry of the cerebellum - insights from the clinic. Cerebellum 6, 254–267.
Schmitz-Hubsch, T., du Montcel, S.T., Baliko, L., Berciano, J., Boesch, S., Depondt, C., Giunti, P., Globas, C., Infante, J., Kang, J.S., Kremer, B., Mariotti, C., Melegh, B.,
Pandolfo, M., Rakowicz, M., Ribai, P., Rola, R., Schols, L., Szymanski, S., van de Warrenburg, B.P., Durr, A., Klockgether, T., Fancellu, R., 2006. Scale for the assessment and rating of ataxia: development of a new clinical scale. Neurology 66, 1717–1720.
Schraa-Tam, C.K., Rietdijk, W.J., Verbeke, W.J., Dietvorst, R.C., van den Berg, W.E., Bagozzi, R.P., De Zeeuw, C.I., 2012. fMRI activities in the emotional cerebellum: a preference for negative stimuli and goal-directed behavior. Cerebellum 11, 233–245.
Sokolov, A.A., Miall, R.C., Ivry, R.B., 2017. The cerebellum: adaptive prediction for movement and cognition. Trends Cogn. Sci. 21, 313–332.
Spencer, R.M., Verstynen, T., Brett, M., Ivry, R., 2007. Cerebellar activation during discrete and not continuous timed movements: an fMRI study. Neuroimage 36, 378–387.
Steer, R.A., Brown, G.K., Beck, A.T., Sanderson, W.C., 2001. Mean Beck Depression Inventory-II scores by severity of major depressive episode. Psychol. Rep. 88, 1075–1076.
Stirnimann, N., N'Diaye, K., Jeune, F.L., Houvenaghel, J.F., Robert, G., Drapier, S., Drapier, D., Grandjean, D., Verin, M., Peron, J., 2018. Hemispheric specialization of the basal ganglia during vocal emotion decoding: evidence from asymmetric Parkinson's disease and (18)FDG PET. Neuropsychologia 119, 1–11.
Stoodley, C.J., MacMore, J.P., Makris, N., Sherman, J.C., Schmahmann, J.D., 2016. Location of lesion determines motor vs. cognitive consequences in patients with cerebellar stroke. Neuroimage Clin. 12, 765–775.
Stoodley, C.J., Schmahmann, J.D., 2009. Functional topography in the human cerebellum: a meta-analysis of neuroimaging studies. Neuroimage 44, 489–501.
Stoodley, C.J., Schmahmann, J.D., 2010. Evidence for topographic organization in the cerebellum of motor control versus cognitive and affective processing. Cortex 46, 831–844.
Turner, B.M., Paradiso, S., Marvel, C.L., Pierson, R., Boles Ponto, L.L., Hichwa, R.D., Robinson, R.G., 2007. The cerebellum and emotional experience. Neuropsychologia 45, 1331–1341.
Verdon, V., Schwartz, S., Lovblad, K.O., Hauert, C.A., Vuilleumier, P., 2010. Neuroanatomy of hemispatial neglect and its functional components: a study using voxel-based lesion-symptom mapping. Brain 133, 880–894.
Vilensky, J.A., van Hoesen, G.W., 1981. Corticopontine projections from the cingulate cortex in the rhesus monkey. Brain Res. 205, 391–395.
Villanueva, R., 2012. The cerebellum and neuropsychiatric disorders. Psychiatry Res. 198, 527–532.
Voogd, J., Glickstein, M., 1998. The anatomy of the cerebellum. Trends Neurosci. 21, 370–375.
Wager, T., Feldman Barrett, L., Bliss-Moreau, E., Lindquist, K., Duncan, S., Kober, E., Joseph, J., Davidson, M., Mize, J., 2008. The neuroimaging of emotion. In: Lewis, M., Haviland-Jones, J., Feldman Barrett, L. (Eds.), Handbook of Emotions, 3rd ed. The Guilford Press, New York, London, pp. 249–271.
Wildgruber, D., Ethofer, T., Grandjean, D., Kreifelts, B., 2009. A cerebral network model of speech prosody comprehension. Int. J. Speech Lang. Pathol. 11, 277–281.
Woods, S.P., Scott, J.C., Sires, D.A., Grant, I., Heaton, R.K., Troster, A.I., 2005. Action (verb) fluency: test-retest reliability, normative standards, and construct validity. J. Int. Neuropsychol. Soc. 11, 408–415.
Ziegler, W., Ackermann, H., 2017. Subcortical contributions to motor speech: phylogenetic, developmental, clinical. Trends Neurosci. 40, 458–468.