Restored speech comprehension linked to activity in left inferior prefrontal and right temporal cortices in postlingual deafness

www.elsevier.com/locate/ynimg NeuroImage 31 (2006) 842 – 852

Malene Vejby Mortensen,a,b,c,* Frank Mirz,b and Albert Gjedde a,c

a PET Center, Aarhus University Hospital, 44 Norrebrogade, Aarhus 8000, Denmark
b ENT Department, Aarhus University Hospital, Aarhus 8000, Denmark
c Center of Functionally Integrative Neuroscience, Aarhus University, Aarhus, Denmark

Received 21 July 2005; revised 1 November 2005; accepted 16 December 2005 Available online 3 February 2006

The left inferior prefrontal cortex (LIPC) is involved in speech comprehension by people who hear normally. In contrast, functional brain mapping has not revealed incremental activity in this region when users of cochlear implants comprehend speech without silent repetition. Functional brain maps identify significant changes of activity by comparing an active brain state with a presumed baseline condition. It is possible that cochlear implant users recruited alternative neuronal resources to the task in previous studies, but, in principle, it is also possible that an aberrant baseline condition masked the functional increase. To distinguish between the two possibilities, we tested the hypothesis that activity in the LIPC characterizes high speech comprehension in postlingually deaf CI users. We measured cerebral blood flow changes with positron emission tomography (PET) in CI users who listened passively to a range of speech and non-speech stimuli. The pattern of activation varied with the stimulus in users with high speech comprehension, unlike users with low speech comprehension. The high-comprehension group increased the activity in prefrontal and temporal regions of the cerebral cortex and in the right cerebellum. In these subjects, single words and speech raised activity in the LIPC, as well as in left and right temporal regions, both anterior and posterior, known to be activated in speech recognition and complex phoneme analysis in normal hearing. In subjects with low speech comprehension, sites of increased activity were observed only in the temporal lobes. We conclude that increased activity in areas of the LIPC and right temporal lobe is involved in speech comprehension after cochlear implantation.
© 2005 Elsevier Inc. All rights reserved.

Keywords: Cochlear implants; Hearing; Left inferior prefrontal cortex; PET; Speech comprehension

* Corresponding author. Fax: +45 8949 4400. E-mail address: [email protected] (M.V. Mortensen). Available online on ScienceDirect (www.sciencedirect.com).
1053-8119/$ - see front matter © 2005 Elsevier Inc. All rights reserved. doi:10.1016/j.neuroimage.2005.12.020

Introduction

Language processing in normally hearing individuals is associated with extensive frontal activation in the left hemisphere, including the left inferior prefrontal cortex (LIPC) (Binder et al.,

1997). Anatomically, the LIPC is a large and heterogeneous region (Roland, 1993), which largely coincides with the anterior (BA 45 and 47) and posterior (BA 44 and 45) parts of the left inferior frontal gyrus, the posterior part often referred to as Broca's area (Poldrack et al., 1999; Amunts et al., 1999). Tradition has assigned a mainly expressive language function to this area, but functional specialization of the LIPC has undergone recent extensions beyond its conventional role in speech generation. Several studies show a relationship between the perception of language and left frontal activity, both when the presentation of stimuli is auditory (Demonet et al., 1992; Binder et al., 1997; Davis and Johnsrude, 2003) and when it is visual (Price et al., 1996; Friederici et al., 2000; Seghier et al., 2004). Although the precise location of activity within the frontal cortex varies with the demands of the task, blood flow maps show that the left inferior frontal cortex is incrementally active during phonological processing of speech sounds (Brodmann's Areas [BA] 44 and 45), semantic generation and decision (BA 44, 45, and 47), lexical and reading tasks (BA 44), and the simple viewing of words (BA 44 and 47; for review, see Poldrack et al. (1999)). Furthermore, in functional mapping of passive story-listening, left temporal activations extend upwards to include the inferior part of the prefrontal cortex (Papathanassiou et al., 2000).

Unlike the involvement of LIPC in normal hearing, the role of this part of the brain in speech comprehension is not established in patients with cochlear implants. The multi-channel cochlear implant (CI) prosthesis enables profoundly hearing-impaired subjects to receive and process sounds that stimulate the cochlea by means of electrodes implanted in the inner ear. Patients with a cochlear implant generally acquire a degree of speech comprehension.
However, studies of passive listening with a CI do not confirm incremental activity in LIPC during speech comprehension (Wong et al., 1999; Giraud et al., 2000). Thus, no previous publication has shown activation of this area when CI users engage in passive listening without silent repetition. The reason for the discrepancy between normally hearing individuals and patients with CI with respect to functional activity during speech comprehension is unknown. Two possible explanations exist:


either CI users develop the ability to recruit brain regions other than the LIPC during speech comprehension, or the functional neuroanatomical correlates reported in previous studies of CI users were determined with contrasts that were not optimal for detecting this activity. One finding in favor of the latter alternative is that these studies also failed to find activity in Broca's area (BA 44 and 45) in normally hearing individuals (Wong et al., 1999; Giraud et al., 2000). Additional explanations could be the inclusion of CI users with a mixture of pre- and postlingual deafness (Herzog et al., 1991; Naito et al., 1997) or with greatly differing performance (Naito et al., 1995, 2000; Parving et al., 1995; Wong et al., 1999), as left inferior frontal activity in normally hearing subjects has been demonstrated to correlate with the intelligibility of spoken language (Davis and Johnsrude, 2003). The present study tested CI users with no apparent reason for the differential success of the implantation. In keeping with the known role of the LIPC in speech comprehension in normal hearing, we tested the hypothesis that the LIPC is incrementally active in cochlear implant users engaged in successful speech comprehension, while no such activity is present in cochlear implant users with poor speech comprehension.


Materials and methods

Subjects

We divided 12 postlingually deaf adults into two groups according to the success of the cochlear implantation. Restricted availability of patients with unexplained poor speech comprehension allowed us to scan only a limited number of subjects. All subjects had multi-channel cochlear implants (Table 1). The high-comprehension group ("High" group) included seven patients, four men and three women, with excellent open-set speech perception (96–100% score in a standard open-set test without lip-reading) (Ludvigsen, 1974) and a mean age of 48.4 years (range 35–64 years). Five subjects had the implant on the left side, while two had it on the right side. The low-comprehension group ("Low" group) included five patients, two women and three men, with poor speech perception (less than 60% open-set score) and a mean age of 48.8 years (range 32–61 years). One subject had the implant on the left side, and four had it on the right side. One subject had recovered from meningitis without neurological sequelae other than hearing loss. No significant differences of duration of pre-implant deafness or implant use existed between the groups (P > 0.4, Mann–Whitney test). Speech comprehension was assessed by the standard Helen sentence test, in which subjects are presented with 25 simple, different sentences in question form in their native language and scored on the number of sentences to which they can respond (Teig et al., 1992; Kei et al., 2000). With the exception of one patient in the Low group, subjects suffered mild tinnitus when exposed to silence but otherwise were not affected by this condition. The clinical details of the subjects are listed in Table 1. No technical, peripheral or external reasons for poor CI use were apparent. The protocol, to which all subjects gave the required informed written consent according to the latest Declaration of Helsinki, was approved by the official Aarhus County Research Ethics Committee.

Stimuli

To test for functional brain activity at different levels of stimulus complexity, each subject underwent positron emission tomography (PET) in five conditions of speech, speech-like and non-speech stimulation: (1) silent baseline with subjects asked to attend either to their tinnitus (10 of 11 had mild tinnitus in this situation) or to the low background hum of the tomograph; (2) white noise; (3) multitalker babble from multiple simultaneous speakers, with a complexity close to that of running speech and perceived by the listeners as speech-like but devoid of meaning (Icra, 1997); (4)

Table 1
Cochlear implant subjects: clinical profile

Group  Gender  Age/Implant  Age/PET  Side of implant  Etiology of deafness  Profound deafness (years)  Implant use (years)  Implant   Functional electrodes  Open-set score (%)a  Lip-reading score (%)

High comprehension
1      M       51           55       R                PSNHLb                1.5                        3.5                  Nucleusc  20                     100                  44
2      F       46           49       R                COd                   5                          2.5                  Nucleus   19                     100                  74
3      M       49           50       L                PSNHL                 1                          1.5                  Nucleus   20                     96                   40
4      F       33           38       L                PSNHL                 2                          5                    Nucleus   20                     100                  64
5      F       60           64       L                PSNHL                 5                          3.5                  Nucleus   20                     96                   92
6      M       33           35       L                PSNHL                 3                          1.5                  Nucleus   19                     100                  28
7      M       38           48       L                Meningitis            4                          10                   Nucleus   20                     100                  68

Low comprehension
8      M       59           61       R                PSNHL                 2                          1.5                  Nucleus   17                     60                   20
9      F       35           40       L                PSNHL                 2                          5                    Clarione  8f                     56                   32
10     F       53           56       R                PSNHL                 9                          3                    Nucleus   20                     56                   60
11     M       29           32       R                PSNHL                 12                         3                    Nucleus   20                     12                   68
12     M       54           55       R                PSNHL                 1                          0.6                  Nucleus   20                     60                   52

a In a standard speech recognition test (Ludvigsen, 1974).
b Progressive Sensory Neural Hearing Loss.
c Cochlear Ltd., Australia.
d Cochlear otosclerosis.
e Advanced Bionics Corp., California.
f Out of 8 possible.
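The between-group comparison of deafness duration and implant use reported in the Subjects section (P > 0.4, Mann–Whitney test) can be reproduced from the durations in Table 1. The sketch below is a plain-Python illustration using the normal approximation without tie correction, so the P values are approximate rather than the exact values the authors may have computed:

```python
import math

def mann_whitney(a, b):
    """U statistic and two-sided P value (normal approximation, no tie correction)."""
    # U counts pairs (x, y) with x > y, scoring ties as one half
    u = sum(1.0 if x > y else 0.5 if x == y else 0.0 for x in a for y in b)
    n1, n2 = len(a), len(b)
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mu) / sigma
    # two-sided tail probability of the standard normal
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return u, p

# Durations in years, read from Table 1
deafness_high = [1.5, 5, 1, 2, 5, 3, 4]
deafness_low = [2, 2, 9, 12, 1]
use_high = [3.5, 2.5, 1.5, 5, 3.5, 1.5, 10]
use_low = [1.5, 5, 3, 3, 0.6]

u1, p1 = mann_whitney(deafness_high, deafness_low)
u2, p2 = mann_whitney(use_high, use_low)
```

Under this approximation both comparisons come out non-significant, consistent with the P > 0.4 reported in the text.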


monosyllabic and unrelated words presented with an interval of 2 s; and (5) "running" speech, narrating the history of a familiar geographical locality at the rate of 142 words per minute. The stimuli were recorded on compact disc and delivered directly from a computer to the external input port of the implant speech processor. Words and running speech were generated in Danish by a standard female voice (Elberling et al., 1989), and all stimuli were presented at the most comfortable level (MCL). To define the MCL, subjects were exposed to the different stimuli once before the tomography. In the scanner, prior to injection, they had no information about the nature of the next stimulus but were instructed to listen attentively (including to the tinnitus or the hum of the tomograph at baseline).

PET data acquisition

We measured raised or reduced cerebral activity as the change of the brain uptake of oxygen-15-labeled water, which matches the distribution of cerebral blood flow, using an ECAT EXACT HR47 tomograph (Siemens/CTI). Emission scans were initiated at 60,000 true counts per second after repeated i.v. bolus injections of doses of tracer with an activity of 500 MBq (13.5 mCi). Activity decayed for 12 min before each new tomography session. The tomography took place in a darkened room with subjects' eyes closed. The silent baseline and running speech conditions were duplicated, generating a total of seven tomography sessions. In one subject of the Low group, white noise stimulation was not obtained. Auditory stimulation commenced 10 s before the injection of the tracer and lasted throughout the frame (90 s). Each frame consisted of 47 slices of 3.1 mm. After correction for scatter and measured attenuation, each PET frame was reconstructed with filtered backprojection and smoothed with a post-reconstruction 10 mm Gaussian filter, resulting in a resolution of 11 mm FWHM. At the time of planning this study, MRI was not considered safe for cochlear implants. Thus, all PET images were realigned and co-registered to the average of 85 individual MR scans, non-linearly warped to the Talairach atlas (average subject age 30 years, range 20–60 years), using Automatic Image Registration (AIR) software (Woods et al., 1992) to correct for head movements between scans and to anatomically locate sites of increased rCBF. To compensate for individual anatomical differences and variation in gyral anatomy, images were further smoothed with a Gaussian filter, resulting in a final 14 mm FWHM isotropic resolution. The PET-MRI data sets were resampled in the standardized stereotaxic coordinate system of Talairach and Tournoux (Talairach and Tournoux, 1988) (voxel sizes 1.34 mm by 1.72 mm by 1.5 mm in the x, y and z planes, respectively).

Statistical analyses of blood flow changes

Using the bolus H2¹⁵O methodology (Raichle et al., 1983) without arterial blood sampling, the relative distribution of rCBF was measured in baseline and activated conditions. The significance of blood flow changes was determined in two complementary ways. First, the presence of statistically significant focal changes in rCBF was tested by calculating the t statistic using a pooled standard deviation after voxel-by-voxel subtraction of PET volumes (Worsley et al., 1992). By searching the entire cerebral cortex, t values equal to or exceeding 4.3 were considered representative of significant focal changes in rCBF (P < 0.05, corrected for multiple comparisons according to Gaussian random field theory). Second, to determine the changes of blood flow in the Low group at sites generated by group analysis in the High group, region of interest (ROI) analyses were performed in the Low group by a simple t test of the magnitude of change from zero. Activity was measured in a sphere (radius 7 mm) around the x, y and z stereotaxic coordinates corresponding to the local maxima of the High group and normalized by subtracting the mean activity in the scan.

Behavioral data

One subject in the High group (subject 3 in Table 1) experienced minor difficulty perceiving the running speech, of which no more than 75% was recognized. No other subject in the High group had this difficulty. Subjects described babble as an unknown foreign language (several subjects suggested Russian). White noise was heard as noise in both groups. Subjects in the Low group could not unequivocally distinguish babble from running speech, but they did discriminate a few words when they listened to monosyllabic words and running speech.
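The ROI procedure described above (mean rCBF change in a 7-mm sphere around a peak coordinate, tested against zero) can be sketched as follows. This is a minimal illustration, not the study's pipeline: the volume shape, the example peak location, and the input data are assumed for demonstration, and the sphere is defined in the volume's own millimetre frame rather than true Talairach space.

```python
import numpy as np

def sphere_roi_mean(volume, center_mm, voxel_size_mm, radius_mm=7.0):
    """Mean of `volume` within a sphere of radius_mm around center_mm.

    `volume` is indexed (z, y, x); coordinates are millimetres measured
    from the volume origin (an illustrative frame, not Talairach space).
    """
    zz, yy, xx = np.indices(volume.shape)
    coords = np.stack([xx, yy, zz], axis=-1) * np.asarray(voxel_size_mm, dtype=float)
    dist = np.linalg.norm(coords - np.asarray(center_mm, dtype=float), axis=-1)
    return volume[dist <= radius_mm].mean()

def one_sample_t(values):
    """t statistic testing whether the mean of `values` differs from zero."""
    v = np.asarray(values, dtype=float)
    return v.mean() / (v.std(ddof=1) / np.sqrt(v.size))

# Illustrative use: one normalized rCBF-change volume per subject (random data)
rng = np.random.default_rng(0)
volumes = [rng.normal(size=(47, 128, 128)) for _ in range(5)]
voxel_mm = (1.34, 1.72, 1.5)  # x, y, z voxel sizes reported in the study
# Hypothetical peak location (volume frame, mm) standing in for a group maximum
changes = [sphere_roi_mean(v, (60.0, 110.0, 35.0), voxel_mm) for v in volumes]
t = one_sample_t(changes)
```

In the study's design, each scan would first be normalized by subtracting its mean activity before the ROI mean is taken, and the resulting t statistic would be referred to a t distribution with n − 1 degrees of freedom.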

Results

Group analysis of raised activity

Relative to the silent baseline, subjects in the High group reacted to white noise by raising activity bilaterally in primary auditory cortex (PAC). Listening to babble raised the activity more laterally in the superior temporal gyri (Fig. 1). Meaningful words raised the activity in posterior parts of the superior temporal gyrus (STG) bilaterally, as well as in the right anterior STG, right posterior middle temporal gyrus and LIPC. Compared to babble or white noise, running speech raised the activity in LIPC as well as in the posterior STG bilaterally, with extension antero-ventrally to the anterior part of the temporal lobe, including the left temporal pole (BA 38) (Figs. 2 and 3). Furthermore, the right cerebellum was activated. In speech minus babble, the most significant activation was in the anterior part of the left STG (Fig. 2). Relative to baseline, running speech raised activity bilaterally in the middle part of the STG, right cerebellum, left superior frontal gyrus (BA 6) and precentral gyrus, but no significant change was observed in the LIPC in this contrast. However, region of interest (ROI) analysis at the coordinates 45, 20, 7 (x, y and z mm) revealed the significant change shown in Fig. 4. Thus, LIPC activity was present in all relevant contrasts (speech and words vs. baseline, speech vs. white noise and speech vs. babble).

The Low group did not demonstrate significant increase of activity when exposed to white noise, but white noise data were available in only four subjects. In the Low group, babble raised the activity bilaterally in the STG (Fig. 1) lateral to the PAC and in the anterior part of the right STG. Words elicited further significant elevation of the activity at the same coordinates, as did running speech in contrast to baseline and to white noise. Running speech compared to babble raised the activity in an area more posterior in the left STG, at the coordinates listed in Table 2.
The increase of activity in the left temporal lobe in the Low group exposed to running speech compared to babble is shown in Fig. 2, where it is contrasted with the marked change seen in the High group.


Fig. 1. Babble minus baseline contrast reveals common activations of superior temporal gyrus bilaterally in both groups (top panel, High group; bottom panel, Low group).

ROI analyses of raised activity in Low group

Using ROIs identified in the High group, we determined the magnitude of response in each subject of the Low group and performed simple t tests of the average change from zero, as shown for three different contrasts in Fig. 5: the t test of the magnitude of change at a site (PAC) identical to the site of white noise activation in the High group revealed significant blood flow increase in the left PAC and borderline activity in the right PAC. Borderline LIPC activation in the Low group was found in the words minus baseline contrast. ROI analysis of the speech minus babble contrast in the Low group revealed that responses did not differ significantly from zero in the LIPC and right temporal lobe, the cerebellum and left anterior STG.

Fig. 2. Activation of superior temporal lobe and left inferior prefrontal cortex in response to running speech, relative to multitalker babble, differs between the groups of CI users with high and low comprehension, as shown on the 3D rendition of sites of increased activity. Top panels: high-comprehension group. Bottom panels: low-comprehension group. Note that the increment at the site of right temporal activity in the Low group is insignificant. Insert: Transaxial and co-registered PET image of increased activity in left temporal plane of the Low group.

Fig. 3. The rCBF increase accompanying recognition and comprehension of speech (running speech minus multitalker babble). (Left) Group analysis reveals incremental activity of left hemisphere extending forward into the inferior frontal lobe (arrows). (Right) Bar diagram demonstrates results of individual analyses of relative change of rCBF at coordinates 45, 20, and 4.

Fig. 4. High group images of speech vs. baseline contrast (left panel, top and bottom) and simple t test of speech minus baseline increment from zero (right panel). Histogram shows activity at LIPC coordinates 45, 20, 4 (x, y and z mm) identified in speech vs. babble contrast (open bar) and applied to speech vs. baseline contrast (filled bar). The absence of significant difference between the increments of the two contrasts is indicated by the probability of a random difference (P = 0.98).

Group analysis of reduced activity

In all contrasts, except speech minus white noise, the High group had decreases in left or right cerebellum, or both. The Low group only had decreases in cerebellum in the speech minus white noise contrast. In the High group, other sites of significant decrease in all contrasts, except white noise minus baseline, included the extrastriate cortex. Decreases were also present in the left superior parietal and the posterior cingulate region, when the subtraction was running speech minus babble; the right middle frontal cortices when running speech was contrasted to either babble or baseline; and right middle temporal gyrus and parahippocampal gyrus when contrasted to white noise. The Low group also had decreases in extrastriate cortex when subjects listened to speech and babble and additionally in the anterior region of the temporal lobe when running speech was compared to babble, while the babble stimulus relative to baseline reduced activity in the left medial frontal gyrus. Words compared to baseline decreased the activity in the left medial occipitotemporal gyrus. These coordinates are all listed in Table 3.

Discussion

Unlike normally hearing individuals, patients with cochlear implants generally fail to show the left anterior perisylvian activation expected in brain imaging studies of word or speech comprehension, unless silent repetition is also required. In contrast, in the present study, we found that the LIPC indeed is incrementally active during speech comprehension by implantees. Furthermore, the temporal activity pattern demonstrated for speech and non-speech stimuli in the present study converges with previous studies of normal hearing.

In contrast to the High group, stimulation with white noise did not raise activity in the anatomical location of the PAC (Rivier and Clarke, 1997) in the group analysis of the low comprehenders. The absence of a change could reflect either a generally weaker signal in this group, or the inclusion of only four subjects in the white noise condition, as significant increase in blood flow was found on the left side at coordinates defined by the High group (Fig. 5). If the location of the PAC is defined as in three CI users in the study of Seghier et al. (2005), it is also possible that the activity observed lateral to the coordinates obtained from the white noise condition in all the other conditions minus baseline represents functional PAC, as this region may be activated in response to all auditory stimuli.

The study revealed important differences between two groups of cochlear implant users with low and high speech comprehension, respectively, in the imaging of responses to speech and non-speech stimuli. Unlike the low-comprehension group, subjects in the high-comprehension group had increased activity in cortical regions known to be involved in language processing in normally hearing individuals (Binder et al., 1997; Papathanassiou et al.,


Table 2
Regions of blood flow increase

Contrast                 Area                                       BA     Talairach x, y, z (mm)  t value

High comprehension
White noise – baselinea  Left superior temporal gyrus               41     48, 21, 3               4.6
                         Right superior temporal gyrus              41     46, 21, 8               4.5
Babble – baseline        Left superior temporal gyrus               42/22  59, 21, 5               7.1
                         Right superior temporal gyrus              42/22  60, 21, 5               6.8
Words – baseline         Left superior temporal gyrus               42/22  55, 21, 2               8.8
                         Left superior temporal gyrus (posterior)   22     59, 38, 5               6.9
                         Right superior temporal gyrus (posterior)  22     63, 26, 5               9.8
                         Right middle temporal gyrus (posterior)    21     68, 35, 15              4.7
                         Right superior temporal gyrus (anterior)   22     56, 5, 6                6.3
                         LIPC                                       44/45  45, 15, 4               5.1
Speech – babble          Left superior temporal gyrus (anterior)    22     56, 1, 7                9.8
                         Left superior temporal gyrus (posterior)   22     56, 24, 0               9.2
                         Left temporal pole                         38     40, 13, 22              5.4
                         Right superior temporal gyrus (anterior)   22     55, 5, 4                5.8
                         Right superior temporal gyrus (posterior)  22     58, 31, 0               4.7
                         LIPC                                       44/45  45, 20, 7               5.7
                         LIPC                                       44     50, 15, 13              4.2
                         Right cerebellum                                  30, 71, 19              6
Speech – white noisea    Left superior temporal gyrus (posterior)   22     60, 26, 2               12.6
                         Left superior temporal gyrus (anterior)    22     58, 11, 2               11.5
                         Left temporal pole                         38     38, 12, 22              4.7
                         Right superior temporal gyrus (anterior)   22     55, 11, 3               9.7
                         Right superior temporal gyrus (posterior)  22     60, 25, 3               7.9
                         LIPC                                       44/45  46, 17, 4               6.3
                         Right cerebellum                                  25, 80, 24              5.8
Speech – baseline        Left superior temporal gyrus               22     58, 23, 2               19.8
                         Right superior temporal gyrus              22     60, 14, 2               15.2
                         Left precentral gyrus                      4      47, 6, 42               5.5
                         Superior frontal gyrus                     6      9, 10, 58               4.9
                         Right cerebellum                                  9, 91, 21               5

Low comprehension
White noise – baselinea  No significant activity
Babble – baseline        Left superior temporal gyrus               42/22  62, 18, 5               5.3
                         Right superior temporal gyrus              42/22  63, 21, 6               5
                         Right superior temporal gyrus (anterior)   22     52, 1, 0                5.6
Words – baseline         Left superior temporal gyrus               42/22  63, 20, 5               7.1
                         Right superior temporal gyrus              42/22  64, 28, 9               5.6
                         Right superior temporal gyrus (anterior)   22     50, 6, 6                5.6
Speech – babble          Left superior temporal gyrus (posterior)   22     55, 38, 5               4.8
Speech – white noisea    Left superior temporal gyrus (posterior)   22     64, 26, 3               6.3
                         Right superior temporal gyrus              22     67, 16, 0               6.4
                         Right superior temporal gyrus (anterior)   22     56, 1, 9                5.2
Speech – baseline        Left superior temporal gyrus               42/22  60, 23, 5               6.6
                         Right superior temporal gyrus              22     64, 14, 2               5.5

Significant peak activations in groups of high- and low-comprehension cochlear implant users.
a Note that, in the low-comprehension group, white noise data were only obtained in 4 subjects.

2000). Three regions in particular emerged as active during high speech comprehension.

First, left inferior prefrontal activity increased as predicted. Recruitment of this left frontal region appears to depend on specific tuning of phonetic analysis in speech processing as the region is not activated in all auditory comprehension tasks (Demonet et al., 1992; Zatorre et al., 1996; Hickok and Poeppel, 2000; Noppeney and Price, 2002). Early studies revealed contributions of activity in the left frontal cortex to the execution of phonetic tasks (Zatorre et al., 1992, 1996). In a recent study in which subjects searched for phonetic cues in initially incomprehensible speech masked by complex noise, the activity in Broca's area increased after practice, implying recruitment of this area in response to increased numbers of recognizable cues of information or as a result of search for meaning (Giraud et al., 2004). The latter interpretation finds support in a study of hierarchical organization of the processes involved in speech comprehension, in which left inferior frontal gyrus activity was correlated with speech intelligibility. Insensitivity to the degree of acoustic distortion of the sentences suggests that frontal area activity represents more abstract non-acoustic processing (Davis and Johnsrude, 2003).


Fig. 5. Simple t tests of Low group increments from zero (filled bars) in three sets of contrasts, showing presence or absence of significant change in each set, at sites in the Low group identified by significant change in the High group (open bars). Contrasts include white noise minus baseline evaluated in primary auditory cortex (PAC), in which low comprehenders show no activation in the right hemisphere; words minus baseline evaluated in the LIPC, in which low comprehenders show borderline activation; and speech minus babble evaluated in the LIPC, anterior (left and right) and posterior (right) STG, and right cerebellum, at which sites no significant activations were identified in the Low group.

It has been shown that subjects distinguish a phoneme only when 75–100% of the formant information (frequency content) is present and that this occurs in association with changes of neuronal activity in the left hemisphere (Rinne et al., 1999). The findings may demonstrate that phonetic percepts in the neural networks of the left hemisphere, including regions ranging from the superior temporal lobe to the inferior frontal cortex, depend on sufficient acoustic (phonetic) information, which may not be available to an extent that gives rise to frontal activity in the Low group. Left frontal areas are recruited in the process of segmentation of speech sounds (Burton et al., 2000), and the LIPC may mediate matching to internal representations of phonemes in spoken language (Binder et al., 2004). According to the hypothesis of internal representations, an implantee would fail to comprehend speech if the segmented language elements of the speech signal either provide no fit to an internal representation or cannot be routed to the region executing the fits. It is true that the location of the absent functionality responsible for the inactivity of specific language-processing regions of the brains of low comprehenders cannot be known with certainty, but it must reside proximal to the observed sites of activation, which extend beyond the PAC into the left posterior STG when speech is contrasted with babble (Table 2). The activity in the lateral temporal cortex was shown to be unrelated to linguistic analysis but rather to be associated with the speech signal per se (Binder et al., 2000; Vouloumanos et al., 2001). Thus, it appears that low-comprehension subjects recognize the acoustic properties of speech (as they do understand a few words) but lack the capacity for sufficient linguistic analysis to extract coherent meaning.

The design of the present study does not allow us to identify which of the language processes the activity actually reflects, as the LIPC is known also to be involved in semantic generation and decision making (Tzourio et al., 1998; Papathanassiou et al., 2000; Devlin et al., 2003) and in episodic memory formation (Kohler et al., 2004).
Left inferior frontal (BA 45) activity is present in both auditory and visual semantic tasks, whereas simultaneous left posterior temporal and right-sided cerebellar activations seem specific to the auditory modality (Chee et al., 1999). The LIPC, in collaboration with more posterior temporal regions, is suggested to be actively involved in joint processing of multiple types of information (Gold and Buckner, 2002) and to serve a unification role in language processing (Hagoort, 2005).

Second, activation of anterior and posterior regions in the STG of the right hemisphere by the speech minus babble contrast occurred only in the High group. This finding may reflect the fact that five of the seven subjects were implanted on the left side. However, neither the side of auditory presentation nor the number of active electrodes had a significant effect on the brain regions activated in previous studies (Naito et al., 2000; Giraud et al., 2001). The right temporal lobe has previously been demonstrated to be more extensively activated in CI users (Wong et al., 1999; Giraud et al., 2000; Naito et al., 2000). In the High group, it is possible that right superior temporal areas are recruited by a more complex cognitive processing (Demonet et al., 1992); that low-level phonological processing, classically involving posterior right STG, is overly activated (Petersen et al., 1988; Price et al., 1992; Fiez et al., 1996); or that additional resources are needed to complete the initial analysis of sounds by a damaged auditory system, as the right superior temporal region appears to relate to increasing demands of information processing (Price et al., 1992). Finally, the right temporal activity could be an effect of the monaural stimulation, since both hemispheres have the capacity for prelexical processing (Scheffler et al., 1998; Mummery et al., 1999). Right temporal regions are also active during phoneme and word recognition (Demonet et al., 1992; Pedersen et al., 2000).


Table 3
Regions of blood flow decrease

Contrast                 Area                                  BA     Talairach x, y, z (mm)  t value

High comprehension
White noise – baselinea  Right cerebellum                             17, 74, 24              6
                         Left cerebellum                              12, 69, 27              4.5
Babble – baseline        Right cerebellum                             25, 68, 25              6.5
                         Left lateral occipitotemporal gyrus   19     35, 83, 23              5.6
                         Right lateral occipitotemporal gyrus  19     28, 74, 12              4.6
                         Left lingual gyrus                    18     23, 92, 20              5.5
Words – baseline         Right cerebellum                             31, 78, 23              7
                         Left cerebellum                              19, 77, 23              5.7
                         Left lingual gyrus                    18     17, 88, 20              6.6
Speech – babble          Left superior parietal lobule         7      5, 40, 48               6.2
                         Right middle frontal gyrus            8      34, 24, 44              5.2
                         Right precuneus                       7      4, 66, 39               5.4
                         Right cingulate region                29/30  4, 44, 18               5.1
                         Left cerebellum                              19, 80, 21              5.3
Speech – white noisea    Left cuneus                           18     4, 83, 13               5.8
                         Right precuneus                       7      1, 61, 40               4.9
                         Left lingual gyrus                    18     20, 87, 20              5.6
                         Right parahippocampal gyrus           30     16, 57, 9               5.2
                         Right middle temporal gyrus           39     46, 54, 20              5.1
Speech – baseline        Left lingual gyrus                    18     20, 86, 20              8.7
                         Right cuneus                          31     5, 67, 21               7.8
                         Right precuneus                       7      1, 59, 44               8.1
                         Left cerebellum                              20, 81, 21              7.8
                         Right cerebellum                             11, 68, 26              6.9
                         Right middle frontal gyrus            8      38, 30, 42              4.9

Low comprehension
White noise – baselinea  Left middle temporal gyrus            20     68, 21, 20              4.1 ns
                         Right middle temporal gyrus           21     60, 47, 12              3.7 ns
Babble – baseline        Left lingual gyrus                    18     5, 74, 9                5.8
                         Left cuneus                           18     0, 85, 17               5.2
                         Left precuneus                        23     1, 57, 18               5.7
                         Left medial frontal gyrus             25     3, 15, 18               5
Words – baseline         Left medial occipitotemporal gyrus    19     15, 42, 9               4.7
Speech – babble          Left temporal pole                    38     47, 20, 21              4.8
                         Left lingual gyrus                    18     25, 87, 18              5.3
Speech – white noisea    Right cerebellum                             4, 47, 15               5.9
                         Right fronto-orbital gyrus            11     13, 37, 24              4.8
Speech – baseline        Left lingual gyrus                    18     16, 76, 5               4.8
                         Right cuneus                          31     7, 74, 21               4.68

ns = not significant.
Significant peak deactivations in groups of high- and low-comprehension cochlear implant users.
a Note that, in the low-comprehension group, white noise data were only obtained in 4 subjects.

Both the area involved in the perception of words and the more anterior area involved only in complex phoneme analysis became more active when high comprehenders listened to monosyllabic words and running speech. This anterior area may play a role in early tentative phoneme analysis; it was active when the subjects of the Low group listened to babble in contrast to baseline and to speech in contrast to white noise.

Third, significant activity appeared in the left anterior STG, in addition to the more posterior temporal activity, when running speech was contrasted with babble or white noise, or words with baseline. These activations resemble observations in normally hearing subjects during passive listening, where they are associated with the processing of intelligible speech (Scott et al., 2000; Narain et al., 2003), although the right anterior STG in particular has been reported to be involved in speech comprehension (Giraud et al., 2004).

Exposure to speech consistently raises activity bilaterally in the posterior part of the STG of normally hearing individuals (Hickok and Poeppel, 2000). In the High group, however, subtracting non-speech from speech revealed extensive bilateral activity, in agreement with previously reported foci of increased temporal activity in implantees compared to normally hearing subjects (Wong et al., 1999; Naito et al., 2000; Giraud et al., 2001), with the salient exception of activity in the left anterior STG.

Cerebellum

Blood flow increased significantly in the right cerebellar cortex of the High group when running speech was comprehended relative to babble, as listed in Table 2. Lateralized cerebellar activations also appeared in early studies of word generation (Raichle et al., 1994) and speech comprehension (Binder et al., 1997; Papathanassiou et al., 2000), the latter believed to imply non-motor, i.e. cognitive, work of the cerebellum in language processing subserving verbal working memory (Desmond et al., 1997). However, speech perception is impaired in patients with cerebellar dysfunction (Ackermann et al., 1997, 1999), and new evidence (Mathiak et al., 2002) demonstrates a distinct contribution of the right cerebellar hemisphere (at almost the same coordinates as in the present study) to the precise representation of temporal information for phonetic processing. Temporal coding is important for speech information in CI users with limited place coding (Moller, 1999; Kral, 2000), but it cannot be ruled out that the cerebellar activity in the High group reflects semantic processing (Chee et al., 1999; Roskies et al., 2001; Noppeney and Price, 2002).

Deactivations

Activity decreased in extrastriate regions in the subjects of both groups. Based on the lip-reading competence of the subjects (Table 1), we speculate that this decrease represents a down-regulation of activity normally stimulated when subjects recruit the modality of vision to the task of perceiving sound (Calvert et al., 1997). All subjects had received intensive training in lip-reading, resulting in a median lip-reading competence of 55%, which no normally hearing individual can match without training. Extrastriate regions regulate activity in visual cortex by backprojection from V5 to V1 (Pascual-Leone and Walsh, 2001); as the superior parietal region is recruited by directed visual attention (Corbetta et al., 1998), its deactivation may reflect missing visual cues.
We observed decreased activity in several regions thought to subserve attention (right middle frontal gyrus, precuneus and cingulate region) in subjects of the High group in the speech minus babble contrast, most likely reflecting decreased default activity in these regions when this group was engaged in the goal-oriented task of comprehending babble (Gusnard et al., 2001). In subjects of the Low group perceiving the most demanding stimuli, the decrease in the left temporal pole (BA 38), a region implicated in semantics (Hodges and Patterson, 1997; Noppeney and Price, 2002), points to a task-induced reallocation of processing resources from areas of task-induced deactivation to areas involved in task performance (McKiernan et al., 2003).

Limitations

The contingencies of this study necessarily presented the authors with limitations of subject selection, study design, methods and results, which affect the interpretations. The cochlear implants ruled out the use of individual MR images for anatomical coregistration. As the study involved two groups of subjects with different speech comprehension, direct comparisons between the groups could have been of interest. However, the study was intended to test the brain activation associated with speech comprehension and thus, in principle, did not strongly focus on subjects with poor speech comprehension. Although the presence of the latter subjects made a direct comparison possible in principle, the different numbers of subjects would make it error-prone.

To rule out the possibility that low comprehenders had borderline activations that did not quite reach significance at the coordinates of significant High group activations, but could have done so if additional low comprehenders had been included, we evaluated the individual magnitudes of change in the low comprehenders by simple t tests. If the low comprehenders individually had shown variable changes in the direction of activation, the addition of more subjects could have led to greater significance. The results of the simple t tests support the claim that low speech comprehension is associated with an absent increase of blood flow in the LIPC when running speech is contrasted with babble, and that the different group sizes had no influence on the strength of this result.

Recent analyses of brain function increasingly consider the interactions among networks of neuronal ensembles (Raichle et al., 2001; Koechlin et al., 2003). In the light of this revision, the focus of the present study on incremental activity at discrete sites identified by group analysis of the cerebral cortex could be considered a limitation. However, the focus of the present study was the unexplained lack of LIPC activation in speech comprehension after cochlear implantation. This focus does not conflict with the general position that absent incremental activity does not signify a lack of brain work at these sites, nor with the view that the incremental activity is no more important or relevant than the underlying baseline activity.
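The per-subject evaluation described above amounts to a one-sample t test of each low comprehender's normalized blood-flow changes against zero at the High-group peak coordinates. A minimal sketch of such a test follows; the function name and the dCBF values are illustrative assumptions, not data from the study:

```python
import math

def one_sample_t(changes):
    """t statistic for the mean change across repeated scans of one
    subject, tested against a null hypothesis of zero change."""
    n = len(changes)
    mean = sum(changes) / n
    # Unbiased sample variance (n - 1 in the denominator).
    var = sum((c - mean) ** 2 for c in changes) / (n - 1)
    # t = mean / standard error of the mean.
    return mean / math.sqrt(var / n)

# Hypothetical per-scan dCBF values (speech minus babble) at the LIPC
# peak for one low comprehender: the t value stays near zero, i.e. no
# evidence of a borderline activation at this site.
low_subject = [0.4, -0.2, 0.1, -0.3, 0.2]
print(one_sample_t(low_subject))
```

A subject with a consistent positive change would instead yield a large t value, which is the pattern the authors report not finding in the Low group.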

Conclusion

The present PET study tested brain activation patterns in two groups of postlingually deaf adults listening to speech and non-speech stimuli. The groups were identical with respect to pre-implantation data but differed in their post-implantation speech perception. Complex input signals generated by the cochlear implant reached the auditory cortex in both groups. Unlike good speech comprehenders, poor comprehenders of running speech engaged no right temporal or left inferior prefrontal regions relative to babble, although they did have activity posteriorly in the left superior temporal lobe. The study clearly demonstrates that speech processing recruits regions of the LIPC, in addition to the primary auditory cortex and posterior STG, in cochlear implant users with restored speech comprehension.

Acknowledgments

This research was supported by The National Association of Hearing Impaired in Denmark, Desirée and Niels Ydes Foundation and the Danish National Research Foundation's Center of Functionally Integrative Neuroscience. We appreciate the help of Anders B. Rodell, PhD, with average MRI images and Stig Madsen with sound stimuli.

References

Ackermann, H., Graber, S., Hertrich, I., Daum, I., 1997. Categorical speech perception in cerebellar disorders. Brain Lang. 60, 323–331.
Ackermann, H., Graber, S., Hertrich, I., Daum, I., 1999. Cerebellar contributions to the perception of temporal cues within the speech and nonspeech domain. Brain Lang. 67, 228–241.
Amunts, K., Schleicher, A., Burgel, U., Mohlberg, H., Uylings, H.B., Zilles, K., 1999. Broca's region revisited: cytoarchitecture and intersubject variability. J. Comp. Neurol. 412, 319–341.

Binder, J.R., Frost, J.A., Hammeke, T.A., Cox, R.W., Rao, S.M., Prieto, T., 1997. Human brain language areas identified by functional magnetic resonance imaging. J. Neurosci. 17, 353–362.
Binder, J.R., Frost, J.A., Hammeke, T.A., Bellgowan, P.S., Springer, J.A., Kaufman, J.N., Possing, E.T., 2000. Human temporal lobe activation by speech and nonspeech sounds. Cereb. Cortex 10, 512–528.
Binder, J.R., Liebenthal, E., Possing, E.T., Medler, D.A., Ward, B.D., 2004. Neural correlates of sensory and decision processes in auditory object identification. Nat. Neurosci. 7, 295–301.
Burton, M.W., Small, S.L., Blumstein, S.E., 2000. The role of segmentation in phonological processing: an fMRI investigation. J. Cogn. Neurosci. 12, 679–690.
Calvert, G.A., Bullmore, E.T., Brammer, M.J., Campbell, R., Williams, S.C., McGuire, P.K., Woodruff, P.W., Iversen, S.D., David, A.S., 1997. Activation of auditory cortex during silent lipreading. Science 276, 593–596.
Chee, M.W., O'Craven, K.M., Bergida, R., Rosen, B.R., Savoy, R.L., 1999. Auditory and visual word processing studied with fMRI. Hum. Brain Mapp. 7, 15–28.
Corbetta, M., Akbudak, E., Conturo, T.E., Snyder, A.Z., Ollinger, J.M., Drury, H.A., Linenweber, M.R., Petersen, S.E., Raichle, M.E., Van Essen, D.C., Shulman, G.L., 1998. A common network of functional areas for attention and eye movements. Neuron 21, 761–773.
Davis, M.H., Johnsrude, I.S., 2003. Hierarchical processing in spoken language comprehension. J. Neurosci. 23, 3423–3431.
Demonet, J.F., Chollet, F., Ramsay, S., Cardebat, D., Nespoulous, J.L., Wise, R., Rascol, A., Frackowiak, R., 1992. The anatomy of phonological and semantic processing in normal subjects. Brain 115 (Pt. 6), 1753–1768.
Desmond, J.E., Gabrieli, J.D., Wagner, A.D., Ginier, B.L., Glover, G.H., 1997. Lobular patterns of cerebellar activation in verbal working-memory and finger-tapping tasks as revealed by functional MRI. J. Neurosci. 17, 9675–9685.
Devlin, J.T., Matthews, P.M., Rushworth, M.F., 2003. Semantic processing in the left inferior prefrontal cortex: a combined functional magnetic resonance imaging and transcranial magnetic stimulation study. J. Cogn. Neurosci. 15, 71–84.
Elberling, C., Ludvigsen, C., Lyregaard, P.E., 1989. DANTALE: a new Danish speech material. Scand. Audiol. 18, 169–175.
Fiez, J.A., Raichle, M.E., Balota, D.A., Tallal, P., Petersen, S.E., 1996. PET activation of posterior temporal regions during auditory word presentation and verb generation. Cereb. Cortex 6, 1–10.
Friederici, A.D., Opitz, B., von Cramon, D.Y., 2000. Segregating semantic and syntactic aspects of processing in the human brain: an fMRI investigation of different word types. Cereb. Cortex 10, 698–705.
Giraud, A.L., Truy, E., Frackowiak, R.S., Gregoire, M.C., Pujol, J.F., Collet, L., 2000. Differential recruitment of the speech processing system in healthy subjects and rehabilitated cochlear implant patients. Brain 123 (Pt. 7), 1391–1402.
Giraud, A.L., Price, C.J., Graham, J.M., Frackowiak, R.S., 2001. Functional plasticity of language-related brain areas after cochlear implantation. Brain 124, 1307–1316.
Giraud, A.L., Kell, C., Thierfelder, C., Sterzer, P., Russ, M.O., Preibisch, C., Kleinschmidt, A., 2004. Contributions of sensory input, auditory search and verbal comprehension to cortical activity during speech processing. Cereb. Cortex 14, 247–255.
Gold, B.T., Buckner, R.L., 2002. Common prefrontal regions coactivate with dissociable posterior regions during controlled semantic and phonological tasks. Neuron 35, 803–812.
Gusnard, D.A., Akbudak, E., Shulman, G.L., Raichle, M.E., 2001. Medial prefrontal cortex and self-referential mental activity: relation to a default mode of brain function. Proc. Natl. Acad. Sci. U. S. A. 98, 4259–4264.
Hagoort, P., 2005. On Broca, brain, and binding: a new framework. Trends Cogn. Sci. 9, 416–423.
Herzog, H., Lamprecht, A., Kuhn, A., Roden, W., Vosteen, K.H., Feinendegen, L.E., 1991. Cortical activation in profoundly deaf patients during cochlear implant stimulation demonstrated by H2(15)O PET. J. Comput. Assist. Tomogr. 15, 369–375.
Hickok, G., Poeppel, D., 2000. Towards a functional neuroanatomy of speech perception. Trends Cogn. Sci. 4, 131–138.
Hodges, J.R., Patterson, K., 1997. Semantic memory disorders. Trends Cogn. Sci. 1, 68–72.
ICRA, 1997. ICRA noise signals ver. 0.3. International Collegium of Rehabilitative Audiology (sound recording).
Kei, J., Smyth, V., Murdoch, B., McPherson, B., 2000. Measuring the understanding of sentences by hearing-impaired children: comparison with connected discourse ratings. Audiology 39, 38–49.
Koechlin, E., Ody, C., Kouneiher, F., 2003. The architecture of cognitive control in the human prefrontal cortex. Science 302, 1181–1185.
Kohler, S., Paus, T., Buckner, R.L., Milner, B., 2004. Effects of left inferior prefrontal stimulation on episodic memory formation: a two-stage fMRI-rTMS study. J. Cogn. Neurosci. 16, 178–188.
Kral, A., 2000. Temporal code and speech recognition. Acta Oto-Laryngol. 120, 529–530.
Ludvigsen, C., 1974. Construction and evaluation of an audio-visual test, the Helen-test. Scand. Audiol., Suppl. 3, 67–75.
Mathiak, K., Hertrich, I., Grodd, W., Ackermann, H., 2002. Cerebellum and speech perception: a functional magnetic resonance imaging study. J. Cogn. Neurosci. 14, 902–912.
McKiernan, K.A., Kaufman, J.N., Kucera-Thompson, J., Binder, J.R., 2003. A parametric manipulation of factors affecting task-induced deactivation in functional neuroimaging. J. Cogn. Neurosci. 15, 394–408.
Moller, A.R., 1999. Review of the roles of temporal and place coding of frequency in speech discrimination. Acta Oto-Laryngol. 119, 424–430.
Mummery, C.J., Ashburner, J., Scott, S.K., Wise, R.J., 1999. Functional neuroimaging of speech perception in six normal and two aphasic subjects. J. Acoust. Soc. Am. 106, 449–457.
Naito, Y., Okazawa, H., Honjo, I., Hirano, S., Takahashi, H., Shiomi, Y., Hoji, W., Kawano, M., Ishizu, K., Yonekura, Y., 1995. Cortical activation with sound stimulation in cochlear implant users demonstrated by positron emission tomography. Brain Res. Cogn. Brain Res. 2, 207–214.
Naito, Y., Hirano, S., Honjo, I., Okazawa, H., Ishizu, K., Takahashi, H., Fujiki, N., Shiomi, Y., Yonekura, Y., Konishi, J., 1997. Sound-induced activation of auditory cortices in cochlear implant users with post- and prelingual deafness demonstrated by positron emission tomography. Acta Oto-Laryngol. 117, 490–496.
Naito, Y., Tateya, I., Fujiki, N., Hirano, S., Ishizu, K., Nagahama, Y., Fukuyama, H., Kojima, H., 2000. Increased cortical activation during hearing of speech in cochlear implant users. Hear. Res. 143, 139–146.
Narain, C., Scott, S.K., Wise, R.J., Rosen, S., Leff, A., Iversen, S.D., Matthews, P.M., 2003. Defining a left-lateralized response specific to intelligible speech using fMRI. Cereb. Cortex 13, 1362–1368.
Noppeney, U., Price, C.J., 2002. A PET study of stimulus- and task-induced semantic processing. NeuroImage 15, 927–935.
Papathanassiou, D., Etard, O., Mellet, E., Zago, L., Mazoyer, B., Tzourio-Mazoyer, N., 2000. A common language network for comprehension and production: a contribution to the definition of language epicenters with PET. NeuroImage 11, 347–357.
Parving, A., Christensen, B., Salomon, G., Pedersen, C.B., Friberg, L., 1995. Regional cerebral activation during auditory stimulation in patients with cochlear implants. Arch. Otolaryngol., Head Neck Surg. 121, 438–444.
Pascual-Leone, A., Walsh, V., 2001. Fast backprojections from the motion to the primary visual area necessary for visual awareness. Science 292, 510–512.
Pedersen, C.B., Mirz, F., Ovesen, T., Ishizu, K., Johannsen, P., Madsen, S., Gjedde, A., 2000. Cortical centres underlying auditory temporal processing in humans: a PET study. Audiology 39, 30–37.
Petersen, S.E., Fox, P.T., Posner, M.I., Mintun, M., Raichle, M.E., 1988. Positron emission tomographic studies of the cortical anatomy of single-word processing. Nature 331, 585–589.


Poldrack, R.A., Wagner, A.D., Prull, M.W., Desmond, J.E., Glover, G.H., Gabrieli, J.D., 1999. Functional specialization for semantic and phonological processing in the left inferior prefrontal cortex. NeuroImage 10, 15–35.
Price, C., Wise, R., Ramsay, S., Friston, K., Howard, D., Patterson, K., Frackowiak, R., 1992. Regional response differences within the human auditory cortex when listening to words. Neurosci. Lett. 146, 179–182.
Price, C.J., Wise, R.J., Frackowiak, R.S., 1996. Demonstrating the implicit processing of visually presented words and pseudowords. Cereb. Cortex 6, 62–70.
Raichle, M.E., Martin, W.R., Herscovitch, P., Mintun, M.A., Markham, J., 1983. Brain blood flow measured with intravenous H2(15)O: II. Implementation and validation. J. Nucl. Med. 24, 790–798.
Raichle, M.E., Fiez, J.A., Videen, T.O., MacLeod, A.M., Pardo, J.V., Fox, P.T., Petersen, S.E., 1994. Practice-related changes in human brain functional anatomy during nonmotor learning. Cereb. Cortex 4, 8–26.
Raichle, M.E., MacLeod, A.M., Snyder, A.Z., Powers, W.J., Gusnard, D.A., Shulman, G.L., 2001. A default mode of brain function. Proc. Natl. Acad. Sci. U. S. A. 98, 676–682.
Rinne, T., Alho, K., Alku, P., Holi, M., Sinkkonen, J., Virtanen, J., Bertrand, O., Naatanen, R., 1999. Analysis of speech sounds is left-hemisphere predominant at 100–150 ms after sound onset. NeuroReport 10, 1113–1117.
Rivier, F., Clarke, S., 1997. Cytochrome oxidase, acetylcholinesterase, and NADPH-diaphorase staining in human supratemporal and insular cortex: evidence for multiple auditory areas. NeuroImage 6, 288–304.
Roland, P.E., 1993. The frontal lobes and limbic system. Brain Activation. Wiley-Liss, Inc., pp. 341–364.
Roskies, A.L., Fiez, J.A., Balota, D.A., Raichle, M.E., Petersen, S.E., 2001. Task-dependent modulation of regions in the left inferior frontal cortex during semantic processing. J. Cogn. Neurosci. 13, 829–843.
Scheffler, K., Bilecen, D., Schmid, N., Tschopp, K., Seelig, J., 1998. Auditory cortical responses in hearing subjects and unilateral deaf patients as detected by functional magnetic resonance imaging. Cereb. Cortex 8, 156–163.
Scott, S.K., Blank, C.C., Rosen, S., Wise, R.J., 2000. Identification of a pathway for intelligible speech in the left temporal lobe. Brain 123 (Pt. 12), 2400–2406.
Seghier, M.L., Lazeyras, F., Pegna, A.J., Annoni, J.M., Zimine, I., Mayer, E., Michel, C.M., Khateb, A., 2004. Variability of fMRI activation during a phonological and semantic language task in healthy subjects. Hum. Brain Mapp. 23, 140–155.
Seghier, M.L., Boex, C., Lazeyras, F., Sigrist, A., Pelizzone, M., 2005. fMRI evidence for activation of multiple cortical regions in the primary auditory cortex of deaf subjects users of multichannel cochlear implants. Cereb. Cortex 15, 40–48.
Talairach, J., Tournoux, P., 1988. A Co-planar Stereotaxic Atlas of the Human Brain. Thieme, Stuttgart.
Teig, E., Lindeman, H.H., Flottorp, G., Tvete, O., Hanche-Olsen, S., Arntsen, O., 1992. Patient performance with two types of multiple electrode intracochlear implant. Scand. Audiol. 21, 93–99.
Tzourio, N., Crivello, F., Mellet, E., Nkanga-Ngila, B., Mazoyer, B., 1998. Functional anatomy of dominance for speech comprehension in left handers vs right handers. NeuroImage 8, 1–16.
Vouloumanos, A., Kiehl, K.A., Werker, J.F., Liddle, P.F., 2001. Detection of sounds in the auditory stream: event-related fMRI evidence for differential activation to speech and nonspeech. J. Cogn. Neurosci. 13, 994–1005.
Wong, D., Miyamoto, R.T., Pisoni, D.B., Sehgal, M., Hutchins, G.D., 1999. PET imaging of cochlear-implant and normal-hearing subjects listening to speech and nonspeech. Hear. Res. 132, 34–42.
Woods, R.P., Cherry, S.R., Mazziotta, J.C., 1992. Rapid automated algorithm for aligning and reslicing PET images. J. Comput. Assist. Tomogr. 16, 620–633.
Worsley, K.J., Evans, A.C., Marrett, S., Neelin, P., 1992. A three-dimensional statistical analysis for CBF activation studies in human brain. J. Cereb. Blood Flow Metab. 12, 900–918.
Zatorre, R.J., Evans, A.C., Meyer, E., Gjedde, A., 1992. Lateralization of phonetic and pitch discrimination in speech processing. Science 256, 846–849.
Zatorre, R.J., Meyer, E., Gjedde, A., Evans, A.C., 1996. PET studies of phonetic processing of speech: review, replication, and reanalysis. Cereb. Cortex 6, 21–30.