LANGUAGE
Cortical correlates of lexical and syntactic sign language processing
Mairead MacSweeney*¶, Bencie Woll‡, Michael Brammer†, Ruth Campbell*, Anthony David†, Steve C.R. Williams†, Philip K. McGuire†, Gemma Calvert§
*Department of Human Communication Science, University College London; †Institute of Psychiatry, London; ‡Department of Language and Communication, City University, London; §FMRIB Centre, University of Oxford; ¶Institute of Child Health, University College London.

Introduction
The processing of written sentences is left lateralised, recruiting the left angular gyrus and Wernicke's area (Caplan et al., 2000). However, syntactic processing of spoken language also involves areas in the right hemisphere (Friederici et al., 2000). This may reflect the incorporation of prosody in spoken sentences. It has been suggested that the bilateral activation observed when native signers process American Sign Language sentences (Neville et al., 1998) may also be due to the prosody of the sign language input (Paulesu & Mehler, 1998). If this were the case, greater activation of right hemisphere language areas would be predicted during comprehension of sentences than of single signs. No neuroimaging study to date has directly compared the cortical regions involved in lexical and syntactic sign language processing. We therefore contrasted comprehension of British Sign Language (BSL) sentences with comprehension of BSL single signs.

Methods
Deaf (n=9) and hearing (n=9) native signers and hearing non-signers (n=10) were tested. Participants performed 21-second blocks of two experimental tasks interspersed with a baseline task.
BSL sentence comprehension - Participants watched a video of a native Deaf signer signing 5 sentences. Their task was to make a button-press response to a semantically incorrect sentence (1 of 5).
Single sign comprehension - Participants watched the production of 10 single signs. One was a pseudo-sign, to which they made a button-press response.
Baseline condition - A small visual cue was digitally superimposed onto the chin of the still signer. Five cues appeared in each block (4 black, 1 grey). Participants made a button-press response to the grey cue.
140 T2*-weighted images depicting BOLD contrast were acquired with a slice thickness of 7 mm (0.7 mm gap) on a 1.5 T scanner. Fourteen axial slices were acquired in each volume to cover the whole brain (TR=3 s, TE=40 ms). An inversion recovery EPI dataset was also acquired to facilitate registration. Non-parametric methods were used to analyse the fMRI data (see Bullmore et al., 2000).
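For illustration only, the sketch below (Python/NumPy) works through the block timing implied by these acquisition parameters: at TR=3 s, 140 volumes span 420 s, i.e. 20 epochs of 21 s (7 volumes each). The ordering of sentence, single-sign and baseline epochs, and the regressor names, are assumptions made for the sketch, not details reported in the abstract.

# Minimal sketch of the block-design timing implied by the acquisition
# parameters above. The epoch ordering (baseline alternating with the two
# experimental tasks) is an assumption for illustration only.
import numpy as np

TR = 3.0           # seconds per volume
N_VOLS = 140       # T2*-weighted volumes acquired
EPOCH_S = 21.0     # duration of each task/baseline block in seconds

vols_per_epoch = int(EPOCH_S / TR)    # 7 volumes per 21-s epoch
n_epochs = N_VOLS // vols_per_epoch   # 20 epochs in the 420-s run

# Hypothetical condition order: baseline (0) preceding sentences (A)
# and single signs (B) in alternation.
order = (["baseline", "sentences", "baseline", "single_signs"]
         * (n_epochs // 4 + 1))[:n_epochs]

# Boxcar regressors, volume by volume: one column per experimental condition.
design = np.zeros((N_VOLS, 2))
for i, cond in enumerate(order):
    rows = slice(i * vols_per_epoch, (i + 1) * vols_per_epoch)
    if cond == "sentences":
        design[rows, 0] = 1.0
    elif cond == "single_signs":
        design[rows, 1] = 1.0

print(design.sum(axis=0))  # volumes assigned to each experimental condition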
Results and Discussion
In support of Neville et al., native signers showed extensive activation in Broca's and Wernicke's areas and their right hemisphere homologues when processing BSL sentences (in comparison to baseline). These areas were also recruited during single sign processing. This suggests that the right hemisphere involvement in sign language processing cannot be wholly attributed to the comparison of direct language input (ASL) with written English, as in Neville et al. The direct contrast between conditions supports this argument, since greater activation was observed in the left hemisphere in Broca's and Wernicke's areas during comprehension of sentences rather than single signs. These data support previous reports of bilateral activation during sign language processing. However, the prediction that processing sentences would lead to greater right hemisphere activation was not supported. Reasons for this will be discussed.

References
Bullmore et al. (2000). Human Brain Mapping, 12, 61-78.
Caplan et al. (2000). Brain and Language, 74, 400-402.
Friederici et al. (2000). Brain and Language, 75, 465-477.
Neville et al. (1998). PNAS, 95, 922-929.
Paulesu & Mehler (1998). Nature, 392, 233-234.