Interaction between auditory and visual stimulus relating to the vowel sounds in the auditory cortex in humans: a magnetoencephalographic study




International Congress Series 1278 (2005) 177–180


Kensaku Miki a,b,*, Shoko Watanabe b, Ryusuke Kakigi b

a Japan Society for the Promotion of Science, Japan
b Department of Integrative Physiology, National Institute for Physiological Sciences, 38 Nishigonaka Myoudaiji, Okazaki, Aichi 444-8585, Japan

Abstract. We investigated differences in the peak latency, maximum amplitude, dipole location, and dipole moment of M100 following the vowel sound /a/ under two conditions: (a) a static face with a closed mouth, and (b) the same face with mouth movement, produced by apparent motion, that appeared to pronounce /a/. There were no significant differences in M100 between the two conditions. These results indicate that vowel sound perception in the auditory cortex was not affected by mouth movement, at least in the primary stage of acoustic information processing. © 2004 Published by Elsevier B.V.

Keywords: Magnetoencephalography; MEG; Speech; Interaction; Auditory; Visual; Mouth movement; Vowel sounds; M100; Heschl's gyrus

1. Introduction

In our daily lives, the integration of visual and auditory stimuli is very important, especially for speech perception [1]. Interactions between visual and auditory stimuli have been reported in human studies using functional magnetic resonance imaging (fMRI) [2], positron emission tomography (PET), electroencephalography (EEG), and magnetoencephalography (MEG). We therefore investigated, using MEG, which has high temporal and spatial resolution, whether the activity of the auditory cortex is affected by visual motion stimuli. In this study, we focused on the early stage of acoustic information processing by analysing M100, a component reflecting early activity in the auditory cortex.


Fig. 1. Two stimulus conditions.

We used apparent motion as the visual stimulus, as in our previous study [3]. We have already reported this study in full [4]; this article summarizes it.

2. Methods

We studied 10 right-handed subjects [9 males and 1 female, mean age 32.2 years] with normal auditory acuity and normal or corrected visual acuity. All subjects gave informed consent to participate in this experiment, which was approved by the Ethics Committee of the National Institute for Physiological Sciences.

As the auditory stimulus, we used one vowel sound, /a/, spoken by a Japanese female. The visual stimuli were as follows: (1) a face with a closed mouth (S1, S2a and S3 in Fig. 1); (2) the same face with an opened mouth (S2b in Fig. 1); (3) a filler image made by dividing stimulus (1) into segments and randomizing their positions (Fig. 1). We used apparent motion, in which S1 was replaced by S2a or S2b and then by S3 with no interstimulus interval. S1, S2a, and S2b were presented for 800 ms, S3 for 400 ms, and the filler for a random duration between 600 and 800 ms. The vowel sound /a/ was presented for 240 ms at S2a or S2b onset, and we compared the following two conditions: (1) A (AUDITORY): S2a was presented, and no subject perceived speech motion; (2) M & A (MOTION and AUDITORY): S2b was presented, and all subjects perceived speech motion.
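To make the trial structure concrete, the following is a minimal sketch (in Python; not the authors' presentation code) of one trial as a sequence of (image, duration in ms) pairs. The image labels and the use of the random module are assumptions for illustration only.

```python
# Hypothetical sketch of one trial of the apparent-motion sequence described
# above; image labels are invented placeholders, not the authors' stimulus files.
import random

def build_trial(condition):
    """Return the stimulus timeline for one trial.

    condition -- 'A' (closed mouth, S2a) or 'M&A' (opened mouth, S2b).
    """
    s2 = "S2a_closed_mouth" if condition == "A" else "S2b_opened_mouth"
    return [
        ("S1_closed_mouth", 800),   # static face, closed mouth (800 ms)
        (s2, 800),                  # vowel /a/ (240 ms) starts at this onset
        ("S3_closed_mouth", 400),   # closed mouth again (400 ms)
        ("filler_scrambled", random.randint(600, 800)),  # randomized filler
    ]

print(build_trial("M&A"))
```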

Fig. 2. Waveforms evoked in Subject 1 under the two conditions, A (blue line) and M & A (red line). Waveform 1 is from the channel showing the maximum amplitude in the right hemisphere, and waveform 2 from the corresponding channel in the left hemisphere. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)


Table 1
The maximum amplitude and peak latency of M100 in the two conditions

                A                                  M & A
                Latency (ms)   Amplitude (fT/cm)   Latency (ms)   Amplitude (fT/cm)
Right (n=10)    86.3 ± 10.4    135.7 ± 25.0        88.6 ± 9.5     134.5 ± 23.3
Left (n=9)      90.0 ± 5.8     121.5 ± 42.2        92.9 ± 6.9     119.0 ± 40.0

The visual stimuli were presented by a personal computer and a video projector housed outside the magnetically shielded room, and the auditory stimulus was presented to both ears through plastic tubes and earpieces at a level above 70 dB. Visual stimuli were projected centrally, and subjects gazed at a cross located at the top of the nose of the face stimulus. The visual angle was 6.9° × 6.9°. We used a whole-head 306-channel biomagnetometer (204 gradiometers and 102 magnetometers; VectorView, Elekta Neuromag Oy, Helsinki, Finland). In this study, we analyzed the results obtained from the 204 gradiometers. MEG and vertical and horizontal electrooculograms (EOGs) were recorded simultaneously with a 0.1–50 Hz bandpass filter and a sampling rate of 998 Hz. Epochs in which signal variations exceeded 3 pT in MEG amplitude or 150 μV in EOG amplitude were excluded from the averaged data. MEG data were averaged and analyzed from 100 ms before to 150 ms after S2 onset, with the 100 ms preceding S2 onset used as the baseline.

We investigated in detail the component with the maximum amplitude across the 204 gradiometers. We then estimated the dipole location and moment from 14 to 20 channels around the channel showing the maximum amplitude. We accepted only dipoles fulfilling the following criteria: (1) the goodness of fit was more than 95%; (2) the dipole was located in Heschl's gyrus (HG); (3) the dipole location was stable within 0.5 cm from 5 ms before to 5 ms after the peak latency. We used paired t-tests to assess the significance of differences in the maximum amplitude, peak latency, dipole location, and moment between the two conditions; p < 0.05 was considered significant.
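As an illustration only, the recording, artifact-rejection, and averaging steps described above could be reproduced in a modern toolbox such as MNE-Python. This is a hedged sketch, not the software used in the study; the file name, trigger codes, and the conversion of the 3 pT rejection criterion into gradiometer units (T/m) are assumptions.

```python
# Minimal reanalysis sketch with MNE-Python (assumed tooling, not the original).
import mne

raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)  # hypothetical file
raw.filter(0.1, 50.0)                       # band-pass 0.1-50 Hz, as in the paper

events = mne.find_events(raw)               # S2 onsets assumed to carry triggers
event_id = {"A": 1, "M&A": 2}               # assumed trigger codes
picks = mne.pick_types(raw.info, meg="grad", eog=True)  # 204 gradiometers + EOG

epochs = mne.Epochs(
    raw, events, event_id,
    tmin=-0.1, tmax=0.15,                   # 100 ms pre- to 150 ms post-S2 onset
    baseline=(-0.1, 0.0),                   # 100 ms pre-stimulus baseline
    picks=picks,
    reject=dict(grad=3000e-13,              # ~3 pT/cm in T/m (unit assumption)
                eog=150e-6),                # 150 uV EOG rejection criterion
)
evoked_a = epochs["A"].average()            # averaged response, condition A
evoked_ma = epochs["M&A"].average()         # averaged response, condition M & A
```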

3. Results

We investigated the maximum amplitude and peak latency of the prominent component, M100, which was evoked by the auditory stimulus about 90 ms after stimulus onset (Fig. 2).

Fig. 3. Subject 1's dipoles estimated in the two conditions, A (blue circles) and M & A (red circles), overlaid on Subject 1's MRI images. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)
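The numerical parts of the dipole acceptance criteria from the Methods, goodness of fit and location stability, can be expressed as a small check. The sketch below uses simplified inputs that are assumptions, not a real dipole-fitting API; criterion (2), location in HG, still requires overlaying the dipole on the subject's MRI as in Fig. 3.

```python
# Illustrative check of criteria (1) and (3) from the Methods; the input
# arrays are simplified assumptions, not the output of any specific software.
import numpy as np

def accept_dipole(gof_percent, locations_cm, times_ms, peak_ms):
    """gof_percent: goodness of fit (%) at the peak latency;
    locations_cm: (n, 3) fitted dipole positions over time;
    times_ms: fit times; peak_ms: M100 peak latency."""
    if gof_percent <= 95.0:                 # criterion (1): GOF > 95%
        return False
    window = (times_ms >= peak_ms - 5) & (times_ms <= peak_ms + 5)
    pts = locations_cm[window]              # criterion (3): +/-5 ms around peak
    drift = np.linalg.norm(pts - pts.mean(axis=0), axis=1).max()
    return drift <= 0.5                     # stable within 0.5 cm
```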


Table 2
The dipole locations and moments estimated in the two conditions

                      Right (n=9)                   Left (n=6)
                      A             M & A           A             M & A
X (mm)                51.2 ± 5.0    51.2 ± 5.7      52.3 ± 5.0    51.9 ± 4.3
Y (mm)                19.8 ± 6.0    20.7 ± 6.3      12.2 ± 3.2    11.8 ± 3.9
Z (mm)                57.5 ± 7.2    57.5 ± 6.6      60.3 ± 6.2    60.1 ± 6.4
Dipole moment (nAm)   60.5 ± 17.2   58.6 ± 15.9     58.2 ± 17.4   56.2 ± 13.9

X is positive to right, Y to anterior, and Z to superior.
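The paired t-tests described in the Methods, applied to per-subject values such as those summarized in Tables 1 and 2, could look like the following sketch; the latencies below are invented placeholders, not the study's data.

```python
# Hedged sketch of the paired comparison; numbers are placeholders only.
from scipy import stats

latency_a  = [86, 79, 95, 88, 84, 91, 80, 97, 85, 78]  # condition A, 10 subjects
latency_ma = [88, 81, 94, 90, 87, 92, 83, 99, 86, 80]  # condition M & A

t, p = stats.ttest_rel(latency_a, latency_ma)           # paired t-test
print(f"t = {t:.2f}, p = {p:.3f}")                      # p < 0.05 -> significant
```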

M100 was clearly recorded in both conditions in all 10 subjects in the right hemisphere and in 9 subjects in the left hemisphere. There were no significant differences in the maximum amplitude or peak latency between the two conditions (Fig. 2 and Table 1). Next, we estimated the dipole location and moment in nine subjects in the right hemisphere and six in the left hemisphere, and compared them between the two conditions. The dipoles estimated in both conditions were located in HG, the auditory cortex (Fig. 3). The dipole location and moment showed no significant differences between the two conditions (Table 2).

4. Discussion

The finding that the activity of the auditory cortex did not differ between the auditory stimulus alone and the pairing of auditory and visual stimuli relating to the vowel sound, at least within 100 ms after stimulation, indicates that this activity was not influenced by simultaneously presented visual motion. In an fMRI study, Laurienti et al. [5] reported no significant differences in the auditory cortex between a combined visual-auditory stimulus and a pure auditory stimulus. Poremba et al. [6] reported that the region of the rhesus monkey auditory cortex corresponding to Heschl's gyrus in humans was activated only by auditory stimuli, not by visual stimuli. The results of this study are consistent with these previous studies. We presume that the effects reported previously [2] take place at a later processing stage outside the primary auditory cortex, such as the superior temporal sulcus (STS) or superior temporal gyrus (STG).

References

[1] H. McGurk, J. MacDonald, Hearing lips and seeing voices, Nature 264 (1976) 746–748.
[2] T.M. Wright, et al., Polysensory interactions along lateral temporal regions evoked by audiovisual speech, Cereb. Cortex 13 (2003) 1034–1043.
[3] K. Miki, et al., Magnetoencephalographic study of occipitotemporal activity elicited by viewing mouth movements, Clin. Neurophysiol. 115 (2004) 1559–1574.
[4] K. Miki, S. Watanabe, R. Kakigi, Interaction between auditory and visual stimulus relating to the vowel sounds in the auditory cortex in humans: a magnetoencephalographic study, Neurosci. Lett. 357 (2004) 199–202.
[5] P.J. Laurienti, et al., Deactivation of sensory-specific cortex by cross-modal stimuli, J. Cogn. Neurosci. 14 (2002) 420–429.
[6] A. Poremba, et al., Functional mapping of the primate auditory system, Science 299 (2003) 568–572.