Cross-modal Interactions in Attention and Perception; Analysis of Underlying Neural Mechanisms




IOP 2016

Symposium E1
Cross-modal Interactions in Attention and Perception; Analysis of Underlying Neural Mechanisms
Organizer: Steven Hillyard (United States)

It is well established that the brain’s sensory systems are multimodal—for example, a sound can facilitate the perception of a nearby visual event, while a synchronized visual stimulus can aid perception of a concurrent auditory event. This symposium will examine the neural mechanisms that mediate such cross-modal interactions and associated shifts of attention.

383
Neural oscillatory deficits in schizophrenia predict behavioral and neurocognitive impairments
Antigona Martinez
Nathan Kline Institute for Psychiatric Research, Orangeburg, United States

The onset of visual stimuli is typically accompanied by an event-related desynchronization (ERD) of ongoing alpha (7–14 Hz) activity in the visual cortex. In a recent study, we investigated the modulation of the alpha ERD by feature-selective attention in healthy control subjects and in patients with schizophrenia (Sz). The magnitude of the alpha ERD was attention-independent, in both Sz patients and control subjects, during the initial interval following stimulus onset. The duration of the ERD, however, was significantly modulated by feature attention, such that peak ERD levels for attended stimuli persisted as much as 300 ms longer than for irrelevant (unattended) stimuli. We hypothesize that the prolonged ERD reflects additional time for stimulus evaluation and response selection processes following the attended stimulus. In Sz patients, we found significant impairment in the attention-related modulation of alpha activity. Moreover, these deficits correlated highly not only with patients’ impaired performance in the feature-attention task itself, but also with their performance on attention/vigilance and visual learning tests frequently used to assess higher-level cognitive dysfunction in Sz.

doi:10.1016/j.ijpsycho.2016.07.032

384
Salient auditory stimuli activate visual cortex involuntarily and improve visual perception
Steven Hillyard
UC San Diego, CA, United States

Salient peripheral sounds activate the visual cortex, as evidenced by a lateralized positive slow potential (the auditory-evoked contralateral occipital positivity, ACOP), which is larger in amplitude contralateral to the location of the sound. This visual cortex response is “involuntary” or “automatic” in the sense that the eliciting sounds do not have to be task relevant and their location does not need to be predictive of subsequent task-relevant stimuli. Nonetheless, the amplitude of the ACOP is affected by the attentional demands of the task, and it appears to reflect the degree to which the sound captures visual attention to its location. In a task where visual target letters were preceded by spatially non-predictive sounds, the letter targets on the same side as the preceding sound (validly cued) were discriminated more accurately than targets on the opposite side from the sound (invalidly cued). This behavioral effect was accompanied by an ACOP slow potential and a lateralized blocking of the occipital alpha rhythm, the amplitudes of which predicted correct discriminations of validly cued targets but not of invalidly cued targets. These data suggest that involuntary orienting of attention to lateralized sounds facilitates the processing of visual inputs on the cued side but does not inhibit processing of visual inputs on the opposite side.

doi:10.1016/j.ijpsycho.2016.07.033

469
Crossmodal auditory stream selection via oscillatory entrainment in a virtual cocktail party
Peter Lakatos, Annamaria Barczak, Monica N. O'Connell
Nathan Kline Institute, Orangeburg, United States

Previous studies have examined the cocktail party effect – the ability to focus on one auditory stream among a cacophony of competing streams – usually by presenting two speakers and instructing the subject to attend to one of them. The results of these studies provide compelling evidence that viewing the speaker’s face significantly aids speech perception in noisy environments. The main goal of our study was to determine whether synchronizing a visual stimulus (an LED flash) with one of many rhythmic auditory streams modulates the representation of that stream in the thalamus (MGB and pulvinar) and/or A1. We expected that this would be signaled by entrainment of neuronal oscillations to the rhythm of the visually cued auditory stream and a related modulation of auditory responses to tones in that stream. We simulated a “cocktail party” by simultaneously presenting 2, 4, or 6 auditory streams that differed slightly in their rhythm and in the frequency of their constituent stimuli, which were 2-octave band-passed noise bursts centered on frequencies that differed by 0.5 octave across streams. This resulted in a significant amount of temporal and spectral overlap across streams, similar to a “real” cocktail party scenario. Following a 2-minute presentation of the auditory streams (first third of trials), a flashing LED was synchronized to the presentation rate of one of the streams for 2 minutes (second third) and then turned off for the last 2 minutes (final third) of the trial block. To test whether ongoing and auditory-stream-related thalamocortical neuronal ensemble activity was modulated by the synchronized LED flashes, we recorded the laminar neuroelectric activity of A1 simultaneously with thalamic (MGB or pulvinar) neuronal ensemble activity using linear-array multielectrodes in awake macaques.

For the LED-synced auditory stream we found significant entrainment in most A1 and thalamic recordings: delta-band inter-trial coherence (ITC) was larger for both the second (LED on) and even the final (LED turned off) third of trials compared with the first (LED not yet turned on), and multiunit activity (MUA) response amplitudes were modulated. This indicates that the representation of the auditory stream was changed by the synchronized visual stimuli in a way that outlasted the crossmodal effects. We interpret our results as an indication that, similar to a cocktail party, the subjects were able to select one stream from

doi:10.1016/j.ijpsycho.2016.07.034
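The alpha event-related desynchronization (ERD) that the Martinez abstract quantifies is conventionally computed as a percent change in band-limited power relative to a pre-stimulus baseline. The following is a minimal NumPy/SciPy sketch of that conventional computation, not the study's actual pipeline; the function name, the baseline window, and the assumption that epochs arrive as a trials-by-samples array are all illustrative.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_erd(epochs, sfreq, baseline=(0.0, 0.5), band=(7.0, 14.0)):
    """Percent change in alpha power relative to a baseline window.

    epochs   : (n_trials, n_times) array of EEG epochs, baseline first
    sfreq    : sampling rate in Hz
    baseline : (t0, t1) in seconds within each epoch
    band     : alpha band edges in Hz (7-14 Hz per the abstract)

    Returns a (n_times,) array; negative values indicate ERD.
    """
    nyq = sfreq / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filt = filtfilt(b, a, epochs, axis=1)           # zero-phase band-pass
    power = np.abs(hilbert(filt, axis=1)) ** 2      # instantaneous alpha power
    avg = power.mean(axis=0)                        # average across trials
    i0, i1 = int(baseline[0] * sfreq), int(baseline[1] * sfreq)
    ref = avg[i0:i1].mean()                         # baseline reference power
    return 100.0 * (avg - ref) / ref                # % change vs. baseline
```

A stimulus that suppresses alpha amplitude after the baseline window yields strongly negative values in the post-stimulus interval, which is the ERD signature the abstract describes.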
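The delta-band inter-trial coherence (ITC) that the cocktail-party abstract uses to index entrainment measures how consistently oscillatory phase at a given frequency repeats across trials. A minimal NumPy sketch of a standard single-frequency ITC is below; it is a generic illustration, not the study's analysis code, and the function name, the example frequency, and the assumed trials-by-samples input layout are all hypothetical.

```python
import numpy as np

def delta_itc(epochs, sfreq, freq=1.6):
    """Inter-trial coherence at a single frequency of interest.

    epochs : (n_trials, n_times) array of single-trial signals
    sfreq  : sampling rate in Hz
    freq   : frequency of interest in Hz (e.g., a stream's delta rhythm;
             the 1.6 Hz default is purely illustrative)

    Returns ITC in [0, 1]: 1 means identical phase on every trial,
    values near 0 mean phases are random across trials.
    """
    n_trials, n_times = epochs.shape
    t = np.arange(n_times) / sfreq
    # Single-frequency Fourier coefficient: project each trial onto a
    # complex sinusoid at the target frequency to estimate its phase.
    basis = np.exp(-2j * np.pi * freq * t)
    coefs = epochs @ basis               # (n_trials,) complex coefficients
    phases = coefs / np.abs(coefs)       # unit-length phase vectors
    return np.abs(phases.mean())         # resultant vector length = ITC
```

Trials phase-locked to a rhythmic stream give ITC near 1, while unlocked trials give ITC near 1/sqrt(n_trials); a rise in delta ITC during and after the LED-synced period is the pattern the abstract reports.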