Current Biology, Vol. 13, 554–562, April 1, 2003, ©2003 Elsevier Science Ltd. All rights reserved. DOI 10.1016/S0960-9822(03)00168-4
Eye Position Affects Activity in Primary Auditory Cortex of Primates

Uri Werner-Reiss,1 Kristin A. Kelly,1 Amanda S. Trause, Abigail M. Underhill, and Jennifer M. Groh*
Center for Cognitive Neuroscience
Department of Psychological and Brain Sciences
Dartmouth College
Hanover, New Hampshire 03755
Summary

Background: Neurons in primary auditory cortex are known to be sensitive to the locations of sounds in space, but the reference frame for this spatial sensitivity has not been investigated. Conventional wisdom holds that the auditory and visual pathways employ different reference frames, with the auditory pathway using a head-centered reference frame and the visual pathway using an eye-centered reference frame. Reconciling these discrepant reference frames is therefore a critical component of multisensory integration.

Results: We tested the reference frame of neurons in the auditory cortex of primates trained to fixate visual stimuli at different orbital positions. We found that eye position altered the activity of about one third of the neurons in this region (35 of 113, or 31%). Eye position affected not only the responses to sounds (26 of 113, or 23%), but also the spontaneous activity (14 of 113, or 12%). Such effects were also evident when monkeys moved their eyes freely in the dark. Eye position and sound location interacted to produce a representation of auditory space that was neither head- nor eye-centered in reference frame.

Conclusions: Taken together with emerging results in both visual and other auditory areas, these findings suggest that neurons whose responses reflect complex interactions between stimulus position and eye position set the stage for the eventual convergence of auditory and visual information.

Introduction

Historically, the brain's sensory pathways have been viewed as separate and distinct from one another. Communication between different sensory modalities is thought to be a higher-level function reserved for association or even premotor areas.
Recent work in humans and other animals has begun to challenge this view, showing that, under the right conditions, visual and tactile stimuli can activate areas of the brain that are normally considered auditory [1–8], and auditory and tactile stimuli can activate areas of the brain that are normally considered visual [9–15]. Such cross-modal interactions have usually been attributed to plasticity following damage to the primary sensory system involved [1, 9, 10, 12, 13], have been limited to linguistic stimuli (in humans) [4], or both [5–7, 11].

*Correspondence: [email protected]
1These authors contributed equally to this work.

Signs of intersensory convergence
for nonlinguistic visual stimuli have not been demonstrated in primary auditory cortical areas of intact animals or humans [16].

In the case of vision and hearing, intersensory communication requires not only the convergence of visual and auditory signals themselves, but also information concerning the position of the visual sense organ, the eye, with respect to the auditory sense organ, the ear. Without knowledge of eye position, the spatial aspects of visual and auditory information cannot be properly equated with one another. Eye position information can in principle be used to accomplish a coordinate transformation to bring head- (ear-) centered auditory information into an eye-centered frame of reference, and this transformation facilitates the interaction of auditory signals with retinotopically ordered visual information (e.g., [17, 18]). We have previously demonstrated that eye position affects auditory responses in primate inferior colliculus (IC; [19]), a subcortical structure with known ties to multisensory processing (for discussion, see [19]). However, the eye position effect in IC did not serve to create an eye-centered reference frame, but instead produced an ambiguous representation of space that was neither head- nor eye-centered, as if it might lie at the midpoint of a gradual coordinate transformation.

In the current study, we investigated whether the position of the eyes affects the acoustic responses of single neurons in the primary auditory cortex of two rhesus monkeys (Macaca mulatta). Auditory cortex has been implicated in the processing of sound location in both primates and other species (for review, see [20]). Should eye position affect auditory cortical activity, it would suggest that auditory cortex plays a role in preparing for the eventual convergence between visual and auditory spatial information.
We were particularly interested in testing the prediction that the frame of reference should show evidence of a progression from the ambiguous code found in the IC toward an eye-centered coordinate system that would in theory be well suited for facilitating subsequent interactions with visual information. Preliminary reports of these findings have appeared previously [21–23], as has a similar preliminary report from an independent research group [24].
Results

Eye Position and Reference Frame
Eye position exhibited a statistically significant effect on the activity of about one third of the neurons in primary auditory cortex (areas A1 and R). This effect was present during the responses to sounds in 23% of the population (26 of 113 neurons, two-way analysis of variance with eye position and sound location as the factors, main effect of eye position or interaction term, p < 0.05). Eye position also affected the spontaneous activity of some neurons while the monkeys fixated visual stimuli but before any sound had been presented (14 of 113 neurons, or 12%, one-way analysis of variance, p < 0.05). The total number of neurons with significant effects dur-
Table 1. Relationship between Eye Position and Sound Location Sensitivity

                              Effect of Eye Position
Effect of                 Significant        Not Significant     Totals
Sound Location            N      %           N      %            N      %
Significant               24     21.2%       41     36.3%        65     57.5%
Not Significant           11     9.7%        37     32.7%        48     42.5%
Totals                    35     31.0%       78     69.0%        113    100.0%

A two-way ANOVA identified neurons with statistically significant effects of eye position and/or sound location (p < 0.05, main effect or interaction; eye position totals also include neurons sensitive to eye position during spontaneous activity in the absence of a sound stimulus).
ing either or both periods was 35 out of 113, or 31% (Table 1). Active fixation of a visual target was not necessary to reveal an effect of eye position: of 49 cells whose activity was also recorded while the animals were free to move their eyes as they chose, 12 (24%) showed a significant effect of eye position in either the spontaneous activity or their responses to sounds (one-way analysis of variance, horizontal eye position binned in 5° increments, p < 0.05). In agreement with previous studies [25, 26], 58% of the neurons were sensitive to the locations of sounds (main effect of sound location or interaction term with eye position, 65 of 113 neurons, Table 1). Although sound location-sensitive neurons were somewhat more likely than sound location-insensitive neurons to show sensitivity to eye position (Table 1), this correlation did not reach statistical significance (χ², p = 0.11). The proportions of cells showing statistically significant effects of eye position were greater than would be predicted by chance alone for each of the subpopulations listed above (one-tailed binomial test, p < 0.01; for details, see the Supplemental Data, especially Table S1, lines M, T, U, and Y, available with this article online). The estimated size of the proportion of neurons showing an eye position influence depended in part on the amount of data collected. In the 27 neurons with the greatest number of stimulus repetitions (8–78 per condition; mean ± standard deviation: 24.0 ± 9.6), 15 (56%) showed a significant effect of eye position, and 19 (70%) showed a significant effect of sound location.
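As an illustration of the statistics reported above, the sketch below runs a balanced two-way ANOVA (eye position × sound location) on simulated spike counts, then applies an uncorrected chi-square test to the Table 1 counts. The simulated data, grid sizes, and the helper name `two_way_anova` are our own stand-ins, not the study's recordings or analysis code.

```python
import numpy as np
from scipy import stats

def two_way_anova(counts):
    """Balanced two-way ANOVA on spike counts.

    counts: array of shape (n_eye, n_loc, n_rep) -- trials for each
    eye position x sound location condition.
    Returns p-values for the eye main effect, the sound location main
    effect, and the interaction.
    """
    a, b, r = counts.shape
    grand = counts.mean()
    m_eye = counts.mean(axis=(1, 2))          # eye-position marginal means
    m_loc = counts.mean(axis=(0, 2))          # sound-location marginal means
    m_cell = counts.mean(axis=2)              # per-condition means
    ss_eye = b * r * ((m_eye - grand) ** 2).sum()
    ss_loc = a * r * ((m_loc - grand) ** 2).sum()
    ss_int = r * ((m_cell - m_eye[:, None] - m_loc[None, :] + grand) ** 2).sum()
    ss_err = ((counts - m_cell[:, :, None]) ** 2).sum()
    df_eye, df_loc = a - 1, b - 1
    df_int, df_err = df_eye * df_loc, a * b * (r - 1)
    ms_err = ss_err / df_err
    f_eye = (ss_eye / df_eye) / ms_err
    f_loc = (ss_loc / df_loc) / ms_err
    f_int = (ss_int / df_int) / ms_err
    return (stats.f.sf(f_eye, df_eye, df_err),
            stats.f.sf(f_loc, df_loc, df_err),
            stats.f.sf(f_int, df_int, df_err))

# Simulated neuron: 3 eye positions x 9 sound locations x 16 repetitions,
# with a genuine eye-position effect added on top of noisy baseline firing.
rng = np.random.default_rng(0)
base = rng.normal(20, 3, size=(3, 9, 16))
base += np.array([0.0, 4.0, 8.0])[:, None, None]   # eye-position modulation
p_eye, p_loc, p_int = two_way_anova(base)

# Table 1's association between eye-position and sound-location sensitivity
# (24/41 and 11/37 neurons), tested with an uncorrected chi-square as in
# the text (reported p = 0.11):
chi2, p, _, _ = stats.chi2_contingency([[24, 41], [11, 37]],
                                       correction=False)
```

Running the chi-square on the published counts reproduces the reported non-significant association (p ≈ 0.11).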
Detection of the presence of an eye position effect did not depend critically on the type of test employed or the particulars of its implementation: Kruskal-Wallis tests; Gaussian, sigmoidal, and linear curve fitting (of neural responses as a function of eye position); and Monte Carlo simulations of one- and two-way ANOVAs (of the effect of eye position and/or sound location on neural activity) all confirmed the existence of an eye position influence, although the sensitivity of each method varied. Details of these additional tests are provided in the Supplemental Data.

The degree of influence by eye position appeared to range along a continuum, and the nature of that influence was complex. Figure 1 shows a series of example neurons that were representative of our entire data set (Figures 1A and 1D–1L) and two simulations (Figures 1B and 1C). In these plots, the activity of each real or simulated neuron is illustrated in color as a function of eye position (ordinate) and sound location (abscissa). If eye position had no effect, and coding for sound location therefore occurred in a head-centered frame of reference, no notable variation in responsiveness should occur along the ordinate, and a vertically striped appearance should be produced. Figure 1B shows a simulation of this predicted pattern. This simulation is derived from a subset of the actual response patterns of the neuron in Figure 1A, incorporates the observed variability in neural responses, and thereby gives a sense of the appearance that would be expected from chance fluctuations in responsiveness alone (see the Experimental Procedures: Data Analysis for details). A head-centered (vertically striped) pattern similar to this simulation was occasionally observed in real neurons (e.g., Figure 1D), but most showed at least qualitative signs of sensitivity to eye position as well as sound location. The effects of eye position achieved statistical significance (ANOVA: main effect, interaction, or baseline) for the neurons shown in Figures 1F–1L.

The effect of eye position did not resemble an eye-centered frame of reference, in which spatial response functions should shift in the same direction and to the same degree as the eyes. This is evident from comparing the activity patterns of the real neurons with Figure 1C, a simulation showing the diagonally striped appearance that should occur if the effect of eye position served to create an eye-centered frame of reference. The pattern of diagonal stripes indicative of an eye-centered frame of reference was not observed in the real neurons. Instead, the responses of individual neurons illustrated a complex and unpredictable interplay between sound location and eye position. The failure of spatial tuning to shift in space when the eyes move is illustrated in a different format in Figure 2. The top graphs in each panel of Figure 2 show the activity as a function of sound location with respect to the head for the same neurons shown in Figure 1.
For clarity and to allow error bars to be displayed, only the responses for three fixation positions (±10° or ±12°, and 0°) are shown. The bottom graphs in each panel display the same data as the top graphs, but the data are realigned as a function of sound location defined with respect to the eyes. If the neurons employed an eye-centered frame of reference, then a given neuron's response pattern at the center eye position would have the same general shape as the responses at the eccentric eye positions, but these response patterns would appear to be shifted versions of each other when plotted in head-centered coordinates. Replotting the curves with sound location recoded in an eye-centered reference frame should bring the curves into alignment (bottom graphs). Clearly, it does not.

We quantified the alignment of the curves by measuring the average vertical separation between these curves when plotted in head- versus eye-centered reference frames (see the Experimental Procedures). These values are plotted for the population in Figure 3A. A neuron using a true head-centered frame of reference would have a lower response offset in a head-centered reference frame, and a neuron using a true eye-centered frame of reference would have a lower response offset in an eye-centered reference frame. Points lying along the reference line of slope one indicate responses that are aligned equally well (or poorly) in eye- and head-centered reference frames. The fact that the response offsets are only somewhat biased toward a head-centered reference frame illustrates that neither reference frame is substantially better than the other at capturing the pattern of activity in auditory cortex. Note that this analysis concentrated specifically on assessing the reference frame rather than judging the statistical significance of eye position effects. Neurons could and did have responses that aligned better in a head- versus eye-centered reference frame; however, this did not preclude the occurrence of statistically significant effects of eye position. Differences between the statistical and reference frame analyses could also occur because only a subset of the fixation positions was used for the reference frame analysis (see the Experimental Procedures).

Finally, no progression in reference frame is evident as signals ascend the auditory pathway from inferior colliculus to auditory cortex. Figure 3B shows a histogram of the ratio of the head-centered offset to the eye-centered offset for the current auditory cortical data and our previously published results in inferior colliculus [19]. A ratio value near one would lie near the line of slope one in the graph in Figure 3A. Both distributions cluster around the same intermediate zone, a no man's land between head- and eye-centered coordinate frames.

Figure 1. Mean Responses of Auditory Cortical Neurons as a Function of Sound Location and Eye Position, Compared to Simulations of Head- and Eye-Centered Reference Frames
(A, D–L) Ten examples of neurons are shown. (B and C) The simulations illustrate hypothetical neurons employing a head-centered frame of reference in which eye position would have no effect (B) and an eye-centered frame of reference in which the location of the auditory receptive field would shift with changes in eye position (C). The simulations are based on the observed responses of the neuron shown in (A) at the center fixation position (indicated by the red triangle). Sampling of eye position and sound location was taken every 5° for the neurons shown in (D), (E), and (G) and every 6° for the remaining neurons (grid lines). Interpolation was used to fill in the color between the sampled points.

Comparison of Degree of Modulation by Sound Location and Eye Position
Sound location was a somewhat more important factor in determining neural activity than eye position. As mentioned above, a greater number of neurons showed a statistically significant effect of sound location than of eye position, and Figures 3A and 3B show a modest bias toward a head-centered frame of reference. However, the more surprising aspect of our findings is just how similar the effects of sound location and eye position are in size. Figure 4 employs a commonly used metric of effect size, a modulation index, to illustrate this point. We computed modulation indices for sound location (Ms) and eye position (Me) as follows:

Ms or Me = (Rmax − Rmin) / Rmax × 100    (1)
For both indices, the maximum response (Rmax) was the mean response evoked at the best combination of eye position and sound location (Figure 4A). For the sound location index, the minimum response (Rmin) was the mean response at the worst sound location for the same eye position at which the largest mean response was observed. For the eye position modulation index, the minimum response was the mean response at the worst eye position but the same sound location at which the maximum response was observed.

Figure 2. Responses of Auditory Cortical Neurons When Sound Location Is Defined with Respect to the Head Versus When Sound Location Is Defined with Respect to the Eyes
The top graphs in each panel show responses when sound location is defined with respect to the head, and the bottom graphs show responses when sound location is defined with respect to the eyes. The responses shown are moving averages ± standard error with a 10° or 12° window sliding in 5° or 6° increments, depending on the resolution of the sampling for the neuron in question. The example neurons are the same as those shown in Figure 1, and the three fixation positions included here are marked in Figure 1 by triangles of a corresponding color. ([B] and [C] are deliberately omitted, as the corresponding panels in Figure 1 display simulated data.)
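The modulation indices of Equation 1 can be computed directly from a grid of mean responses (eye positions × sound locations). This is an illustrative sketch with made-up numbers; the function name and example grid are our own.

```python
import numpy as np

def modulation_indices(mean_resp):
    """Compute the sound-location (Ms) and eye-position (Me) modulation
    indices of Equation 1 from a grid of mean responses.

    mean_resp: 2D array, rows = eye positions, columns = sound locations.
    Rmax is the overall best condition; Rmin for each index is taken
    along the corresponding axis through that best condition.
    """
    i_eye, i_loc = np.unravel_index(np.argmax(mean_resp), mean_resp.shape)
    r_max = mean_resp[i_eye, i_loc]
    # Ms: worst sound location at the best eye position
    r_min_loc = mean_resp[i_eye, :].min()
    # Me: worst eye position at the best sound location
    r_min_eye = mean_resp[:, i_loc].min()
    ms = (r_max - r_min_loc) / r_max * 100
    me = (r_max - r_min_eye) / r_max * 100
    return ms, me

# Hypothetical mean-response grid (spikes/s), 3 eye positions x 3 locations:
grid = np.array([[10., 20., 30.],
                 [12., 25., 50.],
                 [ 8., 15., 40.]])
ms, me = modulation_indices(grid)
# Ms = (50 - 12) / 50 * 100 = 76; Me = (50 - 30) / 50 * 100 = 40
```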
Changes in activity elicited by shifting the position of either the eyes or the sound ranged in size from about 20% to about 80% of the maximum responses evoked, as shown in Figure 4B. The mean modulation by sound location was 46%, whereas the mean modulation by eye position was 40%. Employing a criterion of >50% modulation to identify sensitivity to either factor, as has been used for sound location sensitivity in previous studies in auditory areas (e.g., [27]), 26% of neurons would be classified as eye position sensitive, and 40% of neurons would be classified as sound location sensitive.

Ear Movements
As we have previously noted, pinna movements do not provide a ready explanation for the effects of eye position [19]. We monitored the horizontal rotation of the contralateral ear during the recording of 44 of the neurons included in this study. Unlike the pinna movements of cats [28], the measured ear movements were minimal in size and were not meaningfully related to eye position (see the Supplemental Data). Overall, the average range of mean ear positions across the different fixation positions was 1° (min, 0.25°; max, 3°).

Figure 3. Neither a Head- nor Eye-Centered Frame of Reference Best Captures the Pattern of Activity in Auditory Cortex
(A) Average vertical separation between the responses for different eye positions when plotted in a head-centered reference frame (head-centered offset, ordinate) versus the average vertical distance between these responses when plotted in an eye-centered reference frame (eye-centered offset, abscissa), expressed as a percentage of the mean response of each neuron; see the Experimental Procedures for details. This graph presents data from 111 neurons with sampling that included an ipsilateral (−10° or −12°), center (0°), and contralateral (10° or 12°) fixation position. Neurons in which the effect of eye position was statistically significant in either the baseline or the response period (either main effect or interaction with sound location) are indicated by open squares; some of these neurons also exhibited main effects of sound location. Those for which only sound location was significant are indicated by open circles. The labels D and J indicate the individual example neurons from Figures 1D and 2D and 1J and 2J, respectively.
(B) Comparison between the ratio of the head- and eye-centered response offsets in auditory cortex (white bars) and the IC (dark bars, 71 neurons, previously published in Figure 2 of [19]). These distributions did not differ (Wilcoxon rank-sum test, p = 0.544).

Discussion
Given the different reference frames in which visual and auditory spatial information are initially detected, information about eye position is critical for merging, comparing, or reconciling visual and auditory sensations. Although overt visual activation in auditory cortex has as yet only been demonstrated under limited circumstances involving either hearing-impaired humans or other animals with disrupted sensory pathways [2, 3], linguistic stimuli [4], or both [5–7], the presence of eye position signals in auditory cortex suggests that communication between the visual and auditory pathways is more pervasive than previously thought. Furthermore, a direct anatomical projection from auditory cortex to primary visual cortex has recently been identified [29]. The existence of information about the orientation of the visual sense organ in the primary auditory cortex of normal primates, together with the anatomical connection between auditory and visual cortex, suggests that signals relating to the convergence of vision and hearing are a normal aspect of basic auditory processing, rather than a special feature of multisensory areas, damaged sensory systems, or evolutionary adaptations relating to language. The effects of eye position were robust to the behavioral context and occurred both with and without active fixation of visual stimuli. Furthermore, a task requiring the animal to localize sounds with the eyes was not required, although the possibility that the nature of the eye position signal would be affected by such a task is intriguing. Thus, it seems that eye position modulation is a normal occurrence in the auditory pathway, with concomitant implications not only for sound localization, but potentially for other types of auditory processing also. Indeed, these findings suggest a neural mechanism for a synesthesia-like clinical disorder known as gaze-evoked tinnitus, in which certain eye positions elicit percepts of nonexistent sounds [30]. 
Figure 4. Degree of Modulation of Neural Activity Due to Eye Position and Sound Location across the Population of Auditory Cortical Neurons
(A) The eye position and sound location modulation indices were computed by finding the overall maximum mean response and comparing it to the minimum mean response at either the same sound location or the same eye position at which the maximum response was observed, as shown schematically for a hypothetical neuron.
(B) The eye position and sound location modulation indices for the population of 78 neurons with equivalent sampling of sound location and eye position. The letters indicate the modulation index values for the individual neurons shown in Figures 1 and 2.

The naturally occurring eye position modulation may be erroneously interpreted as an acoustic event in this disorder. How the eye position signal reaches the auditory pathway is uncertain. Eye position signals are ubiquitous throughout the brain (for a review, see [17]) and could potentially reach the auditory pathway either through descending inputs (such as from the frontal eye fields [31]) or through some as yet undiscovered anatomical projection from oculomotor areas to the auditory pathway. Attempts to distinguish between top-down and bottom-up sources of sensory signals often seek evidence from the time course of these effects. When one of several components of a neural response has a time course that is delayed with respect to the others, the delay has been interpreted as evidence that the effect in question is mediated by a top-down connection. However, this approach may not be useful for assessing the source of eye position signals, because eye position is not an externally delivered sensory stimulus. The eyes always have a position, which the animal controls (albeit at the behest of the experimenter), and, in our task, the eyes were already fixating the desired visual target well in advance of the acoustic stimulus. Thus, it would be difficult to identify a fine-grained time course of the eye position signal. However, on a coarser scale, the existence of the eye position signal in the spontaneous activity of auditory cortical neurons, but not in inferior
colliculus neurons [19], does make it more likely that the eye position signal descends the auditory pathway from auditory cortex to inferior colliculus rather than the other way around. If the eye position signal is indeed carried by descending projections, it is theoretically possible for this type of influence to be present at much earlier points in the pathway, potentially as early as the cochlea itself. By modulating the level of responses to acoustic stimuli or the shape of spatial sensitivity, but not causing a systematic shift in receptive field location, the eye position influence renders the activity of individual neurons
ambiguous in reference frame. The hypothesis that this type of representation is part of a transition from head- to eye-centered coordinates is not supported by our findings, as little has changed as auditory signals ascend from inferior colliculus to auditory cortex. Clearly, then, the existing conceptual framework of explicit coordinate transformations cannot account for the signals present at this point in the auditory pathway.

Having abandoned the notions of both head- and eye-centered reference frames (at least for this level of the auditory pathway), it is worth reconsidering the eventual requirements for integration with visual information. Plainly, the presence of eye position information in the auditory pathway is highly relevant to and constrains how auditory signals may ultimately be combined with visual signals. However, the discrepancy between auditory and visual spatial information may not be as great as is commonly assumed. Recent findings from the visual pathway indicate that visual neurons display effects of eye position that are strikingly similar to our own results. Effects of eye position are seen as early as primary visual cortex [32, 33] as well as in other visual, sensorimotor, and parietal areas [34–42]. These experiments cast doubt on the degree to which the visual pathway employs a solely retinotopic frame of reference. Rather than employing explicit head- or eye-centered coding, auditory and visual neurons appear to be selective for combinations of stimulus position and eye position, and their responses are best thought of as a complex function of both parameters. Indeed, computational advantages to this type of code have been suggested [43, 44]. Appropriately combining the input from such visual and auditory neurons might create individual bimodal neurons with the same sensitivity to stimulus and eye position, regardless of whether the stimulus modality is visual or auditory.
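One way to picture neurons that are selective for combinations of stimulus position and eye position is a gain-field-style model, in which eye position scales the amplitude of a head-centered tuning curve without shifting its peak. The sketch below is purely illustrative; all parameters are invented and the model is not fitted to the data reported here.

```python
import numpy as np

def neuron_response(sound_loc, eye_pos,
                    pref_loc=10.0, width=20.0,
                    gain_slope=0.02, baseline=5.0):
    """Toy neuron: a head-centered Gaussian tuning curve for sound
    location, multiplicatively modulated ('gain field') by eye position.
    All parameters are hypothetical, not derived from the recordings.
    """
    tuning = np.exp(-0.5 * ((sound_loc - pref_loc) / width) ** 2)
    gain = 1.0 + gain_slope * eye_pos      # planar eye-position gain
    return baseline + 20.0 * tuning * gain

# The tuning peak stays at the same head-centered location for every
# eye position (no receptive-field shift), but its amplitude changes:
locs = np.linspace(-24, 24, 9)
peaks = {eye: float(neuron_response(locs, eye).max())
         for eye in (-12.0, 0.0, 12.0)}
```

Modulation of response level without a receptive-field shift is one simple way to produce the kind of reference-frame-ambiguous activity described in the Results, though the real interactions observed were more complex than this multiplicative form.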
Such neurons might well subserve the perceptual binding of visual and auditory information, in a frame of reference that is native to neither sensory pathway but common to both.

Experimental Procedures

General Procedures
Two adult female rhesus monkeys served as subjects for these experiments. The Institutional Animal Care and Use Committee at Dartmouth approved all procedures. Animals underwent a surgery to implant a head post for restraining the head and a scleral eye coil for monitoring eye position at a 500 Hz sample rate [45, 46]. All surgical procedures were performed under general anesthesia, and aseptic techniques were used. The animals were then trained by using positive reinforcement to maintain fixation on a visual stimulus at a randomly chosen location along the horizontal meridian while four sounds were presented successively, also at randomly chosen locations. Each trial began with a 500 ms period of fixation, followed by the four sounds (500 ms duration, 500 ms interstimulus interval). Animals received liquid reinforcement for maintaining fixation throughout the series of stimulus presentations. Speaker locations and visual stimuli for our main data set of 78 neurons spanned −24° to 24° of azimuth in 6° steps in one experimental rig and −15° to 20° in 5° steps in the other (negative numbers indicate ipsilateral locations). An additional 33 neurons were tested with eye positions of −12°, 0°, and 12° and sound locations of ±90°, ±44°, and −30° to 30° in 6° steps, and two neurons were tested with the latter set of sound locations and eye positions of −12° and 12° (total: 113 neurons). All cells except for the two not tested at the center fixation were included in Figure 3; only the set of 78 neurons is shown in Figure 4. The activity of
49 of the 78 neurons was also recorded while the animals were free to move their eyes in the dark. Multiple stimulus repetitions were performed for each combination of sound location and eye position for each recorded neuron (mean number of repetitions per condition: 16.2; standard deviation: 8.8); the total number of stimulus repetitions per recorded neuron ranged from 215 to 3196, with a mean of 1064.9. The auditory stimulus was a band-limited white noise (500 Hz–18 kHz, 51 ± 2 dB SPL). The experiments were conducted in a single-walled sound attenuation chamber (IAC) lined with sound absorbent foam (3" Sonex One) to reduce echoes. Experiments were conducted in complete darkness. Ear position was also monitored during the recording of 44 of the set of 78 neurons by using a calibrated coil taped to the back of the pinna on the ear contralateral to the recording site. Once the animals were trained, a cylinder was implanted by using stereotaxic techniques to allow access to the left primary auditory cortex in one monkey (monkey G) and the right primary auditory cortex in the other (monkey M). In monkey G, a straight vertical approach was employed [47]. In monkey M, electrodes approached the auditory cortex at an angle of 33° lateral from vertical in the coronal plane. Standard recording techniques were used: electrical potentials were amplified, and action potentials were detected by using a double window discriminator (Bak Electronics). The time of occurrence of action potentials was stored for off-line analysis. The number of action potentials occurring within a 500 ms window following stimulus onset was counted. Spontaneous activity was assessed during a 500 ms window prior to the onset of the first sound in each trial. Neurons were considered auditory if their responses during the 500 ms following sound onset differed significantly from spontaneous activity (two-tailed paired t test, p < 0.05).
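The auditory-cell criterion just described (evoked vs. spontaneous spike counts, two-tailed paired t test) can be sketched as follows, with simulated Poisson counts standing in for real recordings; `is_auditory` is our own helper name, not part of the study's code.

```python
import numpy as np
from scipy import stats

def is_auditory(spont_counts, evoked_counts, alpha=0.05):
    """Classify a cell as auditory if spike counts in the 500 ms after
    sound onset differ from counts in the 500 ms before the first sound,
    paired trial by trial (a sketch of the criterion in the text).
    """
    t, p = stats.ttest_rel(evoked_counts, spont_counts)  # two-tailed
    return bool(p < alpha)

# Simulated counts per 500 ms window over 200 trials: a clearly driven cell.
rng = np.random.default_rng(1)
spont = rng.poisson(5, size=200)    # ~10 spikes/s spontaneous rate
evoked = rng.poisson(9, size=200)   # elevated sound-evoked rate
```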
The mean latency of our auditory cells was 16 ms, with a standard deviation of 9 ms, which is consistent with other reports from primary auditory cortex [48–50].

Location of Recordings
Previous studies in macaques have identified a core region of auditory cortex by using a combination of anatomical, histological, and physiological techniques (for reviews, see [51–53]). The boundaries of the core are consistent (within 1–2 mm) between individual animals [53] and are also similar across studies [54]. The posterior edge of the core region lies 3–5 mm from the posterior end of the supratemporal plane [53, 55–57]. The core is generally completely contained within the lower bank of the lateral sulcus, although it can occasionally drape over onto the lower bank of the circular sulcus below the insula [53]. The lateral border is typically 1–3 mm from the lateral edge of the lateral sulcus [53, 56, 57]. Posterior to the insula and circular sulcus, the medial border of the core region lies approximately 1–3 mm from the medial edge of the supratemporal plane [53, 56, 57]. Subfields within the core and surrounding belt regions have been identified based on their tonotopic organization, but because the axis of frequency organization runs approximately anterior to posterior, tonotopy is not particularly helpful for distinguishing the medial or lateral borders between neighboring core and belt regions. Similarly, the caudomedial belt region seems to lack a tonotopic organization altogether (for a review, see [53]), which further limits the utility of tonotopy for identifying the boundary at the posterior edge of the core region. However, several previous studies have identified differences in the breadth of frequency tuning and/or the preference for white noise over tonal stimuli between core and neighboring belt regions (e.g., [49, 58]).
Given the excellent anatomical information available concerning the geographical location of the core region within the lateral sulcus, the consistency of this anatomical information across individual animals (and even different species of macaques), and the limited utility of tonotopy in distinguishing the core from the belt, we relied primarily on physical landmarks to restrict our data set to the core region. The locations of our recording penetrations were identified by using MRI at the Dartmouth Brain Imaging Center (GE 1.5T scanner, 3D T1 weighted gradient echo pulse sequence, 5” receive-only surface coil). One or more tungsten electrodes were inserted into the brain for the scan; these electrodes were readily visible in the images and served as reference points for the reconstruction of
other recording locations. In accordance with the boundaries of the core identified in the literature cited above, we limited our recording locations to those ≥5 mm rostral of the caudal end of the supratemporal plane, ≥2 mm from the medial end of the supratemporal plane in the region caudal to the insula/circular sulcus, and ≥2 mm from the lateral edge of the supratemporal plane. In the region adjacent to the insula/circular sulcus, recording locations were restricted to those within the supratemporal plane rather than dipping down onto the lower/outer bank of the circular sulcus. All recording sites were well caudal of the RT region of the core and consisted of sites mainly in putative A1, but potentially also in R. In keeping with previous work, the relative prevalence of sensitivity to tonal stimuli was used to confirm the caudal boundary separating the core from the caudal belt region (e.g., [49, 58]).

Data Analysis
Simulations in Figures 1B and 1C
The responses of the simulated head-centered neuron for each sound location/eye position combination were taken from a normal distribution having the same mean and standard deviation as the actual activity of neuron m067 for that sound location when the eyes were straight ahead (row marked with a red triangle). The same number of trials (ranging from 17 to 43) was used for each simulated stimulus combination as for the corresponding original data, and the activity shown represents the mean of these trials. The simulation of the eye-centered neuron was conducted in the same fashion, except that sound location with respect to the eyes was assumed to govern the responses. For example, the simulated activity for eyes and sound 12° to the left was based on the measured activity of neuron m067 when both eyes and sound were located straight ahead. Predicted activity at eccentric eye-centered angles for which no data were collected was derived from the nearest measured data point.
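The resampling procedure for the simulated neurons can be sketched as follows. This is a minimal illustration with hypothetical tuning values, not the actual data of neuron m067; the fallback to the nearest measured location for eye-centered angles outside the sampled range follows the text.

```python
import numpy as np

# Hypothetical measured activity with the eyes straight ahead:
# head-centered sound location (deg) -> (mean firing rate, SD).
measured = {-24: (12.0, 4.0), -12: (18.0, 5.0), 0: (25.0, 6.0),
            12: (17.0, 5.0), 24: (11.0, 4.0)}

def simulate_response(sound_loc, eye_pos, reference, n_trials, rng):
    """Mean of n_trials draws from a normal distribution matched to the
    measured activity.  For the head-centered model, the head-centered
    sound location selects the distribution; for the eye-centered model,
    the sound location relative to the eyes does.  Angles for which no
    data exist are mapped to the nearest measured location."""
    loc = sound_loc if reference == "head" else sound_loc - eye_pos
    nearest = min(measured, key=lambda k: abs(k - loc))
    mu, sd = measured[nearest]
    return rng.normal(mu, sd, size=n_trials).mean()

rng = np.random.default_rng(0)
# Eye-centered model: eyes and sound both 12 deg left means the sound is
# straight ahead with respect to the eyes, so the (0 deg) tuning is used.
r = simulate_response(-12, -12, "eye", n_trials=30, rng=rng)
```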
Population Analysis in Figure 3
The following calculations were performed to assess the relative alignment of the responses in head- versus eye-centered reference frames. We quantified the response offset by measuring the average vertical separation between the responses when plotted in head- versus eye-centered reference frames:

\[
\text{offset} = 100 \times \frac{\sum_{i=1}^{n}\left[\,\lvert R_{\mathrm{left},i} - R_{\mathrm{center},i}\rvert + \lvert R_{\mathrm{right},i} - R_{\mathrm{center},i}\rvert\,\right]}{2\,n\,\bar{R}} \tag{2}
\]
The mean responses of the cell for each speaker location, i, and eye position (left, −12° or −10°; right, 12° or 10°; center, 0°) are denoted Rleft,i, Rright,i, and Rcenter,i. For the head-centered response offset, these values were calculated with sound locations defined with respect to the head; for the eye-centered response offset, sound locations defined with respect to the eyes were used. Only the n speaker locations that existed in both head- and eye-centered frames of reference for the three fixation positions were included in this analysis. The average separation between Rleft,i and Rcenter,i and between Rright,i and Rcenter,i was then expressed as a percentage of the mean response, R̄, of each neuron [19]. A total of 111 neurons were included in this analysis (the 2 cells not sampled at the center fixation position were omitted).

Methodological Comparison with Previous Experiments in IC
Our previous study in the IC was very similar to the experiments described here. One monkey (M) participated in both studies. Overlapping, but not identical, sets of sound locations and fixation positions were employed in the two studies. The comparison between the frame of reference in the IC and auditory cortex presented in Figure 3B involves only sound locations and eye positions that were similar between the two studies. In our IC experiments, the time windows for counting spikes were chosen individually for each cell based on that cell's peristimulus time histograms, and baseline firing rates were subtracted (for details, see [19]). Neither of these methodological differences influenced our findings: the distribution of the ratio of head-centered offset to eye-centered offset was similar for the IC neurons when a fixed response time window of 500 ms was used and baseline response rates were not subtracted (Figure 3B).
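The offset calculation can be computed directly from the mean responses. A minimal sketch follows; note that the normalization term R̄ is taken here as the mean over all three fixation conditions, which is an assumption, since the text specifies only "the mean response of each neuron."

```python
import numpy as np

def response_offset(r_left, r_center, r_right):
    """Equation (2): average vertical separation between the tuning
    curves at left/right fixation and at center fixation, expressed as
    a percentage of the neuron's mean response.  Each input is the
    array of mean responses at the n speaker locations shared across
    the three fixation positions (locations defined head- or
    eye-centered, depending on the reference frame being tested)."""
    r_left, r_center, r_right = map(np.asarray, (r_left, r_center, r_right))
    n = len(r_center)
    # Assumption: R-bar = mean response pooled over all three conditions.
    r_bar = np.concatenate([r_left, r_center, r_right]).mean()
    sep = np.abs(r_left - r_center).sum() + np.abs(r_right - r_center).sum()
    return 100.0 * sep / (2 * n * r_bar)
```

A neuron whose tuning curves superimpose in a given reference frame yields an offset near zero in that frame; comparing the head-centered to the eye-centered offset for each cell then indicates which frame better aligns its responses.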
Supplemental Data
Supplemental Data including additional statistical analyses and a discussion of the possible effects of attention on auditory cortical neurons are available at http://images.cellpress.com/supmat/supmatin.htm.

Acknowledgments
We are very grateful to Laura-Ann Petitto and Peter Tse for many helpful comments on the manuscript. We would also like to thank Yale Cohen, Hany Farid, Ryan Metzger, and O'Dhaniel Mullette-Gillman for providing useful comments on this study and Souheil Inati and Tammy Laroche of the Dartmouth Brain Imaging Center for assistance with MRI. We are grateful for financial support to K.A.K. from National Institutes of Health (NIH) NS 44666-01 and to J.M.G. from the following sources: the Alfred P. Sloan Foundation, the McKnight Endowment Fund for Neuroscience, the Whitehall Foundation, the John Merck Scholars Program, the Office of Naval Research Young Investigator Program, the EJLB Foundation, NIH NS 17778-19, and The Nelson A. Rockefeller Center at Dartmouth.

Received: November 28, 2002
Revised: February 17, 2003
Accepted: February 17, 2003
Published: April 1, 2003

References
1. Finney, E.M., Fine, I., and Dobkins, K.R. (2001). Visual stimuli activate auditory cortex in the deaf. Nat. Neurosci. 4, 1171–1173.
2. Sur, M., Garraghty, P.E., and Roe, A.W. (1988). Experimentally induced visual projections into auditory thalamus and cortex. Science 242, 1437–1441.
3. von Melchner, L., Pallas, S.L., and Sur, M. (2000). Visual behaviour mediated by retinal projections directed to the auditory pathway. Nature 404, 871–876.
4. Calvert, G.A., Bullmore, E.T., Brammer, M.J., Campbell, R., Williams, S.C., McGuire, P.K., Woodruff, P.W., Iversen, S.D., and David, A.S. (1997). Activation of auditory cortex during silent lipreading. Science 276, 593–596.
5. MacSweeney, M., Amaro, E., Calvert, G.A., Campbell, R., David, A.S., McGuire, P., Williams, S.C., Woll, B., and Brammer, M.J. (2000).
Silent speechreading in the absence of scanner noise: an event-related fMRI study. Neuroreport 11, 1729–1733.
6. Petitto, L.A., Zatorre, R.J., Gauna, K., Nikelski, E.J., Dostie, D., and Evans, A.C. (2000). Speech-like cerebral activity in profoundly deaf people processing signed languages: implications for the neural basis of human language. Proc. Natl. Acad. Sci. USA 97, 13961–13966.
7. Nishimura, H., Hashikawa, K., Doi, K., Iwaki, T., Watanabe, Y., Kusuoka, H., Nishimura, T., and Kubo, T. (1999). Sign language 'heard' in the auditory cortex. Nature 397, 116.
8. Schroeder, C.E., Lindsley, R.W., Specht, C., Marcovici, A., Smiley, J.F., and Javitt, D.C. (2001). Somatosensory input to auditory association cortex in the macaque monkey. J. Neurophysiol. 85, 1322–1327.
9. Kujala, T., Huotilainen, M., Sinkkonen, J., Ahonen, A.I., Alho, K., Hamalainen, M.S., Ilmoniemi, R.J., and Kajola, M. (1995). Visual cortex activation in blind humans during sound discrimination. Neurosci. Lett. 183, 143–146.
10. Kujala, T., Alho, K., Huotilainen, M., Ilmoniemi, R.J., Lehtokoski, A., Leinonen, A., Rinne, T., Salonen, O., Sinkkonen, J., Standertskjold-Nordenstam, C.G., et al. (1997). Electrophysiological evidence for cross-modal plasticity in humans with early- and late-onset blindness. Psychophysiology 34, 213–216.
11. Sadato, N., Pascual-Leone, A., Grafman, J., Ibanez, V., and Deiber, M.P. (1996). Activation of the primary visual cortex by Braille reading in blind subjects. Nature 380, 526–528.
12. Uhl, F., Franzen, P., Lindinger, G., Lang, W., and Deecke, L. (1991). On the functionality of the visually deprived occipital cortex in early blind persons. Neurosci. Lett. 124, 256–259.
13. Uhl, F., Franzen, P., Podreka, I., Steiner, M., and Deecke, L. (1993). Increased regional cerebral blood flow in inferior occipital cortex and cerebellum of early blind humans. Neurosci. Lett. 150, 162–164.
14. Morrell, F. (1972). Visual system's view of acoustic space. Nature 238, 44–46.
15. Fishman, M.C., and Michael, P. (1973). Integration of auditory information in the cat's visual cortex. Vision Res. 13, 1415–1419.
16. De Ribaupierre, F., Goldstein, M.H., Jr., and Yenikomshian, G. (1973). Lack of response in primary auditory neurons to visual stimulation. Brain Res. 52, 370–373.
17. Groh, J.M., and Sparks, D.L. (1992). Two models for transforming auditory signals from head-centered to eye-centered coordinates. Biol. Cybern. 67, 291–302.
18. Jay, M.F., and Sparks, D.L. (1984). Auditory receptive fields in primate superior colliculus shift with changes in eye position. Nature 309, 345–347.
19. Groh, J.M., Trause, A.S., Underhill, A.M., Clark, K.R., and Inati, S. (2001). Eye position influences auditory responses in primate inferior colliculus. Neuron 29, 509–518.
20. Middlebrooks, J.C., Xu, L., Furukawa, S., and Macpherson, E.A. (2002). Cortical neurons that localize sounds. Neuroscientist 8, 73–83.
21. Trause, A.S., Werner-Reiss, U., Underhill, A.M., and Groh, J.M. (2000). Effects of eye position on auditory signals in primate auditory cortex. Soc. Neurosci. Abstr. 26, 1997.
22. Werner-Reiss, U., Kelly, K.A., Underhill, A.M., and Groh, J.M. (2001). Eye position tuning in primate auditory cortex. Soc. Neurosci. Abstr. CD-ROM, 60.2.
23. Kelly, K.A., Werner-Reiss, U., Underhill, A.M., and Groh, J.M. (2002). Eye position affects a wide range of auditory cortical neurons in primates. Soc. Neurosci. Abstr. CD-ROM, 845.1.
24. Fu, K.G., Shah, A.S., O'Connell, N., McGinnis, T.M., and Schroeder, C.E. (2002). Integration of auditory and eye position signals in auditory cortex. Soc. Neurosci. Abstr. CD-ROM, 220.12.
25. Ahissar, M., Ahissar, E., Bergman, H., and Vaadia, E. (1992). Encoding of sound-source location and movement: activity of single neurons and interactions between adjacent neurons in the monkey auditory cortex. J. Neurophysiol. 67, 203–215.
26. Recanzone, G.H., Guard, D.C., Phan, M.L., and Su, T.K. (2000). Correlation between the activity of single auditory cortical neurons and sound-localization behavior in the macaque monkey. J. Neurophysiol. 83, 2723–2739.
27. Aitkin, L.M., Pettigrew, J.D., Calford, M.B., Phillips, S.C., and Wise, L.Z. (1985). Representation of stimulus azimuth by low-frequency neurons in inferior colliculus of the cat. J. Neurophysiol. 53, 43–59.
28. Populin, L.C., and Yin, T.C.T. (1998). Pinna movements of the cat during sound localization. J. Neurosci. 18, 4233–4243.
29. Falchier, A., Clavagnier, S., Barone, P., and Kennedy, H. (2002). Anatomical evidence of multimodal integration in primate striate cortex. J. Neurosci. 22, 5749–5759.
30. Coad, M.L., Lockwood, A., Salvi, R., and Burkard, R. (2001). Characteristics of patients with gaze-evoked tinnitus. Otol. Neurotol. 22, 650–654.
31. Romanski, L.M., Tian, B., Fritz, J., Mishkin, M., Goldman-Rakic, P.S., and Rauschecker, J.P. (1999). Dual streams of auditory afferents target multiple domains in the primate prefrontal cortex. Nat. Neurosci. 2, 1131–1136.
32. Trotter, Y., and Celebrini, S. (1999). Gaze direction controls response gain in primary visual-cortex neurons. Nature 398, 239–242.
33. Weyand, T., and Malpeli, J. (1993). Responses of neurons in primary visual cortex are modulated by eye position. J. Neurophysiol. 69, 2258–2260.
34. Guo, K., and Li, C.Y. (1997). Eye position-dependent activation of neurones in striate cortex of macaque. Neuroreport 8, 1405–1409.
35. Andersen, R.A., and Mountcastle, V.B. (1983). The influence of the angle of gaze upon the excitability of the light-sensitive neurons of the posterior parietal cortex. J. Neurosci. 3, 532–548.
36. Andersen, R.A., Essick, G.K., and Siegel, R.M. (1985). Encoding of spatial location by posterior parietal neurons. Science 230, 456–458.
37. Andersen, R.A., Bracewell, R.M., Barash, S., Gnadt, J.W., and Fogassi, L. (1990). Eye position effects on visual, memory, and saccade-related activity in areas LIP and 7a of macaque. J. Neurosci. 10, 1176–1196.
38. Duhamel, J.R., Bremmer, F., BenHamed, S., and Graf, W. (1997). Spatial invariance of visual receptive fields in parietal cortex neurons. Nature 389, 845–848.
39. Bremmer, F., Distler, C., and Hoffmann, K.P. (1997). Eye position effects in monkey cortex. II. Pursuit- and fixation-related activity in posterior parietal areas LIP and 7A. J. Neurophysiol. 77, 962–977.
40. Bremmer, F., Ilg, U.J., Thiele, A., Distler, C., and Hoffmann, K.P. (1997). Eye position effects in monkey cortex. I. Visual and pursuit-related activity in extrastriate areas MT and MST. J. Neurophysiol. 77, 944–961.
41. Bremmer, F., Graf, W., Ben Hamed, S., and Duhamel, J.R. (1999). Eye position encoding in the macaque ventral intraparietal area (VIP). Neuroreport 10, 873–878.
42. Wu, S., and Andersen, R.A. (2001). The representation of auditory space in temporo-parietal cortex. Soc. Neurosci. Abstr. 27, Program No. 166.115.
43. Pouget, A., and Sejnowski, T.J. (1997). Spatial transformations in the parietal cortex using basis functions. J. Cogn. Neurosci. 9, 222–237.
44. Van Opstal, A.J., and Hepp, K. (1995). A novel interpretation for the collicular role in saccade generation. Biol. Cybern. 73, 431–445.
45. Judge, S.J., Richmond, B.J., and Chu, F.C. (1980). Implantation of magnetic search coils for measurement of eye position: an improved method. Vision Res. 20, 535–538.
46. Robinson, D. (1963). A method of measuring eye movement using a scleral search coil in a magnetic field. IEEE Trans. Biomed. Eng. 10, 137–145.
47. Pfingst, B.E., and O'Connor, T.A. (1980). A vertical stereotaxic approach to auditory cortex in the unanesthetized monkey. J. Neurosci. Methods 2, 33–45.
48. Recanzone, G.H., Schreiner, C.E., Sutter, M.L., Beitel, R.E., and Merzenich, M.M. (1999). Functional organization of spectral receptive fields in the primary auditory cortex of the owl monkey. J. Comp. Neurol. 415, 460–481.
49. Recanzone, G.H., Guard, D.C., and Phan, M.L. (2000). Frequency and intensity response properties of single neurons in the auditory cortex of the behaving macaque monkey. J. Neurophysiol. 83, 2315–2331.
50. Cheung, S.W., Nagarajan, S.S., Bedenbaugh, P.H., Schreiner, C.E., Wang, X., and Wong, A. (2001). Auditory cortical neuron response differences under isoflurane versus pentobarbital anesthesia. Hear. Res. 156, 115–127.
51. Kaas, J.H., and Hackett, T.A. (2000). Subdivisions of auditory cortex and processing streams in primates. Proc. Natl. Acad. Sci. USA 97, 11793–11799.
52. Schreiner, C.E. (1992). Functional organization of the auditory cortex: maps and mechanisms. Curr. Opin. Neurobiol. 2, 516–521.
53. Hackett, T.A., Preuss, T.M., and Kaas, J.H. (2001). Architectonic identification of the core region in auditory cortex of macaques, chimpanzees, and humans. J. Comp. Neurol. 441, 197–222.
54. Hackett, T.A., Stepniewska, I., and Kaas, J.H. (1998). Subdivisions of auditory cortex and ipsilateral cortical connections of the parabelt auditory cortex in macaque monkeys. J. Comp. Neurol. 394, 475–495.
55. Hackett, T.A., Stepniewska, I., and Kaas, J.H. (1998). Thalamocortical connections of the parabelt auditory cortex in macaque monkeys. J. Comp. Neurol. 400, 271–286.
56. Morel, A., Garraghty, P.E., and Kaas, J.H. (1993). Tonotopic organization, architectonic fields, and connections of auditory cortex in macaque monkeys. J. Comp. Neurol. 335, 437–459.
57. Kosaki, H., Hashikawa, T., He, J., and Jones, E.G. (1997). Tonotopic organization of auditory cortical fields delineated by parvalbumin immunoreactivity in macaque monkeys. J. Comp. Neurol. 386, 304–316.
58. Rauschecker, J.P., Tian, B., Pons, T., and Mishkin, M. (1997). Serial and parallel processing in rhesus monkey auditory cortex. J. Comp. Neurol. 382, 89–103.