How the Barn Owl Computes Auditory Space


Series: Seminal Neuroscience Papers 1978–2017

Science & Society

Benedikt Grothe1,2,*

In a series of seminal behavioral and electrophysiological experiments, Knudsen and Konishi studied the mechanisms of hearing. Their 1979 article showed how the barn owl utilizes unique anatomical features to create a systematic internal representation of auditory space. This established the barn owl as a prime model for studying sensory systems.

Few studies exemplify the power and influence of neuroethology as strongly as the collective work of Mark Konishi on sound localization in barn owls. After seminal work on the development of sound production in songbirds, Konishi became fascinated by the exceptional sound localization abilities of owls, and established them as a prime animal model for addressing the question of how the brain creates an internal representation of auditory space.

Spatial hearing is a nontrivial problem because there is, per se, no spatial information at the level of the receptor surface in the inner ear (which represents sound frequency). Since the early 20th century, it has been known that our brain uses indirect information (e.g., disparities in sound level and in time of arrival between the two ears) to synthesize auditory space [1]. However, how the computations underlying spatial hearing are performed in the brain had remained largely unclear.

In an elegant series of studies, Konishi, along with Eric Knudsen, exploited the behavioral and anatomical adaptations of the barn owl for nocturnal hunting to study its spatial hearing. Barn owls spontaneously and reliably turn their head towards a new sound source. They also have nonspherical eyes that prevent saccades; as a result, one can directly assess their gaze by monitoring their head direction. Knudsen and Konishi utilized these features to measure localization accuracy in unrestrained animals. Furthermore, barn owls belong to the so-called 'asymmetrical owls': the left and the right ears point in slightly different vertical directions, an asymmetry that is reinforced by special arrangements of the preaural flap and the facial ruff [2].

In several experiments using acoustic signals with different spectra, ear plugging, and removal of the facial ruff, reported in their seminal paper published in 1979 [3], Knudsen and Konishi assessed the ability of the barn owl to localize sounds in both the vertical and the horizontal plane. The results clearly indicated the importance of interaural time and phase differences (ITDs and IPDs, respectively) for sound localization in the horizontal plane. Surprisingly, this applied across almost the entire frequency range that the barn owl can hear, which, for birds, is unusually broad (up to 9 kHz).

Another interaural cue, the interaural level difference (ILD), is utilized by the barn owl to localize sounds in the vertical plane. This is unusual and relates to the skull asymmetry. Humans (and other species with symmetrical skulls) use ILDs, like ITDs, for horizontal sound localization (although ILDs, in contrast to ITDs, are used primarily at higher frequencies, where the shadowing effect of the head is stronger [4]). By contrast, the anatomical asymmetry of the barn owl makes ILDs elevation-dependent. The resulting iso-ILD contours, rather than being vertical, are strongly tilted towards the horizontal plane (Figure 1C). Iso-ILD contours are therefore orthogonal to their iso-ITD counterparts, which are roughly vertical (Figure 1B). This unique arrangement allows binaural comparisons to localize sounds in both the horizontal and the vertical plane. Combined with unsurpassed ITD sensitivity, these specializations underlie the ability of the barn owl to hunt prey accurately, even in total darkness [5]. Moreover, the demonstration that the different binaural components can be systematically isolated in the behavior of the barn owl, even under free-field conditions, opened numerous experimental opportunities for studying binaural processing and gave rise to an entire legacy of studies on barn owl spatial hearing and beyond.

Knudsen and Konishi were not only intrigued by the exceptional localization behavior of the barn owl and the possibility of studying it in the lab, but were also determined to understand its neuronal basis. In a series of seminal studies, they recorded electrophysiologically from the barn owl midbrain and demonstrated a sound-frequency-independent [6] computational map of head-related sound positions based on ILDs and ITDs [7]. Knudsen went on to study the alignment of this auditory space map with the retinotopic map in the external nucleus of the inferior colliculus (auditory midbrain) and the superior colliculus, thereby establishing the barn owl as a prime model for developmental plasticity and multimodal integration for motor control [8]. Konishi continued exploring the origin of the horizontal space map, namely the question of how ITDs are initially processed in the auditory brainstem [9].

Stepping back in time for a moment: some 50 years earlier, Lloyd Jeffress [10] had speculated that coincidence detector neurons could form a map of azimuthal space based on systematic arrangements of bilateral delay lines compensating for different ITDs. Carr and Konishi [9]

found that the nucleus laminaris of the barn owl lies at the center of a neuronal circuit that almost perfectly matches the delay-line scenario of ITD processing (later confirmed in chickens by Rubel and colleagues [11,12]).

Figure 1. The Sound Localization Strategy of the Barn Owl. (A) Barn owls can accurately localize prey based on interaural disparities even in total darkness. If, for instance, the sound origin is slightly to the left of the vertical and below the horizontal midline, it will arrive at the left ear slightly earlier (interaural time difference, ITD) and with a somewhat larger amplitude (interaural level difference, ILD). (B,C) Barn owls belong to the group of asymmetrical owls. The asymmetry of their skull does not significantly affect ITDs (B), which provide accurate information about azimuthal position. However, the asymmetry does affect ILDs (C). As a consequence, iso-ILD lines are strongly tilted and therefore provide information about the vertical position of a sound source. Reproduced, with permission, from [8].

Knudsen and Konishi's discovery of topographic representations of ITD and ILD at the level of the barn owl midbrain, combined with Carr and Konishi's finding of a 'Jeffress-like' anatomical arrangement in the bird auditory brainstem, appeared to answer the longstanding question of sound localization in birds and beyond. Moreover, it significantly strengthened the long-held idea that topographic representations ('maps') are a key feature of vertebrate brains. In fact, topographic representations of important sensory cues seemed to be a common feature described in neuroethology research of the 1970s. Besides the known receptor surface-based maps (e.g., the dominating retinotopic arrangement in the visual system), higher-level 'computational maps' were also found. For example, Suga and colleagues described a map of target distance in the cortex of echolocating bats [13]; Heiligenberg and colleagues found systematic arrangements of neurons sensitive to properties of electric fields in the midbrain of weakly electric fish [14]; and, as discussed above, the barn owl exhibits computed maps of auditory space. In a seminal review in 1991, Heiligenberg concluded that 'Wherever we find behavioral responses guided by continual modulations of a certain stimulus variable, we seem to find an ordered representation of this variable within neuronal maps' [14].
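Jeffress's delay-line idea lends itself to a short simulation. The sketch below is illustrative only: the broadband-noise stimulus, the sampling rate, and the multiply-and-sum coincidence rule are modeling assumptions, not the owl circuit. It builds a bank of 'coincidence detectors', each reading the two ear signals through a different internal delay; the detector whose delay compensates the acoustic ITD responds most strongly, so the peak of the array converts ITD into a place code, as in the midbrain map.

```python
import numpy as np

FS = 100_000                      # sampling rate (Hz): 10 µs resolution
RNG = np.random.default_rng(0)

def binaural_signal(itd_us, n=4096):
    """Broadband noise heard at two ears with a given ITD (in µs).
    Positive itd_us: the sound reaches the left ear first."""
    lag = int(round(itd_us * 1e-6 * FS))           # ITD in samples
    src = RNG.standard_normal(n + abs(lag))
    if lag >= 0:
        left, right = src[lag:lag + n], src[:n]    # left ear leads
    else:
        left, right = src[:n], src[-lag:-lag + n]  # right ear leads
    return left, right

def jeffress_estimate(left, right, max_lag_us=200):
    """Bank of coincidence detectors, one per internal delay.
    Coincidence is modeled as a multiply-and-sum of the delayed inputs
    (a cross-correlation); the delay of the best-responding detector
    is the ITD estimate, returned in µs."""
    m = int(round(max_lag_us * 1e-6 * FS))         # max delay in samples
    n = len(left)
    lags = np.arange(-m, m + 1)
    acts = [np.dot(left[m:n - m], right[m + k:n - m + k]) for k in lags]
    return lags[int(np.argmax(acts))] / FS * 1e6

left, right = binaural_signal(100.0)               # impose a 100 µs ITD
print(jeffress_estimate(left, right))              # peak detector ≈ 100 µs
```

The argmax over internal delays is exactly the 'place code' for azimuth that Knudsen and Konishi's midbrain map makes explicit; in a real nucleus laminaris the delays are implemented by systematically staggered axonal conduction times rather than array indexing.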

Interestingly, the beauty of the barn owl sound localization system almost obscured the fact that the mammalian sound localization system is profoundly different, particularly in the computations underlying ITD processing and in the neuronal representation of auditory space. In mammals, ITDs are not arranged in a topographic computational map, either at the level of ITD processing in the auditory brainstem or at higher processing stages, including the superior colliculus [15]. These differences probably stem from the distinct evolutionary origins of spatial hearing in mammals and birds [4].

Overall, then, birds and mammals can both localize sounds in space, but use different neuronal strategies to encode sound location. More broadly, one can conclude that topographic maps per se are not a prerequisite for systematically processing specific stimulus parameters. In fact, fixed topographic representations are now rarely considered a neuronal 'solution' for sensory tasks that require coping with highly complex and dynamic situations.

The barn owl, in many ways, stands out when it comes to the neural basis of sound localization. Although birds appear to solve auditory tasks similarly to mammals, they use, at least in some cases,

different neuronal strategies; in addition, the barn owl is in some respects a particularly special bird. This uniqueness does not take anything away from the beauty of the barn owl work. In fact, it makes an even more appealing case from the standpoint of evolutionary biology. It highlights the diversity of how different brains synthesize sensory information, a diversity that is not only interesting from a neuroethological perspective, but also helps us better understand individual species and, through comparison, general evolutionary principles.

Acknowledgment
B.G. thanks Magdalena Götz and Michael H. Myoga for their helpful comments.

1Division of Neurobiology, Department Biology II, Ludwig-Maximilians-Universitaet Munich, Martinsried, Germany
2Max Planck Institute of Neurobiology, Martinsried, Germany

*Correspondence: [email protected] (B. Grothe).
https://doi.org/10.1016/j.tins.2018.01.004

References
1. Rayleigh, L. (1907) On our perception of sound direction. Philos. Mag. 13, 214–232
2. Payne, R.S. (1971) Acoustic location of prey by barn owls (Tyto alba). J. Exp. Biol. 54, 535–573
3. Knudsen, E.I. and Konishi, M. (1979) Mechanisms of sound localization in the barn owl (Tyto alba). J. Comp. Physiol. 133, 13–21
4. Grothe, B. et al. (2010) Mechanisms of sound localization in mammals. Physiol. Rev. 90, 983–1012
5. Konishi, M. (1971) Sound localization in the barn owl. J. Acoust. Soc. Am. 50, 148
6. Knudsen, E.I. and Konishi, M. (1978) Space and frequency are represented separately in auditory midbrain of the owl. J. Neurophysiol. 41, 870–884
7. Knudsen, E.I. and Konishi, M. (1978) A neural map of auditory space in the owl. Science 200, 795–797
8. Knudsen, E.I. (2002) Instructed learning in the auditory localization pathway of the barn owl. Nature 417, 322–328
9. Carr, C.E. and Konishi, M. (1990) A circuit for detection of interaural time differences in the brain stem of the barn owl. J. Neurosci. 10, 3227–3246
10. Jeffress, L.A. (1948) A place theory of sound localization. J. Comp. Physiol. Psychol. 41, 35–39
11. Overholt, E.M. et al. (1992) A circuit for coding interaural time differences in the chick brainstem. J. Neurosci. 12, 1689–1708
12. Seidl, A.H. et al. (2010) Mechanisms for adjusting interaural time differences to achieve binaural coincidence detection. J. Neurosci. 30, 70–80
13. Suga, N. and O'Neill, W.E. (1979) Neural axis representing target range in the auditory cortex of the mustache bat. Science 206, 351–353
14. Heiligenberg, W. (1991) The neuronal basis of behaviour: a neuroethological view. Annu. Rev. Neurosci. 14, 247–267
15. Campbell, R.A. et al. (2005) Interaural timing cues do not contribute to the map of space in the ferret superior colliculus: a virtual acoustic space study. J. Neurophysiol. 95, 242–254

Series: Seminal Neuroscience Papers 1978–2017

Science & Society

The Memory Map of Visual Space

Román Rossi-Pool,1 José Vergara,1 and Ranulfo Romo1,2,*

A 1989 paper by Patricia Goldman-Rakic and colleagues reported that the prefrontal cortex codes visual space during working memory. This landmark work not only offered a biological explanation for this cognitive function, but also opened up a wide field of research aimed at understanding the biological bases of various cognitive functions.

More than two thousand years ago, based on an intellectual tour de force, the Greek philosopher Democritus (430–420 BC) depicted a process that he saw as the raw material for sensation, perception, learning, memory, and action. He suggested that, in our interactions with objects in the environment, atoms – which Democritus posited constitute the basic elements of material objects – reach the brain, where they generate dynamic images that are processed for thinking [1]. In this manner, the subject could voluntarily use these internal representations to guide thoughts and actions. Impressively, Democritus' suggestion lies at the core of a working hypothesis that many contemporary scientists use to investigate where and how in the brain a sensory representation transforms into perception, memory, and action.

Adrian [2] was among the first to scientifically test this ancient hypothesis; he recorded from the peripheral fibers innervating skin receptors and observed how their firing rates varied as a function of the strength of the stimulus applied to the skin. This experiment opened a vast field of research aimed at elucidating how sensory inputs are represented in the peripheral and central nervous systems. In addition, it helped define new questions relating to the cognitive processing of sensory inputs. For example, what components of the neuronal responses evoked by stimuli are in fact related to perception and action? Where and how in the brain is sensory information stored in memory? How does stored sensory information combine with current information, and how is this linked to behavior? How do sensory transformations interact with other brain processes?

The 1989 paper of Shintaro Funahashi, Charles Bruce, and Patricia Goldman-Rakic [3] addressed one important component of these questions, asking how remembered information about visual space is encoded in the brain. In brief, the authors discovered that neurons of the prefrontal cortex (PFC) can encode information about specific locations in visual space when these are remembered over a timescale of several seconds. In other words, the study suggested that PFC neuronal responses represent a mnemonic code of visual space.

Significant scientific findings often build upon the researchers' prior long-standing expertise and an existing body of knowledge. This study was no exception; in addition, the complementary skills of the different authors proved instrumental. Patricia Goldman-Rakic was a prominent expert on cortical connectivity (among other things) and had shown that the PFC receives afferent inputs from the parietal cortex, an area associated with

Trends in Neurosciences, March 2018, Vol. 41, No. 3
