Correspondence

How does the insect central complex use mushroom body output for steering?

Matthew Collett1,* and Thomas S. Collett2,*

How does the insect central complex use mushroom body output for steering? Matthew Collett1,* and Thomas S. Collett2,* Research on central brain areas in Drosophila and other insects is revealing the highly conserved neural circuitries in the central complex that are responsible for course control using visual, ideothetic and compass cues [1,2], and in the mushroom bodies that hold long-term visual and olfactory memories [3,4]. Interactions between these areas are likely to be particularly important for navigation in which long-term memories determine an insect’s course. Many ants, for example, use long-term visual memories for guidance along routes between their nest and food sites. But the interactions remain a puzzle: both because there are no known direct connections between mushroom body and central complex, and because the output from the mushroom body, where the route memories are probably stored [5], may simply signal whether a sensory input is attractive or aversive [4]. Extrapolating from a recent behavioural finding [6], we propose one way that the long-term memories in the mushroom body may be transformed into central complex steering commands. This answer, if correct, may reconcile two apparently conflicting ways of thinking about route following — suggesting how steering along a route can use a feedback controller based on a few prominent features [7], while the route memories themselves are holistic memories of the entire panorama [5]. It also suggests how visual navigation is related to (and possibly evolved from) visual targeting and olfactory-based guidance. Behavioural experiments show that the long-term memories of a route are encoded retinotopically [7,8]. Such retinotopicity means that

the mushroom body, by recognising when its retinal input is familiar, can determine whether the head is pointing along the direction of a remembered route. When it is, the mushroom bodies will provide bilateral attraction signals and the insect need simply move forwards [5]. But if dragging a large food item, ants can also walk backwards along their routes. Because they are facing the wrong way, the retinal image is then unfamiliar and so the retinotopic memories cannot guide the ants’ movement. A fine-grained analysis of this behaviour suggests that the ants set their direction from their retinotopic memories only sporadically, during brief forward peeks along the route. After peeking, they turn and walk backwards along the route direction, now guided by celestial compass cues [6]. The switch from scene recognition to celestial guidance demonstrates that using long-term memories to set
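
In computational terms, the recognition step described above can be caricatured as a familiarity test on the current retinotopic view, with the celestial compass bearing latched at the moment of a successful forward peek. The sketch below is an illustrative toy model, not the circuit described in the cited work: the pixel-difference measure, the threshold and all function names are assumptions.

```python
import numpy as np

# Toy model of 'image matching for recognition' (the mushroom body's
# 'whether' judgement) during a brief forward peek. The pixel-difference
# measure, the threshold and all function names are illustrative assumptions.

def image_difference(view, memory):
    """Mean absolute pixel difference between two retinotopic views."""
    return float(np.mean(np.abs(view - memory)))

def is_familiar(view, route_memories, threshold=0.1):
    """The view is familiar if it closely matches any stored route view."""
    return min(image_difference(view, m) for m in route_memories) < threshold

def peek_and_set_course(view, compass_bearing, route_memories):
    """During a forward peek: if the view is familiar, latch the current
    celestial compass bearing as the travel direction, so the ant can then
    turn and walk backwards while holding that bearing."""
    if is_familiar(view, route_memories):
        return compass_bearing   # set-point for subsequent course control
    return None                  # unfamiliar: peek again / keep searching
```

On this reading, nothing spatial needs to leave the recognition stage: the compass bearing is read by the course-control machinery at the moment the view is judged familiar.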

The switch from scene recognition to celestial guidance demonstrates that using long-term memories to set a travel direction can be distinct from subsequent short-term course control along that direction. This functional division maps easily onto the mushroom body and central complex circuitries, with recognition through the long-term visual memories of the mushroom bodies setting the celestial course control by the central complex.

Most importantly, the switch suggests that there need not be any direct transfer of spatial information from the mushroom bodies to the central complex. The central complex receives its own sensory input for the celestial compass [9]. When the mushroom bodies recognise a view, they need only broadcast an attraction signal. When the central complex receives this signal, it can fix its current sensory inputs as short-term set-points for a feedback steering controller.
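
One way to make this concrete is to treat the mushroom body output as a pure valence and timing signal that tells the central complex when to latch its own sensory inputs. The following minimal sketch assumes simple proportional feedback and invented sensor values; it illustrates the logic of the proposal rather than any identified circuitry.

```python
# Illustrative sketch: the mushroom body (MB) broadcasts only a valence/timing
# signal; the central complex (CX) latches whatever it is currently sensing as
# short-term set-points and then steers by feedback. Sensor values, gain and
# sign conventions are illustrative assumptions.

def wrap(angle_deg):
    """Wrap an angle in degrees into the range [-180, 180)."""
    return (angle_deg + 180.0) % 360.0 - 180.0

class CentralComplexSteering:
    def __init__(self):
        self.compass_setpoint = None   # latched celestial compass bearing
        self.feature_setpoint = None   # latched retinal azimuth of a salient frontal feature

    def on_attraction_signal(self, compass_now, frontal_feature_azimuth):
        """The MB signal carries no spatial content: on its arrival the CX
        simply fixes its own current sensory inputs as set-points."""
        self.compass_setpoint = compass_now
        self.feature_setpoint = frontal_feature_azimuth

    def compass_steering(self, compass_now, gain=0.5):
        """Feedback command (deg per step) that rotates the heading back
        towards the latched compass bearing."""
        if self.compass_setpoint is None:
            return 0.0
        return gain * wrap(self.compass_setpoint - compass_now)
```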

[Figure 1, panels A and B (schematic). (A) Circuitry for route following: the retina and antennae feed the optic lobes (feature extraction) and antennal lobes (odour extraction). A ‘whether’ stream runs to the mushroom bodies (scene/odour recognition; long-term retinotopic memory in the MB recognises the panoramic view, with motivational signals from the suboesophageal ganglia), while a ‘whither’ stream, including dorsal rim area (DRA) input, runs to the central complex (feature tracking), which drives rotation/thrust commands via the thoracic ganglia. (B) (i) Image-matching for recognition; (ii) attentional competition, in which the CX selects a salient, generally frontal, visual feature; (iii) image-matching for steering, in which a visual feedback controller in the CX tracks the retinal position of the selected feature to generate rotation.]

Figure 1. How the mushroom bodies might direct the central complex without transferring spatial information. The central complex acquires its own short-term memories and sensory inputs for steering — and a bilateral signal from the mushroom bodies indicates when the central complex should acquire those steering memories. (A) Schematic of the visual recognition and steering streams. These streams share some pre-processing in the optic lobes, but use the visual input in different ways. The mushroom body recognition (‘whether’) stream, shared with olfaction, filters the panoramic retinal input through a motivation-dependent long-term memory and outputs (fuzzy arrow) whether the retinotopic view is attractive. The central complex steering (‘whither’) stream is also multi-modal, but uses short- and medium-term memories to track selected features and output rotational and thrust commands. (B) The three proposed visual processes in the two streams that together determine heading. (i) ‘Image matching for recognition’ filters the entire panorama (solid black shapes, dashed line indicates the visual midline) through the long-term route memories in the mushroom body [5]. Experiments using static manipulations of a panorama to probe the content of long-term memories (for example [8]) inform about this process. The output is a signal triggering (ii) an attentional competition within central complex for a guidance feature. Frontal vision biases selection towards prominent features near the route direction. The retinal position of the selected feature (here an oriented edge) becomes a set-point for steering. (iii) ‘Image matching for steering’ in central complex uses only the attentionally selected feature. Tracking the feature may use both the current visual inputs to the fan-shaped body and the expected position of the selected feature from ideothetic updating in the ellipsoid body (reducing visual aliasing). The output, from comparing the current position with the set-point, is a steering command. Experiments perturbing a panorama dynamically (for example [7]) can inform about this process.
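
Process (iii) in the caption, in which matching is restricted to the neighbourhood of the feature’s ideothetically predicted position, can be caricatured as a predict-then-match step. The search window, the candidate format and the function names below are illustrative assumptions.

```python
# Sketch of 'image matching for steering' with ideothetic prediction:
# the expected retinal azimuth of the attended feature (updated from
# self-motion) restricts matching to candidates near that prediction,
# reducing visual aliasing. Window size, candidate format and all names
# are illustrative assumptions.

def track_attended_feature(candidates, expected_azimuth, setpoint_azimuth,
                           window_deg=20.0):
    """candidates: list of (azimuth_deg, similarity) for features detected in
    the current view. Returns (observed_azimuth, error), where error is the
    offset of the feature from its latched set-point position, or None if no
    candidate lies near the predicted position (feature lost)."""
    nearby = [c for c in candidates if abs(c[0] - expected_azimuth) <= window_deg]
    if not nearby:
        return None                                   # fall back, e.g. to the compass
    azimuth, _ = max(nearby, key=lambda c: c[1])      # best match near the prediction
    return azimuth, azimuth - setpoint_azimuth        # error drives the steering command
```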

Forward route following, in which the visual scene determines both recognition and steering [7], can be explained by a similar course-setting process acting through multiple visual processing streams (Figure 1A). The input from the eyes is processed in three optic neuropils that retain the retinotopic structure of the image. From the second neuropil, the medulla, one stream, likely responsible for the long-term route memories, splits off to the mushroom body, where it joins cues from other modalities such as olfaction [3]. Additional streams lead to the various structures of the central complex [1,2]. A set of neurons in one of these structures, the ellipsoid body, integrates visual and ideothetic cues to track the orientation of the head with respect to a currently attended feature [2]. These neurons are possible candidates for encoding the rotational distance used by a feedback controller to regain a desired retinal image. Such a controller would be able to generate the rapid saccade-like turns (250–500°/s) that route-following ants can use to regain the retinal position of specific features [7].

We suggest that the attraction signal broadcast by the mushroom body after recognising a familiar view can trigger multiple sensory cues to be fixed as set-points for steering: not just the compass direction, but also salient visual features at the retinotopic positions experienced when the signal arrives. As the mushroom body signal is essentially only a timing signal, it need not reach the central complex through a direct neural pathway. Moreover, the mushroom body signal may trigger the rotational fixations, averaging 200 ms, that are observed when insects face the desired direction (for example [7]), thus allowing plenty of time for the signal to arrive at the central complex and fix the set-points.

The visual inputs to the mushroom body and central complex streams are retinotopic, but they are not a uniform array of pixels. The processing in the optic neuropils is likely to extract features such as oriented edges [10] and, arising partly from acute frontal zones and binocular overlap, the frontal visual field has greater representation. Behavioural experiments suggest that the frontal visual field also has greater prominence in the long-term route memories, and that there may be some encoding of features [8].
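
The fixate-and-saccade pattern described above can be summarised in a small timing sketch. The roughly 200 ms fixation and the 250–500°/s turn rates are the behavioural figures quoted in the text; the simple control scheme and the function names are assumptions made for illustration.

```python
# Illustrative saccade-and-fixate loop. The ~200 ms fixation and the
# 250-500 deg/s saccade rates are the behavioural figures quoted in the
# text; the control scheme itself is an assumption for illustration.

FIXATION_S = 0.2          # average fixation while facing the desired direction
SACCADE_RATE_DPS = 350.0  # turn rate within the reported 250-500 deg/s range

def saccade_duration(retinal_error_deg):
    """Time needed to cancel a retinal position error with one saccade."""
    return abs(retinal_error_deg) / SACCADE_RATE_DPS

def fixate_then_saccade(retinal_error_deg, mb_signal_arrived):
    """During a fixation the set-points can be (re)latched if the mushroom
    body signal arrives; the following saccade then removes the error."""
    events = [("fixate", FIXATION_S)]
    if mb_signal_arrived:
        events.append(("latch set-points", 0.0))
    events.append(("saccade", saccade_duration(retinal_error_deg)))
    return events

# Example: a 35 degree error is removed by a 0.1 s saccade after a 0.2 s fixation.
print(fixate_then_saccade(35.0, mb_signal_arrived=True))
```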

But the most important consequences are likely to be for feature-based steering by the central complex. When travelling through cluttered environments, steering by a feedback controller may be most robust if at any time a single feature aligned close to the direction of travel is selected for guidance. Restricting attention to a single frontal feature for steering, accompanied by inhibition of surrounding features, is widespread across insects — demonstrated for example during course-holding or targeting by Drosophila and ladybirds — and so may predate the evolution of long-term route following.

We propose that once the mushroom body has recognised that the ant is facing along its route, the attraction signal it broadcasts initiates an attentional competition between the visual features providing input to the central complex. Because of the over-representation of frontal vision, a visual feature near the frontal midline will outcompete more lateral features and so become a set-point for steering (Figure 1B). Successive recognition signals along a route will trigger the attentional selection of a sequence of features, as new features come to occupy the frontal visual field.

The selection of guidance elements suggests a possible role for the fan-shaped body in the visual feedback control. This structure in the central complex can hold memories, lasting for minutes, of visual attributes such as edge orientations and retinal elevations of an attractive feature [1]. The visual properties of the selected guidance element are encoded in tangential neurons [1], while the outputs of columnar neurons could potentially encode the retinotopic positions at which those visual properties occur. The fan-shaped body might thus keep track of the attended feature within the visual scene, outputting its updated retinal position (and it might also be involved in the attentional competition).

From the point of view of the central-complex steering circuitry, as long as the head is pointing in the appropriate direction when the attraction signal arrives, it doesn’t matter whether that signal is triggered by a visual memory, an olfactory memory, or even an innately attractive visual target or pheromone.
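
The attentional competition proposed here, in which over-representation of frontal vision biases the winner towards features lying near the route direction, can be caricatured as a salience competition weighted by a frontal bias. The Gaussian weighting, the salience values and the names below are illustrative assumptions.

```python
import math

# Caricature of the proposed attentional competition in the central complex:
# each candidate feature's salience is weighted by a frontal bias, and the
# winner's retinal azimuth becomes the steering set-point. The Gaussian
# frontal weighting and all names are illustrative assumptions.

def frontal_weight(azimuth_deg, sigma_deg=30.0):
    """Over-representation of frontal vision: weight peaks at the midline."""
    return math.exp(-(azimuth_deg ** 2) / (2.0 * sigma_deg ** 2))

def select_guidance_feature(features):
    """features: list of (azimuth_deg, salience). Returns the azimuth of the
    feature that wins the competition, i.e. the new steering set-point."""
    if not features:
        return None
    winner = max(features, key=lambda f: f[1] * frontal_weight(f[0]))
    return winner[0]

# Example: a moderately salient frontal edge beats a stronger lateral feature.
print(select_guidance_feature([(5.0, 0.6), (60.0, 1.0)]))  # -> 5.0
```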

The mechanisms for route following proposed here would thus apply equally well to (and may in part have evolved for) the ‘cast and surge’ guidance up an odour plume. When the mushroom body signals an encounter with an attractive odour, the insect faces upwind. At that point, the compass direction and salient visual features can become set-points for central complex steering, and can then be used for guidance during brief upwind surges, even if the wind or odour disappears. The same type of attraction signal, acting on the central complex in the same way, could potentially also be initiated, without involvement of the mushroom body, by innate pheromone and visual target detectors.

REFERENCES

1. Liu, G., Seiler, H., Wen, A., Zars, T., Ito, K., Wolf, R., Heisenberg, M., and Liu, L. (2006). Distinct memory traces for two visual features in the Drosophila brain. Nature 439, 551–556.
2. Seelig, J.D., and Jayaraman, V. (2015). Neural dynamics for landmark orientation and angular path integration. Nature 521, 186–191.
3. Vogt, K., Aso, Y., Hige, T., Knapek, S., Ichinose, T., Friedrich, A.B., Turner, G.C., Rubin, G.M., and Tanimoto, H. (2016). Direct neural pathways convey distinct visual information to Drosophila mushroom bodies. eLife 5, e14009. https://doi.org/10.7554/eLife.14009.
4. Cognigni, P., Felsenberg, J., and Waddell, S. (2018). Do the right thing: neural network mechanisms of memory formation, expression and update in Drosophila. Curr. Opin. Neurobiol. 49, 51–58.
5. Webb, B., and Wystrach, A. (2016). Neural mechanisms of insect navigation. Curr. Opin. Insect Sci. 15, 27–39.
6. Schwarz, S., Mangan, M., Zeil, J., Webb, B., and Wystrach, A. (2017). How ants use vision when homing backward. Curr. Biol. 27, 401–407.
7. Lent, D.D., Graham, P., and Collett, T.S. (2010). Image-matching during ant navigation occurs through saccade-like body turns controlled by learned visual features. Proc. Natl. Acad. Sci. USA 107, 16348–16353.
8. Buehlmann, C., Woodgate, J.L., and Collett, T.S. (2016). On the encoding of panoramic visual scenes in navigating wood ants. Curr. Biol. 26, 2022–2027.
9. Homberg, U., Heinze, S., Pfeiffer, K., Kinoshita, M., and el Jundi, B. (2011). Central neural coding of sky polarization in insects. Philos. Trans. R. Soc. Lond. B 366, 680–687.
10. O’Carroll, D. (1993). Feature-detecting neurons in dragonflies. Nature 362, 541.

1Centre for Research in Animal Behaviour, Department of Psychology, University of Exeter, Exeter EX4 4QG, UK. 2School of Life Sciences, John Maynard Smith Building, University of Sussex, Brighton BN1 9QG, UK.
*E-mail: [email protected] (M.C.), [email protected] (T.S.C.)