Tool Use: Two Mechanisms but One Experience


Tobias Heed

Faculty of Psychology and Sports Science and Cluster of Excellence 'Cognitive Interaction Technology', Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany
Correspondence: [email protected]
https://doi.org/10.1016/j.cub.2019.10.062

Humans localize touch on hand-held tools by interpreting the unique vibratory patterns elicited by impact to different parts of the tool. This perceptual strategy differs markedly from localizing touch on the skin. A new study shows that, nonetheless, touch location is probably processed similarly for skin and tool from early stages of somatosensory cortex onward.

Tool use is central to all aspects of human life, from basic everyday tasks such as preparing and consuming food all the way to high-level cultural achievements in buildings, machines, and art. One truly remarkable aspect of tools is our subjective experience that we can feel with them: just pick up your pen and write a couple of words on paper. Where do you feel the writing take place? Most people reply that they are actually 'touching' the paper with the pen tip. But how can this touch-like, detailed experience emerge when our sensory system ends at the hand and we have no sensors in the tool?

Touching a tool causes it to vibrate, and these vibrations are unique to the touched location [1]. Humans can successfully distinguish touch locations on a tool even when using it for the first time, and a biologically plausible neural network with human-like tactile-sensory characteristics can differentiate these vibrations within 25 ms [1]. Reporting in this issue of Current Biology, Miller et al. [2] now show that, despite the obvious lack of sensors in the tool itself, cortical processing of location is surprisingly similar for tool and body, suggesting that the two may be processed equivalently in the cortex.

This new finding of Miller et al. [2] is surprising because localization on the skin most likely relies on a very different peripheral mechanism than the interpretation of tool touch via vibration. The human skin contains many mechanoreceptors that connect to somatosensory neurons. A given receptor detects touch on a circumscribed region of the skin, and accordingly, its connected somatosensory neuron responds only to touch on the region covered by its receptors (see Figure 1, lower callout, left) [3,4]. Thus, where the body was touched can be derived from which neurons respond to the touch, a coding scheme often referred to as a place code.

In contrast, when the hand grasps a tool, a large area of skin touches the tool shaft, and the hand's grip remains the same no matter where the tool is touched (see Figure 1, lower callout, right). Therefore, the brain cannot derive touch location on the tool in the same way as for touch on the skin. Instead, a receptor type particularly tuned to vibration detects the tool's vibrations as they transmit to the hand. It is the vibratory pattern, termed the 'motif' by Miller et al. [2], sensed across multiple sensory neurons spread throughout the hand, that allows the brain to determine touch location on the tool.
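
To make the motif idea concrete, the toy sketch below simulates it under loudly invented assumptions: each touch location excites a hypothetical set of rod resonances with a location-specific weighting, and a simple classifier recovers location from the resulting vibration spectrum. It illustrates the coding principle only; it is not the biologically plausible network of the original study [1], and all frequencies, weightings, and noise levels are made up.

```python
# Toy sketch: decoding touch location on a rod from vibration 'motifs'.
# All numbers (modal frequencies, weightings, noise) are illustrative
# assumptions, not parameters from Miller et al.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
fs, dur = 2000, 0.1                       # 2 kHz sampling, 100 ms window
t = np.arange(0, dur, 1 / fs)

modes = np.array([60.0, 150.0, 340.0])    # hypothetical rod resonances (Hz)
weights = {0: [1.0, 0.2, 0.1],            # touch near the hand
           1: [0.5, 1.0, 0.3],            # touch at the middle
           2: [0.2, 0.4, 1.0]}            # touch near the tip

def motif(loc):
    """Simulated hand-borne vibration for a touch at location `loc`."""
    clean = sum(w * np.exp(-20 * t) * np.sin(2 * np.pi * f * t)
                for w, f in zip(weights[loc], modes))
    return clean + 0.05 * rng.standard_normal(t.size)

# The amplitude spectrum serves as a crude stand-in for the pattern of
# vibration-sensitive mechanoreceptor responses across the hand.
y = np.repeat([0, 1, 2], 200)
X = np.array([np.abs(np.fft.rfft(motif(loc))) for loc in y])

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(Xtr, ytr)
print(f"location decoding accuracy: {clf.score(Xte, yte):.2f}")
```
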
Miller et al. [2] exploited a well-known property of neural processing: repeated stimulation with properties to which a neuron or a brain region is sensitive results in reduced neural activity, a phenomenon termed repetition suppression [5]. This technique makes it possible to test whether a given cognitive process or brain region regards two signals as 'the same', and when in time this processing occurs. Miller et al. [2] presented participants with pairs of tactile stimuli on a hand-held tool or on the arm. The stimulus pair occurred at either one common location or two different locations. The authors observed repetition suppression in electroencephalographic (EEG) activity for location repetition both on the arm and on the tool. What is more, these repetition suppression effects had very similar temporal profiles for touch on tool and arm. Moreover, source localization of the EEG signals to cortical regions suggested that repetition suppression arose in highly overlapping regions of primary somatosensory cortex (S1) and posterior parietal cortex for tool and hand.
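
The analysis logic is easy to state in code: compare the evoked response to the second stimulus of each pair between 'same location' and 'different location' trials. The sketch below simulates this contrast; the baseline amplitude, suppression strength, and noise are entirely invented.

```python
# Minimal sketch of a repetition-suppression contrast on simulated data.
# Baseline amplitude, suppression strength, and noise are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_trials = 120
base = 5.0          # hypothetical evoked amplitude (arbitrary units)
suppression = 0.8   # assumed reduction when location repeats

amp_same = base - suppression + rng.normal(0, 1.0, n_trials)
amp_diff = base + rng.normal(0, 1.0, n_trials)

t_val, p_val = stats.ttest_ind(amp_diff, amp_same)
print(f"mean difference (diff - same): {amp_diff.mean() - amp_same.mean():.2f}")
print(f"t = {t_val:.2f}, p = {p_val:.1e}")
```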

Miller et al. [2] devised several statistical tests to ascertain that repetition suppression was indeed similar across the two localization modes. For instance, a classification algorithm trained to detect repetition suppression on the arm also identified repetition suppression for the tool, and the classifier was more likely to confuse whether touch had occurred on the arm or on the tool than whether location had been repeated. In other words, the identified brain regions processed stimulus location as a stimulus property independent of whether the location information originated from the presumed place code for the skin or from motif decoding for the tool (see Figure 1, upper callout).
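
This cross-decoding rationale can be illustrated with simulated data. In the sketch below, a classifier trained to separate repeated from non-repeated trials on 'arm' data is tested on 'tool' data; transfer succeeds only because the simulation builds in a suppression signature shared across the two surfaces, which is precisely the hypothesis the real analysis tests. Channel counts, effect sizes, and features are assumptions, not the authors' actual EEG pipeline.

```python
# Sketch: train a repetition-suppression classifier on 'arm' trials,
# test it on 'tool' trials. Data are simulated; the shared suppression
# 'signature' across surfaces is the assumption being illustrated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, n_ch = 200, 32                  # trials per condition, EEG channels

def trials(surface_offset, repeated):
    """Simulated evoked amplitudes across channels for one condition."""
    x = rng.normal(5.0, 1.0, (n, n_ch)) + surface_offset
    if repeated:
        x[:, :8] -= 1.0            # suppression on a shared channel subset
    return x

X_arm  = np.vstack([trials(0.0, True), trials(0.0, False)])
X_tool = np.vstack([trials(0.5, True), trials(0.5, False)])
y = np.r_[np.ones(n), np.zeros(n)]

clf = LogisticRegression(max_iter=1000).fit(X_arm, y)
print(f"train on arm, test on tool: {clf.score(X_tool, y):.2f}")
```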

The idea that humans may 'extend their body' through tools has intrigued researchers for many years. What exactly this idea entails, however, has been less clear. For instance, we usually do not feel that a pen is part of our body, even if we experience that we are truly touching the paper with its tip. On the other hand, we hope to design prostheses in such a way that patients accept the artificial body part as their own. Some perceptual and sensorimotor functions appear to remap from the hand to the tool tip in humans [6,7], and tool use can result in changes of movement and of the representation of body morphology even after the tool is no longer in use [8]. But such behavioral markers of 'body extension' by tool use may rely on separate brain functions specifically dedicated to tool processing. A much more convincing argument about 'body extension' could be made if one and the same brain network processed both body and tool. Indeed, when monkeys are trained to use a rake, parietal neurons that usually respond to visual stimuli around the hand come to fire in response to visual stimuli around both hand and tool [9,10].

[Figure 1, upper callout: schematic activity-over-time traces in S1 and posterior parietal cortex (PPC) for touch on the arm and on the tool, comparing responses to same-location versus different-location stimulus pairs.]

Figure 1. Presumed neural mechanisms for perceiving touch location on skin and tools. Touch on the skin is presumed to depend on a neural place code: a given neuron's mechanoreceptors respond to touch on a circumscribed region of the skin, and a touch on the body will activate a neuron only if it falls within the receptors' receptive field (light blue stimulus in lower callout). Accordingly, touch location can be derived from which neurons are firing in response to the tactile event. When the hand holds a tool, the same receptors sense the vibrations of the tool, independent of where the tool was touched. Where the tool was touched is derived from the tool's specific vibratory pattern, or 'motif' [1] (illustrated as red and orange wave patterns), elicited by the touch and sensed by multiple neurons in the skin of the hand that holds the tool (red and orange mechanoreceptors in lower callout). Despite these vastly different neural mechanisms, cortical regions involved in tactile processing appear to extract common characteristics of spatial location from the two types of input, evident in similarly suppressed responses to repeated presentation of a stimulus location on both body and tool (upper callout). Figure courtesy of Marta Beauchamp, used with permission.


The results of Miller et al. [2] suggest that the human somatosensory system is flexibly re-used during tool use, from the initial cortical processing stages in S1 up to higher-level regions of parietal cortex. As the authors point out [2], their finding may call for a reconceptualization of the role of S1: this structure is currently viewed largely as a feature detector for the tactile modality, but the new results suggest that it extracts sensorimotor concepts of a higher level than previously thought, in this case tactile location. Indeed, with Brodmann areas 3a, 3b, 1, and 2, S1 contains four distinct regions with increasingly complex neural properties whose precise functional significance remains to be determined. Moreover, S1 has been implicated in some illusory tactile-spatial percepts [11], even if a critical role of this structure appears unlikely for other kinds of illusory localization, such as confusion between different body parts [12].

As with any study, there are caveats to interpreting the work of Miller et al. [2]. The somatosensory system comprises multiple receptor types with partly overlapping sensory specializations [4], so there is likely overlap in the receptors that contribute to localization on body and tools, and the binary conceptualization of tool versus body localization may be oversimplified. Nevertheless, the new study [2] opens up exciting new views on somatosensation and body processing, and it provides a compelling experimental approach with which to scrutinize these ideas further. For instance, the present study required participants to hold the tool in the same position throughout. Thus, repetition of the tactile stimulus repeated not only stimulus location but also all other aspects of the stimulation. A strong interpretation of the idea that the somatosensory system extracts location independent of a signal's particular sensory origin is that stimuli should be processed as identical even if arm posture, hand grip, or tool orientation changed between stimulations: these manipulations result in different vibratory motifs but nevertheless indicate identical touch locations on the tool.
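
One way to see why the grip manipulation is a strong test: in the toy motif simulation above, a grip change can be modeled (again, purely hypothetically) as a change in how each location excites the rod's modes. A decoder trained on raw motifs under one grip then transfers poorly to the other grip, so successful transfer of location information across grips in the brain would point to a more abstract, grip-invariant location code. All weightings below are invented.

```python
# Sketch: does a raw-motif location decoder trained under grip A
# transfer to grip B? Grips are modeled as different (hypothetical)
# location-to-mode weightings, so the same touch location produces
# different motifs under different grips.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
fs, dur = 2000, 0.1
t = np.arange(0, dur, 1 / fs)
modes = np.array([60.0, 150.0, 340.0])   # hypothetical resonances (Hz)

grips = {
    "A": {0: [1.0, 0.2, 0.1], 1: [0.5, 1.0, 0.3], 2: [0.2, 0.4, 1.0]},
    "B": {0: [0.3, 1.0, 0.4], 1: [1.0, 0.3, 0.6], 2: [0.6, 0.1, 1.0]},
}

def spectrum(grip, loc):
    """Amplitude spectrum of a simulated motif for one grip and location."""
    sig = sum(w * np.exp(-20 * t) * np.sin(2 * np.pi * f * t)
              for w, f in zip(grips[grip][loc], modes))
    sig += 0.05 * rng.standard_normal(t.size)
    return np.abs(np.fft.rfft(sig))

y = np.repeat([0, 1, 2], 200)
X_A = np.array([spectrum("A", loc) for loc in y])
X_B = np.array([spectrum("B", loc) for loc in y])

clf = LogisticRegression(max_iter=2000).fit(X_A, y)
print(f"same grip:  {clf.score(X_A, y):.2f}")   # high: seen during training
print(f"cross grip: {clf.score(X_B, y):.2f}")   # poor: raw motifs changed
```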


Another interesting aspect is that S1 and parietal cortex probably process different aspects of tactile location. For instance, S1 may be more closely related to the physical conditions of tool and hand and may extract location in a grip-dependent manner, reminiscent of how it codes the body in close alignment with the peripheral organization into dermatomes [13]. In contrast, parietal cortex may code the tool in space, similar to how regular touch is projected from skin into space in this region [14–16]. Repetition suppression effects such as those observed by Miller et al. [2] should then occur whenever touch is repeated at a common spatial location, even if the tool is gripped differently or has been moved into a new arm-tool posture. That the brain codes the spatial location of touch on tools has indeed been suggested by previous research [6,17], similar to what is known to occur for events on the body [18]. It will be exciting to connect this previous work with the new paradigm of Miller et al. [2] to further unravel how far the equivalence of body and tool truly carries in the brain.

REFERENCES

1. Miller, L.E., Montroni, L., Koun, E., Salemme, R., Hayward, V., and Farnè, A. (2018). Sensing with tools extends somatosensory processing beyond the body. Nature 561, 239–242.

2. Miller, L.E., Fabio, C., Ravenda, V., Bahmad, S., Koun, E., Salemme, R., Luauté, J., Bolognini, N., Hayward, V., and Farnè, A. (2019). Somatosensory cortex efficiently processes touch located beyond the body. Curr. Biol. 29, 4276–4283.

3. Sanchez Panchuelo, R.M., Ackerley, R., Glover, P.M., Bowtell, R.W., Wessberg, J., Francis, S.T., and McGlone, F. (2016). Mapping quantal touch using 7 Tesla functional magnetic resonance imaging and single-unit intraneural microstimulation. eLife 5, e12812.

4. Abraira, V.E., and Ginty, D.D. (2013). The sensory neurons of touch. Neuron 79, 618–639.

5. Grill-Spector, K., Henson, R., and Martin, A. (2006). Repetition and the brain: neural models of stimulus-specific effects. Trends Cogn. Sci. 10, 14–23.

6. Yamamoto, S., and Kitazawa, S. (2001). Sensation at the tips of invisible tools. Nat. Neurosci. 4, 979–980.

7. Maravita, A., and Iriki, A. (2004). Tools for the body (schema). Trends Cogn. Sci. 8, 79–86.

8. Cardinali, L., Frassinetti, F., Brozzoli, C., Urquizar, C., Roy, A.C., and Farnè, A. (2009). Tool-use induces morphological updating of the body schema. Curr. Biol. 19, R478–R479.

9. Iriki, A., Tanaka, M., and Iwamura, Y. (1996). Coding of modified body schema during tool use by macaque postcentral neurones. Neuroreport 7, 2325–2330.

10. Holmes, N.P., and Spence, C. (2004). The body schema and multisensory representation(s) of peripersonal space. Cogn. Process. 5, 94–105.

11. Blankenburg, F., Ruff, C.C., Deichmann, R., Rees, G., and Driver, J. (2006). The cutaneous rabbit illusion affects human primary sensory cortex somatotopically. PLoS Biol. 4, e69.

12. Badde, S., Röder, B., and Heed, T. (2019). Feeling a touch to the hand on the foot. Curr. Biol. 29, 1491–1497.

13. Dietrich, C., Blume, K.R., Franz, M., Huonker, R., Carl, M., Preißler, S., Hofmann, G.O., Miltner, W.H.R., and Weiss, T. (2017). Dermatomal organization of SI leg representation in humans: revising the somatosensory homunculus. Cereb. Cortex 27, 4564–4569.

14. Bolognini, N., and Maravita, A. (2007). Proprioceptive alignment of visual and somatosensory maps in the posterior parietal cortex. Curr. Biol. 17, 1890–1895.

15. Azañón, E., Longo, M.R., Soto-Faraco, S., and Haggard, P. (2010). The posterior parietal cortex remaps touch into external space. Curr. Biol. 20, 1304–1309.

16. Medendorp, W.P., and Heed, T. (2019). State estimation in posterior parietal cortex: distinct poles of environmental and bodily states. Prog. Neurobiol. 183, 101691.

17. Maravita, A., Spence, C., and Driver, J. (2003). Multisensory integration and the body schema: close to hand and within reach. Curr. Biol. 13, R531–R539.

18. Heed, T., and Röder, B. (2010). Common anatomical and external coding for hands and feet in tactile attention: evidence from event-related potentials. J. Cogn. Neurosci. 22, 184–202.
