Crossmodal Association of Visual and Haptic Material Properties of Objects in the Monkey Ventral Visual Cortex

Highlights

- The impact of visuo-haptic experience of objects was tested in the monkey visual cortex
- Experience changed representation in the posterior inferior temporal cortex (PIT)
- Activity in the PIT came to well reflect non-visual material properties of the objects
- The PIT can learn and represent crossmodal association of visual and haptic properties

Goda et al., 2016, Current Biology 26, 928–934, April 4, 2016, ©2016 Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.cub.2016.02.003

Authors
Naokazu Goda, Isao Yokoi, Atsumichi Tachibana, Takafumi Minamimoto, Hidehiko Komatsu

Correspondence
[email protected]

In Brief
Goda et al. find that visuo-haptic experience of objects has a great impact on the representation of object properties (e.g., hardness) in the posterior inferior temporal cortex of monkeys, providing evidence that crossmodal associations of object properties are implicitly learned and represented in the ventral visual cortex of primates.


Report

Crossmodal Association of Visual and Haptic Material Properties of Objects in the Monkey Ventral Visual Cortex

Naokazu Goda,1,2,* Isao Yokoi,1,2 Atsumichi Tachibana,3 Takafumi Minamimoto,4 and Hidehiko Komatsu1,2

1 Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan
2 Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan
3 Department of Histology and Neurobiology, Dokkyo Medical University, Tochigi 321-0293, Japan
4 Department of Molecular Neuroimaging, National Institute of Radiological Sciences, Chiba 263-8555, Japan
*Correspondence: [email protected]
http://dx.doi.org/10.1016/j.cub.2016.02.003

SUMMARY

Just by looking at an object, we can recognize its non-visual properties, such as hardness. The visual recognition of non-visual object properties is generally accurate [1] and influences actions toward the object [2]. Recent studies suggest that, in the primate brain, this may involve the ventral visual cortex, which represents objects in a way that reflects not only visual but also non-visual object properties, such as haptic roughness, hardness, and weight [3–7]. This new insight raises a fundamental question: how does the visual cortex come to represent non-visual properties, knowledge that cannot be acquired directly through vision? Here we addressed this unresolved question using fMRI in macaque monkeys. Specifically, we explored whether and how simple visuo-haptic experience (just seeing and touching objects made of various materials) can shape representational content in the visual cortex. We measured brain activity evoked by viewing images of objects before and after the monkeys acquired the visuo-haptic experience and decoded the representational space from the activity patterns [8]. We show that simple long-term visuo-haptic experience greatly impacts representation in the posterior inferior temporal cortex, a higher ventral visual area. After the experience, but not before, the activity pattern in this region reflected the haptic material properties of the experienced objects well. Our results suggest that the neural representation of non-visual object properties in the visual cortex emerges through long-term crossmodal exposure to objects. This highlights the importance of unsupervised learning of crossmodal associations through everyday experience [9–12] for shaping representation in the visual cortex.

RESULTS

Two macaque monkeys were visuo-haptically exposed to rod-shaped real objects made of 36 materials (nine categories; four exemplars per category) through a simple behavioral task performed over 2 months (Figure 1A).

The materials included several with which the monkeys were naturally familiar or to which they were exposed in the animal facilities (e.g., fur and metal) [13], as well as others that were highly artificial and unfamiliar to them (e.g., ceramics, surface-finishing stones, colored glass, and fabrics). During the task, the monkey had to reach for, grasp, and pull the object to get a reward. The monkeys were not required to discriminate or memorize the visual or haptic properties of the objects during the task.

Before and after this long-term visuo-haptic experience, the two monkeys were scanned while they viewed images of the 36 objects used for the behavioral task (Figure 1B, set A; see Figure S1 for an enlarged version). The images were presented in a block design; one scanning run consisted of nine category blocks (about 15 s per block), in each of which four exemplars of the same category were presented, interleaved with fixation-only blocks (see the Supplemental Experimental Procedures). In addition, the monkeys were scanned while viewing images of additional exemplars (Figure 1B, set B) from the same nine material categories that were not used for the behavioral task. This enabled us to examine whether any effect of the visuo-haptic experience was specific to the samples used during the behavioral task or generalized to the other exemplars in the same category. During all scans, each monkey performed a fixation task so that top-down task demands would have a negligible effect on neural activity.

We then decoded the representational space of the nine material categories by applying representational similarity analysis [8] to the local fMRI activity patterns obtained before and after experience. We estimated the neural representational dissimilarity matrix (neural RDM), a summary of the neural similarity/dissimilarity across material categories that can be used to infer the content of information represented in a given region. Here we focused on the representational content within the traditionally unisensory visual areas V1, V2, V3, and V4 and the PIT (posterior inferior temporal [IT] cortex), which do not receive somatosensory inputs directly. These regions of interest (ROIs) were chosen based on our earlier study [3], which revealed that V4 and the PIT represent the material properties of objects (Figure 2A). For each ROI, we evaluated the dissimilarity between each pair of categories as the Euclidean distance between the corresponding pair of activity patterns [3, 4]. These were then averaged across the four hemispheres of the two monkeys to obtain a group-averaged neural RDM (Figure 2B).
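To make this step concrete, the following is a minimal sketch (our reconstruction in Python, not the authors' analysis code) of the neural RDM computation: one activity pattern per material category, compared pairwise by Euclidean distance. The array shapes follow the numbers quoted in this paper (nine categories; 500 voxels per ROI per hemisphere); the variable names are ours.

```python
# Minimal sketch of the neural RDM computation. Each row of `betas` stands in
# for one category's activity pattern (GLM beta values across an ROI's voxels).
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
n_categories, n_voxels = 9, 500                      # values quoted in the paper
betas = rng.normal(size=(n_categories, n_voxels))    # placeholder for real GLM betas

# 9 x 9 symmetric matrix of pairwise Euclidean distances; the 36 unique
# category pairs sit in its upper triangle.
neural_rdm = squareform(pdist(betas, metric="euclidean"))
```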



Figure 1. Experimental Design
(A) Rod-shaped, real-object stimuli used for the monkey behavioral task. The surfaces of the objects (diameter 17.7–28.0 mm) were made from nine categories of materials (four exemplars per category). The inset shows an example scene in which a monkey is grasping a glass object. The monkeys first had to fixate on the top of the object and then reach for, grasp, and pull the object with their hands. They were allowed to see the whole object from the first fixation until the end of the pull. Each monkey performed the task with nine objects (one exemplar from each of the nine categories; typically 30 trials per object) in one daily session, for a total of 32–44 sessions across 2 months. See the Supplemental Experimental Procedures.
(B) Visual images used for the monkey fMRI experiments. Set A contains images of the objects used for the behavioral task (shown in A), and set B contains images of four other exemplars from each of the nine categories. These images were presented during fMRI scans in a central 9.2° × 9.2° visual field, within which the object subtended 2.2°–3.5° in width and 9.2° in height, on a gray background (duration 500 ms, inter-stimulus interval >1,000 ms, twice per exemplar in each category block). See also Figure S1.

Figure 2. Comparison between Neural Representations before and after Experience
(A) Distributions of voxels used to analyze the neural representation of set A after experience (V1: green; V2: cyan; V3: blue; V4: purple; PIT: red). Each of the five ROIs contained the 500 most visually responsive voxels per hemisphere, determined using an independent dataset (i.e., a dataset excluding the data for set A after experience), within an areal boundary defined previously (see the Supplemental Experimental Procedures). Each color scale denotes the number of overlaps across four hemispheres (voxels in the left hemispheres are flipped). For clarity, only voxels overlapping across at least two hemispheres are shown. A, anterior; D, dorsal; IOS, inferior occipital sulcus; IPS, intraparietal sulcus; LuS, lunate sulcus; P, posterior; STS, superior temporal sulcus; V, ventral.
(B) Representational dissimilarity matrix obtained from the V1 activity pattern (neural RDM) for set A after experience. The activity pattern was estimated for each material category from the fMRI data (40 scanning runs per image set, both before and after experience) using a voxel-wise general linear model analysis (see the Supplemental Experimental Procedures). The color scale indicates the dissimilarity between category pairs (Euclidean distance of the activity patterns [beta values]); lighter colors indicate that the pair is more dissimilar. See Figure 1A for abbreviations of materials.
(C) Scatterplot showing the relationship between the neural RDMs from V1 for set A before and after experience. Each point represents the dissimilarity for one category pair. The Spearman correlation coefficient (r) is indicated. Error bars indicate the SEM across four hemispheres. See also Figures S2A and S2B.
(D) Correlation between the neural RDMs before and after experience for each of the five ROIs (set A: yellow; set B: blue). **p < 0.01, ***p < 0.001 (one-tailed permutation test, uncorrected).
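The ROI definition in (A) amounts to a simple selection rule. Below is a hedged sketch under our own naming; the per-voxel responsiveness score would come from an independent dataset, as the caption specifies, and is treated here as a generic input.

```python
# Hedged sketch of the ROI voxel selection in (A): within an anatomically
# defined areal boundary, keep the n most visually responsive voxels,
# scored on an independent dataset.
import numpy as np

def select_roi_voxels(responsiveness, area_mask, n_keep=500):
    """responsiveness: per-voxel visual-response score (1D float array);
    area_mask: boolean array marking voxels inside the areal boundary."""
    scores = np.where(area_mask, responsiveness, -np.inf)  # exclude outside voxels
    return np.argsort(scores)[::-1][:n_keep]               # indices of the top voxels
```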



Effect of Visuo-haptic Experience on Neural Representation

We first examined whether experience modified the neural RDMs. Comparison of the neural RDMs in V1 obtained before and after experience suggests that the rank order of the dissimilarities remained largely unchanged, although their overall level tended to decrease after experience (Figure 2C; see also Figure S2A for the two-dimensional representational spaces derived from multi-dimensional scaling [MDS] analysis). Based on this observation, we evaluated the Spearman rank correlation (r) between the neural RDMs obtained before and after experience as an index of the congruence between them. For both image sets, the neural RDMs before and after experience were highly correlated in V1, V2, and V4 (r > 0.554, p < 0.005, one-tailed permutation test; Figure 2D), indicating that the representations in these areas are stable. This long-term stability also indicates that the estimates of the neural RDMs are reliable and reproducible across two experiments separated by a long interval (more than 12 months). In contrast, the neural RDMs in the PIT for both image sets, and in V3 for set A, differed between before and after experience (Figure 2D; see also Figure S2B). Because the visual images were identical, the differences observed in these areas must be attributed to non-visual factors, possibly related to experience.
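To illustrate the comparison used here, the sketch below (our reconstruction, not the authors' code) computes the Spearman rank correlation between the upper triangles of two RDMs and a one-tailed permutation p value; we assume the null distribution is built by shuffling the category labels of one RDM, a standard choice in representational similarity analysis.

```python
# Spearman correlation between two RDMs with a one-tailed permutation test.
import numpy as np
from scipy.stats import spearmanr

def upper_triangle(rdm):
    """Vector of the unique category-pair dissimilarities (36 for a 9 x 9 RDM)."""
    i, j = np.triu_indices_from(rdm, k=1)
    return rdm[i, j]

def compare_rdms(rdm_a, rdm_b, n_perm=10_000, seed=0):
    rng = np.random.default_rng(seed)
    r_obs = spearmanr(upper_triangle(rdm_a), upper_triangle(rdm_b))[0]
    null = np.empty(n_perm)
    for k in range(n_perm):
        p = rng.permutation(rdm_b.shape[0])               # relabel categories at random
        null[k] = spearmanr(upper_triangle(rdm_a),
                            upper_triangle(rdm_b[np.ix_(p, p)]))[0]
    p_value = (np.sum(null >= r_obs) + 1) / (n_perm + 1)  # one-tailed
    return r_obs, p_value
```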


Figure 3. Representational Similarities between Neural Activities and the Material Properties of Objects
(A) Scatterplot for 72 images (set A: filled symbols; set B: open symbols) in a two-dimensional space derived from MDS analysis (nonmetric; stress 0.096 for the two dimensions) of visual ratings of material properties by human participants. See Figure 1A for abbreviations of materials.
(B) Representational dissimilarity matrix of the material properties (material RDM) obtained from humans' visual rating data for set A. The color scale indicates the dissimilarity between category pairs (Euclidean distance of the rating scores averaged across samples within each category).
(C) Scatterplots showing the relationship between the neural RDM obtained from the PIT and the material RDM (B) for set A (before experience: left; after experience: right). Each point represents the dissimilarity for one category pair. The Spearman correlation coefficient is indicated. Error bars indicate the SEM across four hemispheres. See also Figures S2B and S2C.
(D) Correlation between the neural RDMs obtained from the five ROIs and the material RDM (B) for set A (before experience: green; after experience: red). See also Figure S3. *p < 0.05, ***p < 0.001 (one-tailed permutation test, uncorrected).
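The two-dimensional spaces in (A) (and in Figure S2A) come from nonmetric MDS applied to dissimilarity data. A minimal sketch with scikit-learn, assuming a precomputed dissimilarity matrix as input:

```python
# Nonmetric MDS embedding of a precomputed dissimilarity matrix
# (e.g., a material RDM or neural RDM) into two dimensions.
from sklearn.manifold import MDS

def embed_2d(dissimilarities, seed=0):
    mds = MDS(n_components=2, metric=False,            # nonmetric variant
              dissimilarity="precomputed", random_state=seed)
    coords = mds.fit_transform(dissimilarities)        # (n_items, 2) coordinates
    return coords, mds.stress_                         # stress quantifies the fit
```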


Neural Representation of Material Properties

We next investigated how experience affected the neural representation. We hypothesized that, after visuo-haptic experience, the representation of materials would more strongly reflect the visuo-haptic properties of the materials, such as roughness or hardness. To assess this possibility, we compared the neural RDMs with a representational dissimilarity matrix for the visuo-haptic material properties, which we call the material RDM. We evaluated the material RDM by conducting a psychological experiment with humans, who already have rich experience with the nine materials (assuming general inter-species commonality in the visuo-haptic perception of experienced materials [13, 14]). Briefly, we asked human participants (n = 12) to view the images and rate their impressions of the materials using 12 bipolar adjective scales [4], including matte-glossy, smooth-rough, soft-hard, and light-heavy (see Table S1 for all adjective pairs). The obtained ratings tended to cluster by category [1, 4] (Figure 3A). We separately obtained a material RDM for each image set by computing the Euclidean distances of the ratings between category pairs (Figure 3B).

We then asked whether the neural RDMs for the exemplars visuo-haptically experienced during the behavioral task (set A) were related to the material RDM. We found that experience affected the correlation between the neural RDMs and the material RDM. Remarkably, the neural RDMs in the PIT, which showed no correlation with the material RDM before the experience, were highly correlated with it afterward (before: r = 0.114, p = 0.272; after: r = 0.551, p = 0.0008; one-tailed permutation test; Figures 3C and 3D; see also Figure S2C). The correlation in the PIT after experience remained significant after correcting for multiple comparisons across five ROIs and two conditions (p = 0.021, maximum-statistic method) and was also significantly higher than that before experience (p = 0.004, one-tailed bootstrap test). Moreover, a similar pattern of results was observed when the neural RDMs were evaluated using a different neural dissimilarity measure (see the Supplemental Experimental Procedures; Figures S3A and S3B) and even when data from one category were excluded (Figures S3C and S3D). These findings indicate that the representation in the PIT was altered by experience and came to reflect the visuo-haptic material properties of the exemplars well.
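To show how a material RDM follows from the rating data, here is a minimal sketch under assumed array shapes (12 participants × 9 categories × 4 exemplars × 12 adjective scales); this is our reconstruction, not the authors' analysis code.

```python
# Material RDM from bipolar adjective ratings: average across participants and
# across the exemplars within each category, then compare the mean rating
# profiles of category pairs by Euclidean distance.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(1)
ratings = rng.normal(size=(12, 9, 4, 12))            # placeholder rating data

category_profiles = ratings.mean(axis=(0, 2))        # 9 x 12 mean profile per category
material_rdm = squareform(pdist(category_profiles))  # 9 x 9 Euclidean distances
# material_rdm can then be correlated with a neural RDM exactly as in the
# permutation-test sketch above.
```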



Figure 4. Representation of Visual and Non-visual Material Properties in V4 and the PIT
(A) Visual and non-visual material RDMs estimated from humans' visual rating data for set A (left and middle), and the haptically estimated non-visual material RDM for set A (right). The format is the same as in Figure 3B. The haptically estimated RDM was derived from haptic rating experiments, in which 12 blindfolded human participants touched the 36 objects and rated their impressions of the materials using nine bipolar adjective scales (excluding the three visual adjective pairs; Table S1).
(B) Correlation between the neural RDMs and each of the three material RDMs shown in (A) (visual: light blue bars; non-visual: green; haptically estimated: yellow) for set A. The correlation with the original material RDM evaluated using all adjective pairs (Figure 3B) is also shown for comparison (dark blue bars: replots of Figure 3D). See also Figure S4A.
(C) The same analysis as in (B) for set B. The correlation with the haptically estimated material RDM is not shown because the haptic ratings were conducted only for the objects in set A. See also Figure S4B.
*p < 0.05, **p < 0.01, ***p < 0.001 (one-tailed permutation test, uncorrected).

In V4, the neural RDMs correlated significantly with the material RDM both before and after experience (before: r = 0.404, p = 0.019; after: r = 0.421, p = 0.016; Figure 3D), and these correlations did not differ significantly (p = 0.141). These results suggest that the representation of material properties in V4 is relatively stable (although possibly affected by experience; see Figures S3A and S3B). Other areas, including V3, showed neither a significant correlation with the material RDM (p > 0.069) nor a significant difference in correlation between before and after experience (p > 0.168). Thus, the effect of experience observed in V3 (Figure 2D) is unlikely to reflect a change in the representation of the visuo-haptic material properties.

Visual and Non-visual Aspects of the Material Property

We also assessed whether the neural RDMs in V4 and the PIT, where correlation with the material RDM was observed after experience, reflect the non-visual aspects of the material properties. For this purpose, we estimated material RDMs representing mainly (but not exclusively) the visual and the non-visual properties, using the humans' ratings on "visual" and "non-visual" adjective scales [4] (Table S1). The analysis with these material RDMs (Figure 4A, left and middle) revealed that the neural RDMs in the PIT strongly and reliably correlated with both material RDMs after experience (visual: r = 0.562, p = 0.003; non-visual: r = 0.448, p = 0.003; Figure 4B; see also Figure S4A for consistent results obtained with a different neural dissimilarity measure). The increase in correlation after experience was significant or marginally significant (visual: p < 0.004; non-visual: p = 0.059). This supports the notion that the PIT represents both visual and non-visual aspects of the material properties in a flexible way, depending on prior experience.

The neural RDMs in V4 correlated with only the visual material RDM before experience (visual: r = 0.586, p = 0.008) but with both the visual and non-visual material RDMs after experience (visual: r = 0.398, p = 0.039; non-visual: r = 0.389, p = 0.020). The correlation with the non-visual material RDM increased significantly after experience (visual: p = 0.813; non-visual: p = 0.004). These findings suggest that the representational content in V4 could also change with experience.

Although the analysis described above assumes that humans can veridically estimate non-visual material properties such as hardness just by viewing the images [1], there could be discrepancies between the visually estimated haptic properties and the actual ones. We therefore estimated the non-visual material RDM more directly by asking the participants for haptic ratings. The results obtained using this haptically estimated material RDM (Figure 4A, right) were similar to those obtained with the visually estimated non-visual material RDM. The correlation with the haptically estimated material RDM after experience was significant in V4 and the PIT (V4: r = 0.406, p = 0.015; PIT: r = 0.502, p = 0.001; Figure 4B), and in the PIT it was significantly higher than the correlation before experience (V4: p = 0.082; PIT: p = 0.004). This confirms that experience affects the representation of non-visual material properties in the PIT (and possibly in V4 as well).
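The before-versus-after comparisons above rely on a one-tailed bootstrap test of the difference between two RDM correlations. The resampling unit is our assumption (here, the 36 category pairs; the main text does not specify it), so the sketch below is illustrative only.

```python
# One-tailed bootstrap test for an increase in RDM correlation after experience,
# resampling category pairs with replacement (an assumed resampling unit).
import numpy as np
from scipy.stats import spearmanr

def bootstrap_increase(material_vec, before_vec, after_vec,
                       n_boot=10_000, seed=0):
    """Inputs are equal-length vectors of upper-triangle dissimilarities."""
    rng = np.random.default_rng(seed)
    n = material_vec.size
    diffs = np.empty(n_boot)
    for k in range(n_boot):
        idx = rng.integers(0, n, size=n)             # resample pairs with replacement
        diffs[k] = (spearmanr(material_vec[idx], after_vec[idx])[0]
                    - spearmanr(material_vec[idx], before_vec[idx])[0])
    return (np.sum(diffs <= 0) + 1) / (n_boot + 1)   # one-tailed p for "after > before"
```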


Category Generalization

Finally, we asked whether the effect of experience transfers to the other exemplars in the same category by testing whether the neural RDMs in V4 and the PIT for set B correlated with the material RDMs (for set B). The neural RDMs in these areas did not correlate with the material RDM for set B after experience (Figure 4C; see also Figure S4B), although the correlation with the non-visual material RDM increased significantly after experience (visual: p = 0.648 and 0.383; non-visual: p = 0.004 and 0.035; for V4 and the PIT, respectively). This lack of category generalization suggests that the effect of experience does not simply transfer across exemplars within a category; that is, the effect appears to be exemplar based rather than category based.

In V4, the pattern of correlations for set B before and after experience was similar to that for set A before experience. In these three conditions, the correlation with the visual material RDM was significant (set B before: r = 0.456, p = 0.045; set B after: r = 0.408, p = 0.029; Figure 4C; see also Figure S4B) and was significantly higher than the correlation with the non-visual material RDM (p = 0.004, 0.004, and 0.035 for set A before, set B before, and set B after, respectively). This trend suggests that V4 predominantly represents visual aspects of the material properties.

DISCUSSION

Representation of Non-visual Information in the Ventral Visual Cortex of Primates

Earlier studies in monkeys reported visuo-tactile multisensory neurons in the prefrontal, somatosensory, posterior parietal, and superior temporal regions of the cortex [15, 16]. To our knowledge, however, neither single-neuron nor fMRI activity in response to tactile/haptic stimuli has been reported in the ventral visual cortex [17], except for task-related responses in V4 during a tactile-visual matching task [18]. Our findings, together with our earlier results [3], provide new evidence that the PIT can represent haptic-related information about visually presented objects. This is consistent with recent findings in humans indicating that the ventral visual cortex (ventral occipitotemporal cortex) represents information about non-visual object properties [4–7]. These or even lower visual areas are also responsive when humans make haptic judgments about shapes [19, 20] and textures [21–24]. These findings suggest that the ventral visual cortex of humans and nonhuman primates represents information about objects more supramodally than previously thought.

Moreover, our results provide the first direct neurophysiological evidence that this neural representation is shaped through simple visuo-haptic exposure to objects, i.e., repeatedly seeing and touching them, without explicit, supervised training to memorize specific visuo-haptic associations. Thus, the representation of non-visual object properties in the ventral visual cortex likely emerges through implicit, unsupervised learning of visuo-haptic associations during everyday interaction with objects.

It is well known that when monkeys are trained to discriminate or memorize particular visual stimuli, or are familiarized with those stimuli, the response properties of neurons in the monkey IT change depending on the visual experience [25, 26]. One might therefore argue that the observed effect is caused by visual, not haptic, experience. Our detailed analysis, however, showed that the neural RDM in the PIT after experience strongly correlates with the haptically estimated RDM (Figure 4B). It is unlikely that this reflects visual experience alone.

Moreover, some studies have shown that fMRI activity patterns are immune to visual training and familiarization in adult monkeys [27, 28], although the activity magnitudes may change [29]. Taken together, the representational changes in the PIT are likely, at least in part, crossmodal effects caused by simultaneous exposure to the visual and haptic features of objects.

Transformation from Image Features to Supramodal Material Representation

Recent studies suggest that a visual image of an object is rich in information about material properties, and that it is possible to predict non-visual material properties such as roughness, hardness, and coldness by appropriate use of image features [30–32]. Neurons in the higher visual areas may represent image features that strongly correlate with the non-visual material properties of objects. Such information may be acquired through everyday experience by learning crossmodal statistical regularities, such as the co-occurrence relationships between visual and non-visual features [9–12]. On the basis of this idea and earlier findings, we suggest the following scenario for how the monkey visual cortex can come to represent non-visual material properties. Along the ventral visual pathway through V2 [33], V4 [34, 35], the PIT, and the more anterior IT [36–38], visual features are progressively transformed into those important for material recognition and categorization [3, 39, 40]. In V4, neurons represent high-level statistical visual features diagnostic of materials [35], but these are only loosely related to the non-visual material properties. Visual features better correlated with non-visual properties would be represented at the level of the PIT. This representation reflects statistical crossmodal associations learned without supervision through long-term visuo-haptic exposure. Because, to our knowledge, there is no direct pathway from the somatosensory areas to the PIT, this learning likely involves other areas, such as the posterior parietal and medial temporal regions, as has been suggested for humans [41]. Presumably, long-term learning shapes the representation in the PIT by modifying feedback influences from those areas to the PIT and/or the feedforward transformation from V4 to the PIT.

Within this framework, the observed lack of category generalization may be explained by overlearning of the visuo-haptic association. There are plenty of visual features that could potentially be related to haptic features, and the number of samples used for learning in the present study may not be large enough for generalization. In such a situation, the learned visuo-haptic associations would be optimal for the visuo-haptically experienced objects but unreliable for new samples. This possibility should be tested in the future by probing generalization with much larger numbers of samples.

In the present study, the PIT did not reflect the material properties of objects before experience, whereas it did in our earlier study [3]. We suppose this discrepancy might be partly attributable to the typicality of the exemplars. The objects used here included unfamiliar and non-typical exemplars (e.g., surface-finishing stones), whereas computer-graphics images with more typical (and even "caricatured") appearances were used in the earlier study. Consequently, the associations learned through prior experience might not have been optimal for the objects used in the present study. Other factors potentially related to the discrepancy are the size and shape of the objects: the present study used images of thinner rods with less-controlled shapes than those used earlier.


Under these conditions, material-related responses to the object surface might be obscured by the object contours, especially in the PIT, where neurons generally have large receptive fields and complex shape tunings.

The other issue that remains unresolved is the effect of experience on V3. One possibility is that the effect observed in V3 mainly reflects changes in the representation of 3D shape, one of the object properties important for grasping behavior. Monkey V3/V3A is highly sensitive to 3D shape [42, 43] and is thought to be an important link to the action-related, multimodal areas in the parietal cortex [44]. It would thus be of great interest to examine in the future how visuo-haptic experience impacts V3 and the parietal action-related network.

In summary, we demonstrated for the first time that simple visuo-haptic experience with real-world materials shapes neural representations in the posterior part of the monkey IT, which does not receive somatosensory information directly. We suggest that this reflects statistical visuo-haptic crossmodal associations learned through simultaneous exposure to the visual and haptic features of objects. This finding provides insight into how supramodal, and possibly more conceptual/semantic, knowledge about objects [45] is acquired in the visual cortex.

SUPPLEMENTAL INFORMATION

Supplemental Information includes Supplemental Experimental Procedures, four figures, and one table and can be found with this article online at http://dx.doi.org/10.1016/j.cub.2016.02.003.

AUTHOR CONTRIBUTIONS

N.G., I.Y., T.M., and H.K. designed the research. I.Y. prepared the real-object stimuli. N.G. and A.T. performed the fMRI experiments. I.Y. and A.T. performed the behavioral experiments. N.G. performed the human experiments. N.G., I.Y., and A.T. analyzed the data. N.G., I.Y., A.T., T.M., and H.K. interpreted the data. N.G. and H.K. wrote the manuscript.

ACKNOWLEDGMENTS

We greatly thank T. Ohta for assistance with animal training and data collection; K. Takenaka, R. Kitada, and N. Sadato for recruitment of the participants; K. Matsuda for providing the eye-tracking software; and M. Takagi for technical assistance. We also thank all those who provided the real objects: E. Fujiwara and members of Glass-koubou-AOI for glass objects, members of Higashi Park in Okazaki for bark objects, H. Niimi for stone objects, Y. Oka and H. Kobayakawa at the World Children's Art Museum in Okazaki for ceramic objects, and members of the Equipment Development Center at the Institute for Molecular Science for prototyping the metal objects. This study was supported by JSPS KAKENHI grants 25330179 (to N.G.), 26330323 (to I.Y.), and 22135007 and 15H05916 (to N.G. and H.K.). Informed written consent was obtained from all human participants, and the protocol for the human experiments was approved by the local ethics committee at the National Institute for Physiological Sciences. The protocol for animal care and experimentation was in accordance with ILAR guidelines and was approved by the Animal Experiment Committee of the National Institutes of Natural Sciences.

Received: October 30, 2015
Revised: December 15, 2015
Accepted: February 1, 2016
Published: March 17, 2016

REFERENCES

1. Baumgartner, E., Wiebel, C.B., and Gegenfurtner, K.R. (2013). Visual and haptic representations of material properties. Multisens. Res. 26, 429–455.
2. Buckingham, G., Cant, J.S., and Goodale, M.A. (2009). Living in a material world: how visual cues to material properties affect the way that we lift objects and perceive their weight. J. Neurophysiol. 102, 3111–3118.
3. Goda, N., Tachibana, A., Okazawa, G., and Komatsu, H. (2014). Representation of the material properties of objects in the visual cortex of nonhuman primates. J. Neurosci. 34, 2660–2673.
4. Hiramatsu, C., Goda, N., and Komatsu, H. (2011). Transformation from image-based to perceptual representation of materials along the human ventral visual pathway. Neuroimage 57, 482–494.
5. Cant, J.S., and Goodale, M.A. (2011). Scratching beneath the surface: new insights into the functional properties of the lateral occipital area and parahippocampal place area. J. Neurosci. 31, 8248–8258.
6. Gallivan, J.P., Cant, J.S., Goodale, M.A., and Flanagan, J.R. (2014). Representation of object weight in human ventral visual cortex. Curr. Biol. 24, 1866–1873.
7. Eck, J., Kaas, A.L., Mulders, J.L., Hausfeld, L., Kourtzi, Z., and Goebel, R. (2016). The effect of task instruction on haptic texture processing: the neural underpinning of roughness and spatial density perception. Cereb. Cortex 26, 384–401.
8. Kriegeskorte, N., and Kievit, R.A. (2013). Representational geometry: integrating cognition, computation, and the brain. Trends Cogn. Sci. 17, 401–412.
9. Seitz, A.R., Kim, R., van Wassenhove, V., and Shams, L. (2007). Simultaneous and independent acquisition of multisensory and unisensory associations. Perception 36, 1445–1453.
10. Ernst, M.O. (2007). Learning to integrate arbitrary signals from vision and touch. J. Vis. 7, 7.1–14.
11. Flanagan, J.R., Bittner, J.P., and Johansson, R.S. (2008). Experience can change distinct size-weight priors engaged in lifting objects and judging their weights. Curr. Biol. 18, 1742–1747.
12. Spence, C. (2011). Crossmodal correspondences: a tutorial review. Atten. Percept. Psychophys. 73, 971–995.
13. Hiramatsu, C., and Fujita, K. (2015). Visual categorization of surface qualities of materials by capuchin monkeys and humans. Vision Res. 115 (Pt A), 71–82.
14. Weber, A.I., Saal, H.P., Lieber, J.D., Cheng, J.-W., Manfredi, L.R., Dammann, J.F., III, and Bensmaia, S.J. (2013). Spatial and temporal codes mediate the tactile perception of natural textures. Proc. Natl. Acad. Sci. USA 110, 17107–17112.
15. Ghazanfar, A.A., and Schroeder, C.E. (2006). Is neocortex essentially multisensory? Trends Cogn. Sci. 10, 278–285.
16. Driver, J., and Noesselt, T. (2008). Multisensory interplay reveals crossmodal influences on 'sensory-specific' brain regions, neural responses, and judgments. Neuron 57, 11–23.
17. Guipponi, O., Cléry, J., Odouard, S., Wardak, C., and Ben Hamed, S. (2015). Whole brain mapping of visual and tactile convergence in the macaque monkey. Neuroimage 117, 93–102.
18. Haenny, P.E., Maunsell, J.H.R., and Schiller, P.H. (1988). State dependent activity in monkey visual cortex. II. Retinal and extraretinal factors in V4. Exp. Brain Res. 69, 245–259.
19. Amedi, A., von Kriegstein, K., van Atteveldt, N.M., Beauchamp, M.S., and Naumer, M.J. (2005). Functional imaging of human crossmodal identification and object recognition. Exp. Brain Res. 166, 559–571.
20. Lacey, S., and Sathian, K. (2014). Visuo-haptic multisensory object recognition, categorization, and representation. Front. Psychol. 5, 730.
21. Stilla, R., and Sathian, K. (2008). Selective visuo-haptic processing of shape and texture. Hum. Brain Mapp. 29, 1123–1138.
22. Sathian, K., Lacey, S., Stilla, R., Gibson, G.O., Deshpande, G., Hu, X., Laconte, S., and Glielmi, C. (2011). Dual pathways for haptic and visual perception of spatial and texture information. Neuroimage 57, 462–475.
23. Eck, J., Kaas, A.L., and Goebel, R. (2013). Crossmodal interactions of haptic and visual texture information in early sensory cortex. Neuroimage 75, 123–135.
24. Podrebarac, S.K., Goodale, M.A., and Snow, J.C. (2014). Are visual texture-selective areas recruited during haptic texture discrimination? Neuroimage 94, 129–137.
25. Kourtzi, Z., and DiCarlo, J.J. (2006). Learning and neural plasticity in visual object recognition. Curr. Opin. Neurobiol. 16, 152–158.
26. Op de Beeck, H.P., and Baker, C.I. (2010). The neural basis of visual object learning. Trends Cogn. Sci. 14, 22–30.
27. Op de Beeck, H.P., Deutsch, J.A., Vanduffel, W., Kanwisher, N.G., and DiCarlo, J.J. (2008). A stable topography of selectivity for unfamiliar shape classes in monkey inferior temporal cortex. Cereb. Cortex 18, 1676–1694.
28. Srihasam, K., Mandeville, J.B., Morocz, I.A., Sullivan, K.J., and Livingstone, M.S. (2012). Behavioral and anatomical consequences of early versus late symbol training in macaques. Neuron 73, 608–619.
29. Adab, H.Z., Popivanov, I.D., Vanduffel, W., and Vogels, R. (2014). Perceptual learning of simple stimuli modifies stimulus representations in posterior inferior temporal cortex. J. Cogn. Neurosci. 26, 2187–2200.
30. Abe, T., Okatani, T., and Deguchi, K. (2012). Recognizing surface qualities from natural images based on learning to rank. In Proceedings of the 21st International Conference on Pattern Recognition (ICPR 2012 Organizing Committee), pp. 3712–3715.
31. Schwartz, G., and Nishino, K. (2013). Visual material traits: recognizing per-pixel material context. In Proceedings of the 2013 IEEE International Conference on Computer Vision Workshops (IEEE Computer Society), pp. 883–890.
32. Giesel, M., and Zaidi, Q. (2013). Frequency-based heuristics for material perception. J. Vis. 13, 7.1–19.
33. Freeman, J., Ziemba, C.M., Heeger, D.J., Simoncelli, E.P., and Movshon, J.A. (2013). A functional and perceptual signature of the second visual area in primates. Nat. Neurosci. 16, 974–981.
34. Arcizet, F., Jouffrais, C., and Girard, P. (2008). Natural textures classification in area V4 of the macaque monkey. Exp. Brain Res. 189, 109–120.
35. Okazawa, G., Tajima, S., and Komatsu, H. (2015). Image statistics underlying natural texture selectivity of neurons in macaque V4. Proc. Natl. Acad. Sci. USA 112, E351–E360.
36. Köteles, K., De Mazière, P.A., Van Hulle, M., Orban, G.A., and Vogels, R. (2008). Coding of images of materials by macaque inferior temporal cortical neurons. Eur. J. Neurosci. 27, 466–482.
37. Nishio, A., Goda, N., and Komatsu, H. (2012). Neural selectivity and representation of gloss in the monkey inferior temporal cortex. J. Neurosci. 32, 10780–10793.
38. Nishio, A., Shimokawa, T., Goda, N., and Komatsu, H. (2014). Perceptual gloss parameters are encoded by population responses in the monkey inferior temporal cortex. J. Neurosci. 34, 11143–11151.
39. Okazawa, G., Goda, N., and Komatsu, H. (2012). Selective responses to specular surfaces in the macaque visual cortex revealed by fMRI. Neuroimage 63, 1321–1333.
40. Orban, G.A., Zhu, Q., and Vanduffel, W. (2014). The transition in the ventral stream from feature to real-world entity representations. Front. Psychol. 5, 695.
41. Kitada, R., Sasaki, A.T., Okamoto, Y., Kochiyama, T., and Sadato, N. (2014). Role of the precuneus in the detection of incongruency between tactile and visual texture information: a functional MRI study. Neuropsychologia 64, 252–262.
42. Tsao, D.Y., Vanduffel, W., Sasaki, Y., Fize, D., Knutsen, T.A., Mandeville, J.B., Wald, L.L., Dale, A.M., Rosen, B.R., Van Essen, D.C., et al. (2003). Stereopsis activates V3A and caudal intraparietal areas in macaques and humans. Neuron 39, 555–568.
43. Nelissen, K., Joly, O., Durand, J.-B., Todd, J.T., Vanduffel, W., and Orban, G.A. (2009). The extraction of depth structure from shading and texture in the macaque brain. PLoS ONE 4, e8306–e8311.
44. Nelissen, K., and Vanduffel, W. (2011). Grasping-related functional magnetic resonance imaging brain responses in the macaque monkey. J. Neurosci. 31, 8220–8229.
45. Carlson, T.A., Simmons, R.A., Kriegeskorte, N., and Slevc, L.R. (2014). The emergence of semantic meaning in the ventral temporal pathway. J. Cogn. Neurosci. 26, 120–131.