Spatial patterns in tactile perception: Is there a tactile field?

Acta Psychologica 137 (2011) 65–75

Patrick Haggard a,⁎, Giulia Giovagnoli b,c

a Institute of Cognitive Neuroscience, University College London, UK
b Centro studi e ricerche in Neuroscienze Cognitive, Cesena, Italy
c Department of Psychology, University of Bologna, Italy

⁎ Corresponding author at: Institute of Cognitive Neuroscience, 17 Queen Square, London WC1N 3AR, UK. Tel.: +44 207 6791153. E-mail address: [email protected] (P. Haggard).

Article history: Received 6 May 2010; received in revised form 26 February 2011; accepted 2 March 2011; available online 5 April 2011.

PsycINFO classification: 2320

Keywords: Touch; Perception; Human; Grouping; Pattern; Hyperacuity

Abstract

Previous studies of tactile spatial perception focussed either on a single point of stimulation, on local patterns within a single skin region such as the fingertip, on tactile motion, or on active touch. It remains unclear whether we should speak of a tactile field, analogous to the visual field, and supporting spatial relations between stimulus locations. Here we investigate this question by studying perception of large-scale tactile spatial patterns on the hand, arm and back. Experiment 1 investigated the relation between perception of tactile patterns and the identification of subsets of those patterns. The results suggest that perception of tactile spatial patterns is based on representing the spatial relations between locations of individual stimuli. Experiment 2 investigated the spatial and temporal organising principles underlying these relations. Experiment 3 showed that tactile pattern perception makes reference to structural representations of the body, such as body parts separated by joints. Experiment 4 found that precision of pattern perception is poorer for tactile patterns that extend across the midline, compared to unilateral patterns. Overall, the results suggest that the human sense of touch involves a tactile field, analogous to the visual field. The tactile field supports computation of spatial relations between individual stimulus locations, and thus underlies tactile pattern perception.

Crown Copyright © 2011 Published by Elsevier B.V. All rights reserved.

1. Introduction

The spatial quality of touch is often studied, but remains poorly understood. The most studied aspect is tactile localization. This refers to the ability to identify the place where a single tactile stimulus touches the skin. This location can be specified in two ways: skin-space and external-space. Skin-space localization describes stimuli in terms of their position on the flattened receptor surface. External-space localization describes the stimulus by giving coordinates in an egocentric representation. Critically, external-space locations are updated when the body part moves, while skin-space locations are not. Many recent psychophysical studies underline the important role of external space in tactile representation (Spence, Pavani, & Driver, 2004; Maravita, Spence, & Driver, 2003). Tactile localization is often considered in the context of spatial orienting responses, such as eye movements, pointing movements or orienting of spatial attention to the touched location. Indeed, Lotze's (1885) theory of local signs makes orienting the basis of spatial perception. Lotze asked how a signal consisting of neural impulses, which are not intrinsically spatial, could acquire spatial phenomenal qualities. He suggested that the association between a specific tactile stimulus (e.g., a bump on the head) and an orienting response (reaching

out to rub the wound) would lead to a spatial percept if sufficiently reinforced. On this view, the spatial quality of touch is not intrinsic, but derives from the spatial quality of action. Lotze's approach effectively considers touch as a single target point for an immediate, aimed movement. However, a second spatial aspect of touch, which we call spatial pattern perception, seems to require a quite different kind of spatial representation. Spatial pattern perception involves perceiving the angles, distances and forms created by multiple tactile stimuli. For example, judging whether three tactile stimuli are colinear requires representing the spatial relations between the stimuli within a continuous common space, or tactile field. We use this term, by analogy to the ‘visual field’, to mean the “spatial array of … sensations” available to perception (Smythies, 1996). This array has a distinctive spatial organisation because of the arrangement of receptive fields on the receptor surface (skin or retina). The spatial organisation may be progressively modified at various levels of the perceptual pathway according to the function of each processing stage, but the general principle of systematic organisation of neurons according to their receptive fields appears to be preserved. Studies of the visual system suggest that visual field-based organisation underlies pattern and form perception (Wandell, Dumoulin, & Brewer, 2007), motion perception (Kolster et al., 2009), and the binding of local features to form overall patterns (Freeman, Driver, Sagi, & Li, 2003). There is thus a clear link between the concepts of spatial pattern perception and a perceptual field.
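To make the notion of a spatial relation between touches concrete, the following Python sketch (ours, purely illustrative and not part of the original study) computes the signed perpendicular offset of a third tactile location from the line joining two others; judging collinearity, or whether a target lies to the left or right of a flanker-defined line, amounts to examining the sign and magnitude of this quantity. The coordinates and function name are hypothetical.

```python
import math

def signed_offset(flanker_a, flanker_b, target):
    """Signed perpendicular distance of `target` from the line through
    `flanker_a` and `flanker_b` (2-D skin-surface coordinates, cm).
    Zero means the three points are collinear; the sign distinguishes
    the two sides of the line."""
    ax, ay = flanker_a
    bx, by = flanker_b
    tx, ty = target
    # 2-D cross product of (B - A) with (T - A), divided by |B - A|,
    # gives the perpendicular component of T relative to the line AB.
    cross = (bx - ax) * (ty - ay) - (by - ay) * (tx - ax)
    return cross / math.hypot(bx - ax, by - ay)

# Two flankers define a vertical line; a target 0.5 cm to one side
# yields an offset of -0.5, while a collinear target yields 0.0.
print(signed_offset((0.0, 0.0), (0.0, 3.0), (0.5, 1.5)))  # -0.5
print(signed_offset((0.0, 0.0), (0.0, 3.0), (0.0, 1.5)))  # 0.0
```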



It remains unclear whether the human sense of touch supports a tactile field organisation with the key quality that characterises visual field-based organisation, namely a systematic organisation of receptive fields capable of supporting representation of spatial patterns. Our approach here is to look for evidence for representation of the spatial relations between individual touch locations, resulting in perception of a spatial pattern as a whole. If this evidence is convincing, we can conclude that a tactile field does exist, and we can investigate its limits and underlying mechanisms. Previous work on tactile pattern perception generally has one or more of the following three properties: (1) spatial patterns are confined to a local skin region (Manning & Tremblay, 2006; Van Boven, Ingeholm, Beauchamp, Bikle, & Ungerleider, 2005), (2) spatial quality depends on the motion or temporal succession of tactile stimuli, and (3) spatial quality depends on haptic exploration of a stimulus (“active touch”). These features may also contribute to the spatial quality of touch, but, crucially, they do not of themselves imply a true tactile-field organisation. For example, many previous studies of tactile pattern perception have focussed on restricted areas of very high acuity skin, such as the fingertip or tongue (Henkin & Banks, 1967; Craig & Qian, 1997; Sathian & Zangaladze, 2001; Van Boven et al., 2005). In these small regions, it is difficult to distinguish perception of an overall spatial pattern from local interactions between immediately adjacent stimuli. Local interactions may reflect lateral inhibition (Haggard, Christakou, & Serino, 2007), or integration of information from overlapping receptive fields (Loomis, 1979), neither of which requires explicit representation of spatial patterns. In the visual system, large-scale lateral interactions (Freeman et al., 2003) are thought to involve a different mechanism from the local edge detection provided by on-centre off-surround receptive field organisation. By analogy, tactile field organisation should involve spatial patterns which are too large to be explained by local interaction mechanisms. Second, tactile spatial patterns have often been studied using moving, or apparently moving, stimuli, as in the Cutaneous Rabbit Effect (CRE: Geldard & Sherrick, 1972). A temporal succession of individual tactile locations then produces a spatial percept. However, the underlying relation between the stimuli appears to be temporal rather than spatial: CRE studies show that the perceived position of a tap depends strongly on the time at which it is delivered. For this reason, we suggest that evidence for a tactile field organisation should avoid tactile motion perception. Third, many studies emphasise the contribution of active touch to spatial pattern perception. However, the spatial quality of active touch depends on proprioception as much as on touch. For example, when we haptically explore an object and perceive metric properties such as object shape (Klatzky & Lederman, 2003; Lederman & Klatzky, 2004; Reed, Klatzky, & Halgren, 2005), the spatial percept depends on proprioceptive information about body movement, while tactile information may be confined to a single point such as the fingertip. In Molyneux's example (Evans, 1985), one might identify a cube by tracing its edges with a fingertip. The resulting tactile sensations might have almost no spatial variation across the skin at all.
Finally, previous work suggests that the ability to represent and orient to multiple tactile stimuli presented in parallel is poor. Gallace and Spence (2007) showed that subjects are effectively unable to perceive more than 1 touch at a time, though Riggs et al. (2006) give a more liberal estimate of 3 stimuli. This limited capacity of touch might suggest a difficulty in representing the spatial relations between multiple stimuli, and therefore a difficulty with spatial pattern perception. Therefore, it remains unclear whether a tactile field exists, how it is spatially organised, and how it is used in spatial pattern perception. To address these issues we should first identify what might count as evidence for a tactile field. We suggest that the ability to represent the spatial relations between multiple tactile stimuli is a sufficient condition for a tactile field. Interestingly, the representation of spatial relations between stimuli can be distinguished from representing the spatial location of a single stimulus. An interesting example of this

dissociation is the finding of hyperacuity in vision (Westheimer & McKee, 1977), and in touch (Loomis, 1979). Hyperacuity refers to instances of sensory spatial resolution greater than would be expected from the receptive field size of individual receptors. For example, Loomis (1979) calculated tactile Vernier acuity thresholds by asking participants to judge the alignment of two tactile bars presented to the fingertip. These thresholds averaged just 15% of the two-point discrimination threshold. The improvement was attributed to representation of the spatial pattern formed by the two aligned bars, i.e., of the spatial relations between stimulus elements, in addition to the spatial location of each element individually. Loomis argued that the organisation of the receptive fields in the skin played a key role in computing these spatial relations. Since receptive fields on the fingertip overlap, the relative levels of excitation of several neurons within a local population can be used to represent spatial locations more accurately than the width of any single receptive field would imply. However, there appears to be a second situation where hyperacuity effects can arise, but which cannot be explained by local integration of inputs. These are cases involving stimulus patterns extending over skin regions larger than the degree of local RF overlap. Large-scale pattern effects are known to occur in vision (Khoe, Freeman, Woldorff, & Mangun, 2004), and involve attentional binding of information from different parts of the receptor surface (Freeman et al., 2003), but have not previously been studied in touch. In a previous study (Serino, Giovagnoli, de Vignemont, & Haggard, 2008), we investigated whether two simultaneous brief flanker touches on the palm of the hand influenced perception of a subsequent target touch in a spatially dependent way. We found that judgments of target intensity were more strongly affected by flanker intensity when the target lay on the line joining the flankers, than when the target was distant from this line. This result suggests that collinear tactile stimuli are an important test case for tactile field representation, that such stimuli are integrated to form an overall spatial pattern, and that pattern formation results in fusion between elements, causing loss of information about individual stimuli within the pattern. While the above study suggests that a field-based organisation may exist, it does not reveal its spatial properties, other than showing the importance of colinearity. Here we use explicit judgments about the spatial relation of a tactile target stimulus relative to a line joining two other tactile stimuli to investigate the spatial factors that influence pattern perception in touch. To investigate the pure spatiality of touch, without additional contributions from proprioception, we have focussed on large-scale judgements on areas of relatively low acuity, in the absence of body movement. Experiment 1 showed that the ability to perceive a tactile pattern defined by several stimuli is as good as might be expected from the perception of subsets of the pattern. Experiment 2 revealed the spatial extent and temporal persistence of the tactile field evoked in a tactile line-judgement task. Experiment 3 showed a relative deterioration in tactile line-judgement for tactile patterns which bridge two body parts. Experiment 4 showed a drop in tactile line-judgement for tactile patterns which bridge both sides of the body, and thus both hemispheres.
We conclude that a tactile field analogous to the visual field exists, and that touch is therefore a fully spatial sense. Moreover, perception of spatial patterns across the field is linked to a structural representation of one's own body. Perception of a tactile pattern based on the spatial relations between stimuli therefore involves at least a basic element of self-representation.

2. Experiment 1

Eight subjects (mean age 34.75, SD 15.07, 4 female, all right-handed) participated on the basis of written informed consent. Subjects rested the right hand palm downwards on a plasticine mould, in which were embedded ten customised miniature solenoids as shown in Fig. 1A. The operation of the solenoids was invisible and inaudible, but subjects viewed a picture like Fig. 1A, with numerical labels that they could use to


identify tactile locations. In a target localisation condition, one of the four central solenoids was vibrated, and subjects identified it with an unspeeded verbal response. The vibration was randomly and equiprobably given at 30, 40 or 50 Hz, with 100 or 200 ms duration, and at one of the 4 target solenoid locations. Each combination of location, frequency and duration was presented 8 times, making 192 trials in total. Frequency and duration were irrelevant to the task, but were varied to ensure that participants responded to location rather than to any specific phenomenal properties of each solenoid. In another block, one of the three upper flanker solenoids was vibrated, with the schedule of frequencies, durations and repetitions as before, giving 144 trials. The lower flanker solenoids were similarly tested in another block. In each case, participants were informed before each block which row of solenoids would be tested. Further blocks required subjects to identify a tactile pattern formed by vibration of several solenoids. In a line-judgement block, one upper row solenoid and the corresponding lower row solenoid were vibrated simultaneously, with the same duration and frequency. These vibrations defined 3 parallel lines, shown as vertical dashed lines in Fig. 1A, and participants identified which line they perceived, across 144 trials. Finally, in an alignment-judgement block, the upper and lower flankers were first vibrated to define a line, as described before. After an 800 ms inter-stimulus interval, one of the four central targets was activated at random and equiprobably. Participants judged whether the target was to the left or right of the line defined by the two flankers. Target frequency and duration were chosen at random from the combinations described above, to give a total of 432 trials. Each of these 5 conditions was divided into 2 blocks of equal length. One block in each condition was presented to each subject in a randomised order. After a short break, the remaining blocks were performed in the reversed order.
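As a minimal illustration of the factorial design just described (our sketch, not the authors' software; the variable names are hypothetical), the randomised schedule for the target-localisation block, with every combination of location, frequency and duration presented 8 times, could be generated as follows.

```python
import itertools
import random

# Factors of the target-localisation block in Experiment 1.
TARGET_LOCATIONS = [1, 2, 3, 4]   # the four central solenoids
FREQUENCIES_HZ = [30, 40, 50]     # task-irrelevant; varied so that responses
DURATIONS_MS = [100, 200]         # reflect location, not solenoid "feel"
REPETITIONS = 8

def build_schedule(locations, frequencies, durations, repetitions, seed=None):
    """Return a randomised list of (location, frequency_hz, duration_ms)
    trials with every factor combination presented `repetitions` times."""
    trials = [combo
              for combo in itertools.product(locations, frequencies, durations)
              for _ in range(repetitions)]
    random.Random(seed).shuffle(trials)
    return trials

schedule = build_schedule(TARGET_LOCATIONS, FREQUENCIES_HZ, DURATIONS_MS,
                          REPETITIONS, seed=1)
print(len(schedule))   # 192 trials, as stated in the text
print(schedule[:3])    # e.g. [(2, 40, 100), (4, 30, 200), (1, 50, 100)]
```

The same construction, with three flanker locations instead of four targets, yields the 144-trial flanker blocks.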

2.1. Results and discussion

The percentage correct judgements in each condition are shown in Table 1. Alignment judgements followed a standard psychometric function, with performance improving as the perpendicular distance from the target to the line joining the two flankers increased (Fig. 1B). Accuracy was well above chance in all conditions. Our interest lay primarily in whether performance in the alignment-judgement condition differed from the level predicted from performance in localizing the flanker line and in localizing the target solenoid individually.


Table 1
Performance in Experiment 1.

Condition                        Chance level (%)    Mean accuracy (%)    Standard deviation (%)
Flanker localisation
  Distal flankers                33                   79.6                 11.1
  Proximal flankers              33                   84.2                 9.4
Inter-flanker line judgement     33                   82.6                 11.5
Target localisation              25                   73.7                 12.7
Alignment judgement              50                   86.4                 6.1

[Fig. 1A labels: “Flankers 3 cm”, “Targets 3 cm”, “Flankers 1.1 cm”. Fig. 1B axes: pr(“Right”), 0.00 to 1.00, against distance from flanker line (−3 to 3 cm).]

Fig. 1. A. Arrangement of tactile stimuli in Experiment 1. Note that the stimuli are delivered to the subject's right hand which is placed palm downwards on the table to cover the stimulators. B. Mean performance in alignment-judgement task in Experiment 1. Error bars show standard deviation across participants.

We therefore took as a null hypothesis the possibility that alignment judgement simply reflects the operation of independent processes of line localisation and target localisation. On this view, alignment judgement performance, expressed as a probability, should simply be the product of line localisation and target localisation performance. Our experimental hypothesis was that alignment judgement depends on an additional process of representing the overall pattern formed by the flanker line and the target. This hypothesis predicts that alignment performance should be better than predicted from the null independence hypothesis. We therefore compared alignment-judgement accuracy to the product of line-judgement and target localization accuracies. Line-judgement accuracy averaged 82.6%, and was comparable to the localisation of either the proximal or distal flankers alone. Since the lines were formed by covarying the proximal and distal flankers, rather than manipulating them independently, this is unsurprising. Target-judgement accuracy averaged 73.7%. Mean performance in the alignment judgement condition was 86.4%. However, it may be objected that analyses relating accuracy of alignment to accuracy of flanker line and target judgement are too liberal. First, the chance level is higher for alignment judgement (2AFC) than for flanker line judgement (3AFC) or target judgement (4AFC). Instead, one should predict performance in the alignment judgement task from individual patterns of localisation judgements in the flanker and target tasks. For example, a participant might base their alignment judgement on an incorrect perception of the flanker line and/or the target location. However, they might still judge the alignment correctly, if the localization errors for flankers and targets are both small, or if they are consistent in direction. We therefore calculated the probability of each response in each trial of the



alignment judgement task, given the participant's actual percept for the same flanker locations in the flanker line judgement, and for the same target location in the target judgement tasks. This allowed us to calculate the probability that the alignment judgement would be correct if it were based solely on two independent sources of information, i.e., how the participant actually perceived the flanker and target locations on that trial. The predicted alignment performance on this assumption of independent flanker and target analysis was 89.5%. This was numerically superior to the actual performance level of 86.4%, though the difference was far from significant (t(7) = 1.076, p = 0.318).

2.2. Discussion

To summarise, Experiment 1 showed accurate judgements of the spatial pattern formed by three tactile inputs on the hand. Our alignment judgement task involved three distinct spatial components. First, the two flanker touches had to be integrated to define a line between them. Next, spatial information about the line must be retained over a delay, in this case 800 ms. Finally, the location of the subsequent target touch must be encoded relative to the line between the flankers. Our results suggest that the spatial relation between multiple touches can be represented in the brain. The representation of these spatial relations outlasts the stimulation itself, and admits the integration of additional stimuli when these are presented. In this sense, it is indeed appropriate to speak of a tactile field within which several tactile stimuli may define a spatial pattern. Interestingly, our analyses did not find that spatial pattern perception was superior to perception of individual locations. In fact, we found that alignment judgements were very slightly (and nonsignificantly) worse than would be predicted from the ability to identify the flankers and the target locations independently. This finding goes against the view that tactile spatial patterns are based on grouping of individual stimuli, and come to be perceived as a gestalt. Gestalt perception would imply that the pattern is perceived as a single perceptual unit, over and above the sum of individual stimulations. We found no evidence of a tactile gestalt in this sense, although our data do not rule out the possibility. Rather, our data are consistent with the view that a tactile spatial representation is constructed from independent information about the locations of each individual element of the pattern. Perception of individual stimulus locations takes place within a tactile field that supports the additional computation of spatial relations between multiple stimuli. Thus, tactile pattern perception appears to involve an additional processing stage, based on integration of the locations of multiple individual stimuli. In the remaining experiments, we investigate organising principles for tactile pattern perception. We used the classic psychometric Just Noticeable Difference (JND) measure in the alignment-judgement task as a measure of how successfully spatial information about tactile locations is integrated into a single pattern percept. This allowed us to compare tactile spatial pattern perception across various conditions. We also varied the body part stimulated, both to increase generality, and to allow us to test specific hypotheses about somatic organisation.
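The trial-based null prediction described above can be made explicit with a short sketch. Assuming (hypothetically) known x-coordinates for the three flanker lines and the four targets, and confusion matrices estimated from the single-task localisation blocks, the expected accuracy of an alignment judgement based on two independent localisation processes can be computed as follows. The coordinates, matrices and function names below are illustrative, not the values used in the experiment.

```python
import numpy as np

# Hypothetical x-coordinates (cm) of the three flanker lines and the four
# central targets; the real spacing in Experiment 1 was different.
LINE_X = np.array([0.0, 1.1, 2.2])
TARGET_X = np.array([-0.55, 0.55, 1.65, 2.75])

def predicted_alignment_accuracy(p_line, p_target):
    """Predicted probability of a correct left/right alignment response,
    assuming the flanker line and the target are localised independently.

    p_line[i, j]   : P(perceive line j   | true line i)
    p_target[k, m] : P(perceive target m | true target k)
    """
    accuracies = []
    for i, line_x in enumerate(LINE_X):
        for k, target_x in enumerate(TARGET_X):
            correct_side = np.sign(target_x - line_x)
            p_correct = 0.0
            for j, perceived_line_x in enumerate(LINE_X):
                for m, perceived_target_x in enumerate(TARGET_X):
                    responded_side = np.sign(perceived_target_x - perceived_line_x)
                    if responded_side == correct_side:
                        p_correct += p_line[i, j] * p_target[k, m]
            accuracies.append(p_correct)
    return float(np.mean(accuracies))

# Toy confusion matrices: mostly correct, with some spread to neighbours.
p_line = np.array([[0.85, 0.10, 0.05],
                   [0.10, 0.80, 0.10],
                   [0.05, 0.10, 0.85]])
p_target = np.full((4, 4), 0.08) + np.eye(4) * 0.68
print(predicted_alignment_accuracy(p_line, p_target))
```

Comparing such a prediction with the observed alignment accuracy, as in the analysis above, tests whether a pattern-level representation improves performance beyond independent localisation.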

3. Experiment 2

3.1. Methods

Eight subjects (mean age 22.25, SD 3.28, 5 female, all right-handed) participated on the basis of informed consent. Subjects sat with the left forearm on a table, palm upwards. Eight customised miniature solenoids were attached to the left forearm as shown in Fig. 2A. The flanker solenoids were repositioned between blocks, so that a row of target solenoids bisected a 5 or 15 cm distance between the flankers. Each trial began with a brief (5 ms) tap from the two flanker solenoids. A similar brief tap from one of the 6 target solenoids followed either 50 or 800 ms later. The inter-stimulus interval and the location of the target solenoid were randomised. Subjects made unspeeded verbal responses to indicate whether the target lay to the left or right of the line joining the flankers. No feedback was given. Since there was only one flanker line in each block, this judgement could, in principle, be based on target location alone, without reference to the flankers. However, Experiment 1 showed a strong performance advantage for alignment judgements over target localisation alone. This makes it likely that judgements were based on grouping the flankers and target into a common pattern, even when only a single flanker pair is presented. The combination of flanker distance and flanker-target inter-stimulus interval (ISI) defined 4 conditions in a 2 × 2 factorial design. Subjects made 110 judgements in each condition. Of these, 10 came from the extreme left and 10 from the extreme right stimulus, 20 each from the middle left and right stimuli and 50 from the most central left and right stimuli.

Fig. 2. Arrangement of tactile stimuli in Experiment 2. Flankers are shown as open circles. The target on each trial was chosen from one of the solid circles. A. Flanker separation 5 cm. B. Flanker separation 15 cm.

3.2. Results and discussion

The probability of responding ‘right’ was calculated for each target solenoid in each condition, and a logistic regression was used to fit a psychophysical function to each subject's data. Data from a typical subject are shown in Fig. 3A. The r2 values for the fitted psychometric functions varied from 0.934 to 0.998.

[Fig. 3A axes: Pr. (“Right”) vs. distance from target to flanker line (mm), for 5 cm and 15 cm flanker separations. Fig. 3B axes: Just Noticeable Difference (mm) vs. flanker-target interval (50 ms, 800 ms).]

Fig. 3. A. Psychophysical curves for subject 4 in Experiment 2. The r2 values for the fits are both above 0.99. Solid symbols and thick line: 5 cm flanker distance. Open symbols and thin line: 15 cm flanker separation. B. Tactile performance (Just Noticeable Differences, JNDs) in Experiment 2. Error bars show standard deviation across participants.

We did not have any prediction about the effect of the flankers on the overall probability of responding ‘right’, or on the point of subjective equality (PSE), since the flanker line was orthogonal to the left-right direction that subjects judged. PSE data were however analysed for illustrative purposes. We found no overall response bias, i.e., the mean PSE (−1.66 mm) did not differ significantly from 0 (t(7) = 0.884, p = 0.406). PSEs were unaffected by flanker distance (F(1,7) = 0.086, MSE = 0.913, p = 0.778). However, there was a significant effect of delay, with PSEs shifting to the left (mean −4.05 mm) for 50 ms flanker-target delays relative to 800 ms delays (mean 0.74 mm): F(1,7) = 10.781, MSE = 0.1404, p = 0.013. Further, there was an interaction between flanker position and delay, with the effect of delay being strongest for the distant flankers (F(1,7) = 7.123, MSE = 0.1622, p = 0.032). This

pattern of results was not specifically predicted, but could merit further investigation. In contrast, we hypothesised that better integration of location information into a pattern percept would improve the precision of judgements of target location relative to the flanker line, and thus increase the slope of the psychophysical function. Accordingly, we used the semi-interquartile range of the psychophysical function as an index of the Just Noticeable Difference (JND) in tactile metric judgement (Fig. 3B). The mean JND was 6.74 mm. Repeated measures ANOVA showed a highly significant main effect of inter-stimulus interval (F(1,7) = 31.685, MSE = 1.987, p = 0.001) on JNDs, with precision of tactile metric judgement increasing as ISI increased. The effect of flanker separation was small, and far from significant (F(1,7) = 0.678, MSE = 1.812, p = 0.437), and there was no interaction between the two factors (F(1,7) = 1.724, MSE = 3.5379, p = 0.231). Experiment 2 revealed three important features of tactile metric judgement. First, the ability to judge the spatial relation between flankers and target was surprisingly good. In particular, the just noticeable difference in alignment judgements was well below the two-point discrimination threshold (2PDT) on the forearm (given as 40 mm by Weinstein, 1968). While the 2PDT has been criticised as a measure of tactile acuity, our data suggest that judgements of the alignment of several stimuli are possible well below the acuity threshold. In vision, Vernier acuity typically exceeds grating acuity (Westheimer & Hauske, 1975). Such findings have been interpreted as evidence for a second stage of spatial representation limited by cortical organisation, rather than receptor density (Levi & Klein, 1985). Second, tactile alignment judgements were barely affected by flanker separation, with only a nonsignificant 0.4 mm increase in JND as flanker separation increased from 5 to 15 cm. That is, a 200% increase in the size of a spatial pattern produced less than a 10% increase in JND. Multiple tactile stimuli can thus be integrated over an extensive area to generate a tactile spatial pattern. This suggests that tactile field organisation is not based on the localised, spatially limited organisation of tactile receptive fields in primary somatosensory cortex. Third, alignment judgements improved strongly as the flanker-target ISI increased from 50 to 800 ms. This might reflect the time required for the brain to elaborate a secondary field-based representation after primary processing of individual stimuli. Studies of other spatial somatosensory tasks, such as haptic matching and tactile remapping, suggest a shift from limb-centred to an external spatial frame of reference in the time period immediately after stimulation

(Zuidhoek, Kappers, van der Lubbe, & Postma, 2003; Voisin, Michaud, & Chapman, 2005; Kaas, van Mier, & Goebel, 2007; Azanon, Longo, Soto-Faraco, & Haggard, 2010). Elaborating a field-based representation may involve similar, time-consuming processes as transformation between one frame of reference and another. Our increase in performance with ISI also rules out any contribution from tactile motion. Studies of tactile apparent motion (Geldard & Sherrick, 1972) show that illusions of motion decrease with longer ISIs, whereas the spatial judgements studied here improved at longer ISIs. In addition, the net motion vector on each trial of our design is zero. Alternatively, the time-dependence we observed could arise artefactually, if the initial flanker stimuli had simply masked the target. However, this possibility seems unlikely, since Laskin and Spencer (1979a) observed that a first tactile stimulus had no effect on the neural activity evoked by a second stimulus at ISIs over 50 ms. Additionally, tactile masking depends strongly on spatial proximity (Laskin & Spencer, 1979b), yet we found no evidence of an interaction between ISI and flanker separation. Overall, then, the results are consistent with the existence of a field-based spatial representation in touch, elaborated by secondary processes in the somatosensory system that integrate individual tactile locations to form a pattern percept.
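For readers who wish to reproduce this kind of analysis, the following sketch shows one way to fit a logistic psychometric function to the proportion of ‘right’ responses and to derive the PSE and the JND as the semi-interquartile range. It is a simplified stand-in for the per-subject logistic regressions described above (a least-squares fit to proportions rather than a trial-level regression); the toy data and function names are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, s):
    """Probability of responding 'right' as a function of the target's
    signed distance x (mm) from the flanker line."""
    return 1.0 / (1.0 + np.exp(-(x - pse) / s))

def fit_psychometric(distances_mm, p_right):
    """Fit the logistic and return (PSE, JND), where the JND is the
    semi-interquartile range, i.e. half the distance between the 25%
    and 75% points of the fitted function."""
    (pse, s), _ = curve_fit(logistic, distances_mm, p_right, p0=(0.0, 5.0))
    jnd = s * np.log(3.0)   # (x75 - x25) / 2 for a logistic with scale s
    return pse, jnd

# Toy proportions of 'right' responses at seven target offsets (mm),
# loosely in the spirit of Fig. 3A; the real analyses used per-trial data.
x = np.array([-30.0, -20.0, -10.0, 0.0, 10.0, 20.0, 30.0])
p = np.array([0.02, 0.08, 0.25, 0.55, 0.80, 0.95, 0.99])
pse, jnd = fit_psychometric(x, p)
print(f"PSE = {pse:.1f} mm, JND = {jnd:.1f} mm")
```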


4. Experiment 3

Experiment 3 aimed to investigate the nature of the spatial representation underlying the tactile field. Whereas the skin forms a continuous receptor surface covering the body, cognitive levels of body representation segment the body into discrete parts (hands, arms etc.), based on their motor, physical or structural properties (de Vignemont, Tsakiris, & Haggard, 2005). Body part structure imposes a sort of categorical perception on tactile sensation. For example, tactile distances that span two body parts are overestimated relative to identical distances within a single body part (de Vignemont, Majid, Jola, & Haggard, 2008). In particular, joints appear to form landmarks which segment the body into parts. In contrast, tactile localisation ability in the longitudinal direction is reliably better over joints than in the centre of body segments (Cholewiak & Collins, 2003; Flach & Haggard, 2006), as is tactile acuity. In the transverse direction, acuity seems independent of skin site (Cody, Garside, Lloyd, & Poliakoff, 2008). Therefore, body part structure appears to have effects on integration of tactile stimuli that are not present in the coding of tactile location. Dependence on body structure is therefore a characteristic feature of large-scale tactile field organisation, and may be an important principle of this level of tactile spatial representation. Experiment 3 therefore compared tactile alignment judgements for tactile patterns within a single body part (palm or forearm) and for tactile patterns spanning two body parts across the intermediate wrist joint (Fig. 4A). Our interest focussed on how alignment judgements varied between the different skin regions tested. Two quite different models can be hypothesised. First, a local acuity model would suggest that alignment judgements depend only on acuity at the flanker and target locations. This model predicts better performance in the wrist than in hand/arm conditions, since localisation and acuity are better at and near joints than elsewhere. However, this model may be less appealing for the particular stimulus arrangements that we used, since the superior localisation at joints appears to be confined to the longitudinal axis, and is absent in the transverse axis on which our alignment judgements depended (Cody et al., 2008). Second, a body-structural integration model would suggest that alignment judgements would be worse when the tactile pattern is distributed over two body parts than when it falls within a single body part. In addition, both models must respect the general principle that quality of tactile representation decreases in a distal-proximal direction across the skin because of mechanoreceptor density: tactile acuity is worse on the forearm than on the palm (Weinstein, 1968). This distal-proximal gradient means that performance on the forearm is expected to be poor, for low-level reasons of receptor density. This modulation is quite independent of the effects suggested by the body-structural integration model. To adjust for this low-level effect, we assumed a linear improvement in tactile information in a proximal-to-distal direction. Since the wrist was equidistant between the hand and forearm stimulated locations, averaging the performance at the palm and forearm sites provides an estimate of the wrist performance under a null hypothesis. Therefore, in addition to overall ANOVA, we performed specific planned tests, comparing the wrist performance to this null model, to investigate the predictions of the body-structural integration model. The local acuity model predicts that performance at the wrist will be equal to the average of palm and forearm, while the body-structural integration model predicts worse performance at the wrist than at the average of palm and forearm. Because the body-structural integration model gives a clear directional prediction, one-tailed tests were used where appropriate.

4.1. Methods

The methods were similar to Experiment 2. The distance between flankers was fixed at 5 cm, and the flanker-target interval was 250 ms.

Seventeen new subjects (mean age 29.0, SD 10.88, 7 female, all right-handed) participated in Experiment 3. However, data from three subjects had to be discarded because their data did not show a monotonic psychophysical function in at least one condition (two of these subjects also spontaneously reported being unable to perceive the spatial patterns). Subjects sat with their arm completely still, and an array of solenoids embedded in a mould was placed on the medial aspect of the left hand (palm) and forearm. Subjects performed 3 blocked conditions in counterbalanced order, though exclusion of subjects means that counterbalancing in the final dataset is only approximate. The solenoid array was repositioned between blocks. Thus, flankers and target were all located on the palm in one block, and on the forearm in another (Fig. 4A). In the critical wrist block, the targets were located over the wrist joint, with one flanker each on the hand and arm, at the locations used by flankers in hand and arm blocks respectively. Therefore, the entire set of stimulated locations was symmetrical about the wrist joint.

Fig. 4. A. Arrangement of tactile stimuli on the palm, forearm, and spanning the palm and forearm across the wrist, in Experiment 3. B. JNDs in Experiment 3. Error bars show standard deviation across participants.

4.2. Results

The r2 values for the fitted psychometric functions varied from 0.932 to 1.0, comparable to other tactile localisation tasks involving the same body parts (Cody et al., 2008). JNDs were 3.85 mm (SE 0.367) on the palm, 5.93 mm (SE 0.746) across the wrist, and 5.42 mm (SE 0.569) on the arm respectively (Fig. 4B). Overall ANOVA showed a significant effect of body part (F(2,26) = 6.971, MSE = 0.0195, p = 0.004). To identify the basis of this difference, we performed follow-up t-tests between the different conditions. These showed significant differences between palm and wrist, and between palm and forearm (p = 0.002, 0.007 respectively). The difference between wrist and forearm was not significant (p = 0.477), as might be expected from the distal-proximal gradient in mechanoreceptor innervation. Because we wanted to test a specific theory about the role of body representation in spatial pattern judgement, we also performed additional planned contrasts. Specifically, the body-structural integration model predicts worse performance at the wrist compared to the underlying distal-proximal gradient defined by mechanoreceptor density. Because the prediction is directional, a one-tailed test was used. We therefore compared the JND at the wrist to the average of JNDs for the palm and forearm, using coefficients of −0.5 for the palm, 1 for the wrist, and −0.5 for the forearm. The contrast was significant (t(13) = 2.27, p = 0.021, one-tailed), showing that tactile alignment judgement was significantly worse for flankers spanning the wrist joint than for the average of stimuli contained entirely within the



hand or within the arm. This contrast explained 36% of the total variance due to the body part factor in the overall ANOVA.

4.3. Discussion

Experiment 3 found that tactile alignment judgements spanning two body parts were poor relative to the average performance within each of the adjoining body parts. This conclusion is not without some underlying assumptions. Tactile information processing is generally better on more distal than more proximal body parts. We assumed an overall linear gradient of tactile information in the distal-proximal axis, which is consistent with the available data (Weinstein, 1968). A null hypothesis might then state that alignment judgements at the wrist would correspond to the average of palm and forearm performance. One alternative hypothesis predicts better alignment performance at the wrist than the average of palm and forearm, because of the locally high acuity at the wrist. In fact, previously published data show that acuity at the wrist is superior to the average of palm and forearm acuity in the longitudinal axis, but similar to the average of palm and forearm acuity in the transverse axis (Cody et al., 2008). In any case, our alignment judgements showed poorer performance at the wrist than the average of palm and forearm. These results suggest that the structure of the hypothesised tactile field depends not only on the continuous receptor sheet of the skin, but may depend on underlying structural features of the body. The tactile field is not defined only in skin-space, but also reflects the segregation of the body into parts, at least at the wrist. This effect cannot be due to differences in skin innervation, target localisation or tactile acuity alone, since several studies confirm that target localisation is as good as, or better, over a joint than in the middle of the adjacent body parts (Cholewiak & Collins, 2003; Flach & Haggard, 2006; Cody et al., 2008), yet we found worse alignment judgements at the wrist. Presumably, therefore, if our subjects had simply ignored the flankers, and attempted to localise the target stimulus, they would have performed as well in the wrist condition as the average of the palm and forearm conditions, but in fact they performed worst in the wrist condition. This suggests that they integrated the target into a spatial pattern defined by the flankers, even though this integration was detrimental for performance. Further, this integration process operates better within a single body part than across body parts. A similar body-part-based modulation of tactile distance perception was found by de Vignemont et al. (2008). They found a relative overestimation of distance between two tactile stimuli spanning a joint, relative to similar distances confined to the adjacent body parts. Both results are consistent with a discontinuity at body part boundaries in a field-based organisation used to integrate touch into an overall spatial pattern. The tactile field may thus depend on a mental representation of large-scale body structure, as well as on the spatial information provided by tactile receptors in each skin region. Our conclusions are based only on data from the wrist joint. Future research would clearly be required to show whether these effects generalise to other joints also.
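The planned contrast logic used in Experiments 3 and 4 can be summarised in a few lines: each subject's JND at the critical site (the wrist here, the midline in Experiment 4) is compared with the mean of the two flanking sites using weights of −0.5, 1 and −0.5, and the directional prediction is tested one-tailed. The sketch below uses invented per-subject data and is not the authors' analysis code.

```python
import numpy as np
from scipy import stats

def planned_contrast(jnd_flank_1, jnd_critical, jnd_flank_2):
    """Per-subject contrast score: critical site minus the mean of the two
    flanking sites (weights -0.5, 1, -0.5), tested against zero with a
    one-tailed, one-sample t-test (prediction: critical site is worse)."""
    scores = (np.asarray(jnd_critical)
              - 0.5 * np.asarray(jnd_flank_1)
              - 0.5 * np.asarray(jnd_flank_2))
    t, p_two_tailed = stats.ttest_1samp(scores, popmean=0.0)
    p_one_tailed = p_two_tailed / 2 if t > 0 else 1 - p_two_tailed / 2
    return t, p_one_tailed

# Toy per-subject JNDs in mm (8 invented subjects; Experiment 3 had 14).
palm = [3.2, 4.1, 3.8, 4.5, 3.6, 4.0, 3.9, 4.3]
wrist = [5.5, 6.4, 5.9, 6.8, 5.2, 6.1, 6.3, 5.7]
forearm = [5.0, 5.8, 5.3, 6.0, 4.9, 5.5, 5.6, 5.2]
t, p = planned_contrast(palm, wrist, forearm)
print(f"t = {t:.2f}, one-tailed p = {p:.4f}")
```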

5. Experiment 4

This experiment again addressed the type of spatial representation used for alignment judgements, but at a neural rather than a structural level. A key principle of somatosensory processing is that early somatosensory cortical areas are strongly lateralised, and have well-defined receptive fields (Penfield & Rasmussen, 1950; Blankenburg, Ruben, Meyer, Schwiemann, & Villringer, 2003; Brown, Koerber, & Millecchia, 2004; Huang & Sereno, 2007). Other somatosensory representations, notably in SII, have larger, bilateral receptive fields (Fitzgerald, Lane, Thakur, & Hsiao, 2006). Therefore, if alignment judgements are impaired for tactile patterns that span the body midline compared to tactile patterns that do not, one might conclude that a tactile field organisation is present in early cortical areas represented within a single cerebral hemisphere. Experiment 4 therefore compared alignment judgements for tactile patterns spanning the body midline to the average of identical patterns presented to only the left or right side of the body.

5.1. Methods

Experiment 4 compared alignment judgements spanning the midline on the lower back, with judgements confined to the left back or right back only. This skin region was chosen as having a suitably continuous expanse of skin spanning the midline. A grid of points was marked in pen on the participant's lower back as shown in Fig. 5A. The flankers were positioned 8 cm apart, and were delivered on the left or right back, or spanning the midline, in separate blocks in counterbalanced order. In addition, flankers were delivered at one of 3 vertical levels at random: upper, centre or lower. The spacing between vertical levels was 1 cm. The flanker stimuli were delivered manually using Vernier calipers with a soft 4 mm diameter rubber tip on each point. The target was a single tactile point delivered approximately 1 s after the flankers, using a single tip on the opposite side of the caliper. The 3 vertical rows of flankers and 3 vertical target locations were varied independently, so that each of 9 possible patterns could be presented. Each pattern was presented 8 times, giving a total of 72 trials per blocked condition. Trial order was randomised anew for each condition and each subject. For present purposes, the vertical distance between target and flankers is the key variable influencing subjects' responses. The target could lie on the line between the flankers, or 1 or 2 cm above or below. Of the 72 trials in each blocked condition, 8 were 2 cm above and 8 were 2 cm below the flanker line, 16 were 1 cm above and 16 were 1 cm below the line, while 24 were on the line itself. A psychophysical function was fitted to each subject's data in each condition to calculate the probability of responding ‘above’ as a function of the target-flanker vertical distance. JNDs were calculated as before. As in Experiment 3, we wished to assess whether any differences between sites in tactile pattern perception might be due to differences in tactile acuity. However, unlike Experiment 3, no published tactile acuity values are available for the lower back. Therefore, we also measured each subject's two-point discrimination threshold (2PDT) at the central left, central right and central midline target locations. These locations were tested using a simple staircase with occasional interleaved dummy single-tap trials (Kennett, Taylor-Clarke, & Haggard, 2001). The initial two-point distance was 6 cm. This was reduced in 1 cm steps until subjects reported feeling a single point (first reversal), then increased in 5 mm steps until two points were felt (second reversal), and finally decreased in 2.5 mm steps until one point was felt (third reversal). The two-point distance at the third reversal was taken as an estimate of acuity. The three locations were tested in a counterbalanced order after the main experiment. Eighteen subjects (mean age 25.3, SD 12.5, 6 female, 15 right-handed) participated on the basis of informed consent in return for class credit.
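The three-reversal staircase used for the 2PDT measurement can be expressed procedurally as below. This is an illustrative sketch only: it omits the interleaved dummy single-tap trials, assumes a deterministic observer supplied as a callable, and the function name is ours.

```python
def staircase_2pdt(feels_two_points, start_mm=60.0):
    """Three-reversal staircase for the two-point discrimination threshold,
    following the procedure described above. `feels_two_points(d)` should
    return True if the subject reports two points at separation d (mm).
    Returns the separation at the third reversal as the acuity estimate."""
    distance = start_mm
    # Descend in 10 mm steps until a single point is reported (first reversal).
    while feels_two_points(distance):
        distance -= 10.0
    # Ascend in 5 mm steps until two points are reported (second reversal).
    while not feels_two_points(distance):
        distance += 5.0
    # Descend in 2.5 mm steps until one point is reported again (third
    # reversal); that separation is taken as the estimate of acuity.
    while feels_two_points(distance):
        distance -= 2.5
    return distance

# Deterministic toy observer whose true threshold is 30 mm.
print(staircase_2pdt(lambda d: d >= 30.0))   # 27.5
```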

Fig. 5. A. Arrangement of tactile stimuli in Experiment 4. B. JNDs in Experiment 4. Error bars show standard deviation across participants.

5.2. Results

Two subjects were excluded. For one subject, the psychophysical function in the right back condition did not pass through the 25th percentile, so the semi-interquartile range could not be calculated. For another subject, the fitting of the psychophysical function failed to converge in the left back condition, because the transition from 0% to 100% ‘above’ judgements was too abrupt, and no JND estimate could be calculated. The r2 values for the remaining 16 subjects varied from 0.920 to 0.994. The JND data are shown in Fig. 5B. JNDs were poorer for flankers spanning the midline than for flankers confined to just one side of the

back. Overall ANOVA showed a significant effect of body site (F(2,30) = 3.955, MSE = 0.0303, p = 0.030). To identify the basis of this difference, we performed follow-up t-tests between the different conditions. These showed that JNDs on the left back were significantly different from those on the midline (p = 0.007). There was an unexpected trend towards the left back performance being different from the right back (p = 0.076). The data for the midline and the right back did not differ significantly (p = 0.310). Because we wanted to test a specific theory about lateralisation in spatial pattern judgement, we also performed additional planned contrasts. Specifically, we investigated whether perception of tactile patterns spanning the midline was impaired relative to what might be expected from performance on the left and right sides. To do this, we compared JNDs across the midline with the average of those for each side of the body, using coefficients of −0.5, 1, and −0.5 for the left, centre and right back respectively. Because we predicted an impairment, rather than a benefit, for patterns spanning the midline, we used a one-tailed test. This hypothesis of impaired tactile pattern perception across the midline was supported (t(15) = 2.03, p = 0.03, one-tailed). The variance explained by this contrast amounted to 54% of the variance between the body sites in the overall ANOVA. The 2PDT values were 29.8 mm (SE 2.65), 29.2 mm (SE 3.03), and 28.9 mm (SE 3.01) for the left, centre and right back respectively. To investigate whether the JND effect could be explained by poorer acuity, and thus poorer target localisation over the midline, we performed a similar planned contrast on the 2PDT data. This showed that tactile acuity at the midline was no worse than the average of the left and right sides (t(15) = 0.057, p = 0.956). This suggests that acuity differences cannot explain the effects on alignment.

5.3. Discussion

We found a relatively small (18%), but statistically reliable, deterioration in alignment judgement performance for tactile patterns spanning the body midline, compared to lateralised patterns. This result suggests that tactile field organisation arises in a lateralised representation of the body, such as the early somatosensory cortex (SI), or the tactile areas of the intraparietal sulcus (Huang & Sereno, 2007). When two flankers are positioned either side of the body midline, an additional process of interhemispheric integration may be necessary to establish the tactile pattern, resulting in a performance decrement. We also found that alignment judgements tended to be superior on the left back to the right back. This result, which was not part of our original reason for performing the study, may reflect the specialisation of the right hemisphere for spatial processing. Previous studies of tactile patterns have focussed on sequences, notably the cutaneous rabbit illusion (Geldard & Sherrick, 1972; Flach & Haggard, 2006). In this illusion, a sequence of 3 discrete taps appropriately timed and appropriately spaced along a line may feel like a relatively continuous motion along the skin. This spatiotemporal organisation appears to take place in SI, since Blankenburg, Ruff, Deichmann, Rees, and Driver (2006) observed an fMRI activation at the SI location corresponding to the body location of an entirely illusory tap. On the other hand, Eimer, Forster, and Vibell (2005) reported this illusion for sequences that crossed the midline, despite earlier reports to the contrary (Shore, Hall, & Klein, 1998). Our stimulus patterns were chosen to involve no net motion on the skin. Moreover, the time interval between flankers and targets was longer than the 20–250 ms range for the CRE. Therefore, the field organisation studied here is essentially spatial, and distinct from the spatiotemporal sequence organisation of CRE. Nevertheless, like CRE, tactile field effects seem to be unilateral in nature, but capable of bilateral transfer at some performance cost. Finally, Experiment 4, like Experiment 1, used randomised flanker location. In Experiments 2 and 3, subjects could in principle base their judgements on target localisation alone, without the need to integrate flankers and target into a common field-based spatial pattern. However, Experiment 4, like Experiment 1, randomised flanker location across trials, forcing subjects to judge the position of each target relative to the immediately preceding flankers. In this case, then, subjects must relate the target to the flankers to perform successfully.

6. General discussion

This study investigated perception of tactile spatial patterns over relatively large skin regions. We used tactile alignment judgements to investigate how several tactile stimuli are integrated into a tactile pattern. We hypothesised that perceiving such patterns involves representing the spatial relations between tactile locations. Representing spatial relations between stimuli involves additional levels of organisation, beyond the capacity to localise an individual stimulus. We used the term ‘tactile field’ to refer to such representations. Experiment 1 showed that people are able to integrate a tactile pattern percept from the location of individual tactile elements on the skin. Experiment 2 showed that alignment judgements are relatively


unaffected by the spatial extent over which integration occurs, and improve rather than deteriorate with the time available to integrate stimuli into a spatial pattern, at least up to 800 ms. Experiment 3 showed that alignment judgements depend not only on the spatial relation between stimulus elements in external space, but also vary as a function of body structure. In particular, judgements about the alignment of tactile patterns spanning two distinct body parts were surprisingly poor. Finally, Experiment 4 showed a modest but significant deterioration in judgements requiring integration of stimulus elements across the body midline. Taken as a whole, these studies provide good evidence for the existence of a tactile field supporting perception of tactile spatial patterns. That is, tactile processing includes the capacity to represent spatial relations between locations of individual stimuli over relatively large distances. We showed that tactile pattern judgements depend on secondary factors over and above local tactile perceptual ability at the stimulated locations. These factors included the time window over which the different stimuli comprising the tactile pattern are delivered (Experiment 2), the body parts stimulated (Experiment 3), and the hemisphere(s) receiving tactile information (Experiment 4). Because the bandwidth of human tactile perception is low, and tactile attention is highly focal (Gallace & Spence, 2007), it may seem as though tactile stimuli are perceived only singly, and that integration of several stimuli into a spatial pattern is limited. Our results show that the basic field-like representations required to perceive a spatial pattern are present: touch is clearly a spatial sense. Our data also clarify the perceptual process of integrating several individual stimuli into a single spatial pattern on the basis of their locations. In the present case, the two flankers must first be integrated to form a line. Second, the line must be represented in a way that allows interpolation, so that the target location can be compared to an unstimulated intermediate point along the line between the flankers. The direction of the error signal arising from this comparison can be used to judge tactile alignment. This type of spatial processing therefore involves specific perceptual processes over and above localisation of a single point for oriented responding. For example, simply representing the “local signs” of the individual flankers and target would not suffice to interpolate between them, nor to calculate the directional error between the local sign of the target and the interpolated point. Instead, our results suggest a distinct process that computes the metric spatial relations between individual tactile stimuli in order to represent a spatial pattern. These spatial relations cover large skin regions, and cannot readily be explained by local receptive field mechanisms. For the same reasons, alignment judgement is unlikely to depend only on the disproportionate somatotopic map of primary somatosensory cortex (SI). While SI may be an optimal representation for acuity, it is a poor representation for integration, interpolation and comparison of several stimuli, because of the enormous differences in primary neural response evoked by identical stimuli applied on different skin regions. In fact, representations in SI require rescaling to reflect the true size of body parts, in order to extract metric properties of touch (Taylor-Clarke, Jacobsen, & Haggard, 2004).
We therefore postulate a secondary tactile representation of the body surface, capable of supporting integration, interpolation and comparison functions, and forming the basis of the tactile field. There are interesting analogues with the visual system: early visual cortex shows a strong foveal overrepresentation, while secondary area V6 shows a wide-field representation without foveal magnification. Experiment 4 suggests that this secondary representation is a map of the contralateral body, and not just a database of semantic knowledge about body parts. This secondary tactile map would require not only invariant topology, but also a metric structure, capable of supporting interpolation and proportional distance. Previous discussion of secondary somatosensory representations (Longo, Azanon, & Haggard, 2010) focused on perception of the body, for example in determining the perceived position of individual body parts in space (Longo & Haggard, 2010). Our results show, for the


The neural basis of large-scale tactile judgement has rarely been studied. However, Spitoni, Galati, Antonucci, Haggard, and Pizzamiglio (2010) recently reported a brain region specialised for representing the distance between two tactile stimuli. They found that a region in the right angular gyrus was more activated when participants judged whether a tactile distance presented to the right arm was greater or smaller than another distance presented to the right thigh, relative to judgements comparing the intensity of the same stimuli. Representation of distance between locations is one component of spatial pattern perception, though our alignment judgement task may involve other components. Further, the lateralisation of the area reported by Spitoni et al. remains unclear, since only the right side of the body was stimulated in their experiment.

Some previous theories have suggested that the apparent spatial aspect of touch in fact derives from other sensory mechanisms. For example, Martin (1992) and O'Shaughnessy (1989) suggest that touch does not have a field-like spatial organisation: the apparent spatiality of touch, such as the impression of a spherical shape when holding a tennis ball in the hand, would in fact come from proprioceptive information about the spatial configuration of the hand, and not from tactile contacts on the skin. Our results do not support this view. We showed evidence for large-scale spatial organisation following passive tactile stimulation of static body parts. A field-type organisation was present even on body parts such as the back, which have relatively fixed configuration and lack the dynamic proprioceptive updating associated with limb movement (Ghez, Gordon, Ghilardi, Christakos, & Cooper, 1990). Experiment 3 showed that representation of body structure also influenced tactile alignment judgement. A cognitive representation of body structure appears to be relevant to tactile pattern perception, but the tactile field itself seems to be independent of proprioception.

Several studies have suggested that tactile pattern perception involves visualisation. This view receives strong support from the finding that visual brain areas are activated during tactile spatial judgement tasks (Sadato et al., 1996; Sathian & Zangaladze, 2002; Prather, Votaw, & Sathian, 2004). If tactile information is immediately translated into visual codes, then the spatial organisation apparent in our results might come from a visual field, rather than a tactile field. Indeed, recent studies have demonstrated long-range interactions in vision quite similar to those we have found in touch. For example, the contrast level required to detect a grating is reduced when the grating is shown between two widely spaced suprathreshold gratings of the same orientation (Khoe et al., 2004). Nevertheless, two features of our results suggest that the spatial quality of tactile patterns is not simply due to translation into visual codes. First, a visual translation hypothesis cannot explain why underlying structural features of the body, such as the wrist joint in Experiment 3, influence the tactile field. Clearly, the wrist joint did not serve as a landmark to improve visual representations translated from the tactile pattern, since that would have predicted improved rather than impaired performance.
Second, visual translation seems less plausible for tactile patterns on body parts that are never seen, such as the small of the back (Experiment 4, compare with Tipper et al., 2001). Nevertheless, our data cannot conclusively rule out an explanation of tactile spatial pattern perception based on visual recoding.

Finally, we suggest that tactile pattern perception involves an important yet overlooked aspect of self-representation. Metric spatial judgement about the relations between multiple locations on the receptor surface requires some knowledge about the receptor surface itself. To give one widely cited example, judging the temporal order of tactile stimuli delivered to each hand depends on proprioceptive knowledge of where the hands (Shore, Spry, & Spence, 2002), or indeed the tip of a hand-held tool (Yamamoto & Kitazawa, 2001), are in space. Our results suggest that knowledge about the configuration of the skin relative to the underlying body is used for tactile pattern perception, even when concurrent proprioceptive information is not required.


Tactile metric perception thus seems to refer to a representation of body structure. Experiment 4 suggested that this representation arises at least partly in lateralised somatosensory areas.

The role of body representation in mediating tactile pattern perception offers a new insight into the classic problem of the relation between primary experience and self-consciousness (Bermudez, 1998). Most previous theoretical work has focussed on a basic form of self-representation implicitly present in the visual experience of an animal making orienting actions in egocentric space (Evans, 1983). On this view, the self is effectively a dimensionless point at the origin of a representation of the immediate environment. In contrast, our studies of touch suggest that a substantive representation of one's own body as a volumetric object mediates spatial judgements on the body surface. Tactile pattern perception involves representing oneself not only as a source of sensory experience, but also as a physical object with a characteristic body structure, and therefore as having spatial attributes analogous to those of other objects. In touch, then, the linkage between primary experience and self-consciousness seems stronger than in vision. This linkage shows that the body is a physical as well as a psychological object. In this sense, tactile pattern perception presupposes a self that is an object embedded in the world, rather than simply a viewpoint on the world (Merleau-Ponty, 1962).

Acknowledgements

This research was supported by grants from the British Academy and the BBSRC to PH. GG was additionally supported by a bursary from the University of Bologna Joint International Cognitive Neuroscience PhD Program. We are grateful to Frederique de Vignemont, Matthew Longo and Andrea Serino for advice and pilot work, to Mirandola Gonzaga for help with data collection for Experiment 1, and to Michael Taylor for help with data collection for Experiment 4.

References

Azanon, E., Longo, M. R., Soto-Faraco, S., & Haggard, P. (2010). The posterior parietal cortex remaps touch into external space. Current Biology, 20, 1304−1309.
Bermudez, J. L. (1998). The paradox of self-consciousness. Cambridge, MA: MIT Press.
Blankenburg, F., Ruben, J., Meyer, R., Schwiemann, J., & Villringer, A. (2003). Evidence for a rostral-to-caudal somatotopic organization in human primary somatosensory cortex with mirror-reversal in areas 3b and 1. Cerebral Cortex, 13(9), 987−993.
Blankenburg, F., Ruff, C. C., Deichmann, R., Rees, G., & Driver, J. (2006). The cutaneous rabbit illusion affects human primary sensory cortex somatotopically. PLoS Biology, 4(3), e69.
Brown, P. B., Koerber, H. R., & Millecchia, R. (2004). From innervation density to tactile acuity: 1. Spatial representation. Brain Research, 1011(1), 14−32.
Cholewiak, R. W., & Collins, A. A. (2003). Vibrotactile localization on the arm: Effects of place, space, and age. Perception & Psychophysics, 65(7), 1058−1077.
Cody, F. W., Garside, R. A., Lloyd, D., & Poliakoff, E. (2008). Tactile spatial acuity varies with site and axis in the human upper limb. Neuroscience Letters, 433, 103−108.
Craig, J. C., & Qian, X. (1997). Tactile pattern perception by two fingers: Temporal interference and response competition. Perception & Psychophysics, 59(2), 252−265.
de Vignemont, F., Majid, A., Jola, C., & Haggard, P. (2008). Segmenting the body into parts: Evidence from biases in tactile perception. The Quarterly Journal of Experimental Psychology, 62, 500−512.
de Vignemont, F., Tsakiris, M., & Haggard, P. (2005). Body mereology. In G. Knoblich, I. Thornton, & M. Grosjean (Eds.), Human body perception from the inside out (pp. 147−170). Oxford: Oxford University Press.
Eimer, M., Forster, B., & Vibell, J. (2005). Cutaneous saltation within and across arms: A new measure of the saltation illusion in somatosensation. Perception & Psychophysics, 67(3), 458−468.
Evans, G. (1983). The varieties of reference. Oxford: Oxford University Press.
Evans, G. (1985). Molyneux's question. In G. Evans (Ed.), The collected papers of Gareth Evans. Oxford: Oxford University Press.
Fitzgerald, P. J., Lane, J. W., Thakur, P. H., & Hsiao, S. S. (2006). Receptive field (RF) properties of the macaque second somatosensory cortex: RF size, shape, and somatotopic organization. The Journal of Neuroscience, 26(24), 6485−6495.
Flach, R., & Haggard, P. (2006). The cutaneous rabbit revisited. Journal of Experimental Psychology. Human Perception and Performance, 32(3), 717−732.
Freeman, E., Driver, J., Sagi, D., & Li, Z. (2003). Top-down modulation of lateral interactions in early vision: Does attention affect integration of the whole or just perception of the parts? Current Biology, 13, 985−999.

Gallace, A., & Spence, C. (2007). The cognitive and neural correlates of “tactile consciousness”: A multisensory perspective. Consciousness and Cognition, 17, 370−407.
Geldard, F. A., & Sherrick, C. E. (1972). The cutaneous “rabbit”: A perceptual illusion. Science, 178(4057), 178−179.
Ghez, C., Gordon, J., Ghilardi, M. F., Christakos, C. N., & Cooper, S. E. (1990). Roles of proprioceptive input in the programming of arm trajectories. Cold Spring Harbor Symposia on Quantitative Biology, 55, 837−847.
Haggard, P., Christakou, A., & Serino, A. (2007). Viewing the body modulates tactile receptive fields. Experimental Brain Research, 180(1), 187−193.
Henkin, R. I., & Banks, V. (1967). Tactile perception on the tongue, palate and the hand of normal man. In J. F. Bosma (Ed.), Symposium on oral sensation and perception (pp. 182−187). Springfield, IL: Thomas.
Huang, R. S., & Sereno, M. I. (2007). Dodecapus: An MR-compatible system for somatosensory stimulation. Neuroimage, 34(3), 1060−1073.
Kaas, A. L., van Mier, H., & Goebel, R. (2007). The neural correlates of human working memory for haptically explored object orientations. Cerebral Cortex, 17, 1637−1649.
Kennett, S., Taylor-Clarke, M., & Haggard, P. (2001). Noninformative vision improves the spatial resolution of touch in humans. Current Biology, 11(15), 1188−1191.
Khoe, W., Freeman, E., Woldorff, M. G., & Mangun, G. R. (2004). Electrophysiological correlates of lateral interactions in human visual cortex. Vision Research, 44(14), 1659−1673.
Klatzky, R. L., & Lederman, S. J. (2003). Representing spatial location and layout from sparse kinesthetic contacts. Journal of Experimental Psychology. Human Perception and Performance, 29(2), 310−325.
Kolster, H., Mandeville, J. B., Arsenault, J. T., Ekstrom, L. B., Wald, L. L., & Vanduffel, W. (2009). Visual field map clusters in macaque extrastriate visual cortex. The Journal of Neuroscience, 29, 7031−7039.
Laskin, S. E., & Spencer, W. A. (1979a). Cutaneous masking. I. Psychophysical observations on interactions of multipoint stimuli in man. Journal of Neurophysiology, 42(4), 1048−1060.
Laskin, S. E., & Spencer, W. A. (1979b). Cutaneous masking. II. Geometry of excitatory and inhibitory receptive fields of single units in somatosensory cortex of the cat. Journal of Neurophysiology, 42(4), 1061−1082.
Lederman, S. J., & Klatzky, R. L. (2004). Haptic identification of common objects: Effects of constraining the manual exploration process. Perception & Psychophysics, 66(4), 618−628.
Levi, D. M., & Klein, S. A. (1985). Vernier acuity, crowding and amblyopia. Vision Research, 25(7), 979−991.
Longo, M. R., Azanon, E., & Haggard, P. (2010). More than skin deep: Body representation beyond primary somatosensory cortex. Neuropsychologia, 48, 655−668.
Longo, M. R., & Haggard, P. (2010). An implicit body representation underlying human position sense. Proceedings of the National Academy of Sciences of the United States of America, 107, 11727−11732.
Loomis, J. M. (1979). An investigation of tactile hyperacuity. Sensory Processes, 3, 289−302.
Lotze, H. (1885). Microcosmus: An essay concerning man and his relation to the world. Edinburgh: T. & T. Clark.
Manning, H., & Tremblay, F. (2006). Age differences in tactile pattern recognition at the fingertip. Somatosensory & Motor Research, 23(3–4), 147−155.
Maravita, A., Spence, C., & Driver, J. (2003). Multisensory integration and the body schema: Close to hand and within reach. Current Biology, 13(13), R531−R539.
Martin, M. (1992). Sight and touch. In T. Crane (Ed.), The contents of experience (pp. 196−215). Cambridge: Cambridge University Press.
Merleau-Ponty, M. (1962). Phenomenology of perception. London: Routledge.
O'Shaughnessy, B. (1989). The sense of touch. Australasian Journal of Philosophy, 67(1), 37−58.
Penfield, W., & Rasmussen, T. (1950). The cerebral cortex of man. New York: Macmillan.
Prather, S. C., Votaw, J. R., & Sathian, K. (2004). Task-specific recruitment of dorsal and ventral visual areas during tactile perception. Neuropsychologia, 42(8), 1079−1087.
Reed, C. L., Klatzky, R. L., & Halgren, E. (2005). What vs. where in touch: An fMRI study. Neuroimage, 25(3), 718−726.
Riggs, K. J., Ferrand, L., Lancelin, D., Fryziel, L., Dumur, G., & Simpson, A. (2006). Subitizing in tactile perception. Psychological Science, 17(4), 271−272.
Sadato, N., Pascual-Leone, A., Grafman, J., Ibanez, V., Deiber, M. P., Dold, G., et al. (1996). Activation of the primary visual cortex by Braille reading in blind subjects. Nature, 380(6574), 526−528.
Sathian, K., & Zangaladze, A. (2001). Feeling with the mind's eye: The role of visual imagery in tactile perception. Optometry and Vision Science, 78(5), 276−281.
Sathian, K., & Zangaladze, A. (2002). Feeling with the mind's eye: Contribution of visual cortex to tactile perception. Behavioural Brain Research, 135(1–2), 127−132.
Serino, A., Giovagnoli, G., de Vignemont, F., & Haggard, P. (2008). The spatial organisation of tactile perception: Evidence for the existence of a tactile field. Acta Psychologica, 128, 355−360.
Shore, D. I., Hall, S. E., & Klein, R. M. (1998). Auditory saltation: A new measure for an old illusion. The Journal of the Acoustical Society of America, 103(6), 3730−3733.
Shore, D. I., Spry, E., & Spence, C. (2002). Confusing the mind by crossing the hands. Brain Research. Cognitive Brain Research, 14(1), 153−163.
Smythies, J. (1996). A note on the concept of the visual field in neurology, psychology, and visual neuroscience. Perception, 25, 369−371.
Spence, C., Pavani, F., & Driver, J. (2004). Spatial constraints on visual-tactile crossmodal distractor congruency effects. Cognitive, Affective & Behavioral Neuroscience, 4(2), 148−169.
Spitoni, G. F., Galati, G., Antonucci, G., Haggard, P., & Pizzamiglio, L. (2010). Two forms of touch perception in the human brain. Experimental Brain Research, 207, 185−195.

Taylor-Clarke, M., Jacobsen, P., & Haggard, P. (2004). Keeping the world a constant size: Object constancy in human touch. Nature Neuroscience, 7(3), 219−220.
Tipper, S. P., Phillips, N., Dancer, C., Lloyd, D., Howard, L. A., & McGlone, F. (2001). Vision influences tactile perception at body sites that cannot be viewed directly. Experimental Brain Research, 139(2), 160−167.
Van Boven, R. W., Ingeholm, J. E., Beauchamp, M. S., Bikle, P. C., & Ungerleider, L. G. (2005). Tactile form and location processing in the human brain. Proceedings of the National Academy of Sciences of the United States of America, 102(35), 12601−12605.
Voisin, J., Michaud, G., & Chapman, C. E. (2005). Haptic shape discrimination in humans: Insight into haptic frames of reference. Experimental Brain Research, 164, 347−356.
Wandell, B. A., Dumoulin, S. O., & Brewer, A. A. (2007). Visual field maps in human cortex. Neuron, 56, 366−383.


Weinstein, S. (1968). Intensive and extensive aspects of tactile sensitivity as a function of body part, sex, and laterality. In D. R. Kenshalo (Ed.), The skin senses (pp. 195−218). Springfield, IL: Thomas.
Westheimer, G., & Hauske, G. (1975). Temporal and spatial interference with vernier acuity. Vision Research, 15, 1137−1141.
Westheimer, G., & McKee, S. P. (1977). Spatial configurations for visual hyperacuity. Vision Research, 17, 941−947.
Yamamoto, S., & Kitazawa, S. (2001). Reversal of subjective temporal order due to arm crossing. Nature Neuroscience, 4(7), 759−765.
Zuidhoek, S., Kappers, A. M., van der Lubbe, R. H., & Postma, A. (2003). Delay improves performance on a haptic spatial matching task. Experimental Brain Research, 149, 320−330.