Implicit Visual Analysis in Handedness Recognition

CONSCIOUSNESS AND COGNITION 7, 478–493 (1998)
Article No. CC980368
Maurizio Gentilucci,¹ Elena Daprati, and Massimo Gangitano

Institute of Human Physiology, University of Parma, via Volturno 39, I-43100 Parma, Italy

In the present study, we addressed the problem of whether hand representations, derived from the control of hand gestures, are used in handedness recognition. Pictures of hands and fingers, assuming either common or uncommon postures, were presented to right-handed subjects, who were required to judge their handedness. In agreement with previous results (Parsons, 1987, 1994; Gentilucci, Daprati, & Gangitano, 1998), subjects recognized handedness through mental movement of their own hand in order to match the posture of the presented hand. This was proved by a control experiment of physical matching. The new finding was that presentation of common finger postures affected responses differently from presentation of less common finger postures. These effects could not be attributed to mental matching movements, nor could they be related to richness in hand–finger cues useful for handedness recognition. The results of the present study are discussed in the context of the notion that implicit visual analysis of the presented hands is performed before mental movement of one's hand takes place (Parsons, 1987; Gentilucci et al., 1998). In this process, hand representation acquired by experience in the control and observation of one's own and other people's hand gestures is used. We propose that such an immediate recognition mechanism belongs to the class of mental processes grouped under the name of intuition, that is, the processes by which situations or people's intentions are immediately understood, without conscious reasoning. © 1998 Academic Press

INTRODUCTION

The use of visual information in guiding and learning movements is fundamental to the production of the highly dextrous hand and finger movements that are typical of primates. Vision of other people's hand interactions with objects, and their matching with one's own movements, is thought to be involved not only in reproducing, but also in recognizing and understanding, an action (Gallese, Fadiga, Fogassi, & Rizzolatti, 1996; Rizzolatti, Fadiga, Gallese, & Fogassi, 1996). Previous behavioral experiments (Parsons, 1987, 1994; Parsons, Fox, Downs, Glass, Hirsch, Martin, Jerabek, & Lancaster, 1995; Gentilucci et al., 1998) have shown that humans, when required to recognize other people's handedness, mentally rotate their own hand in order to match it with the presented one. Indeed, response times in handedness recognition increase proportionally with the number of arm joints involved in mental rotation and with the length of imagined hand trajectories (Parsons, 1994; Gentilucci et al., 1998). This process, however, is confirmatory, being preceded by a covert visual analysis of the hand by which handedness is automatically identified (Parsons, 1987; Gentilucci et al., 1998). In a previous experiment, we found that, in covert visual analysis of hands interacting with objects, right-handers use representations derived from experience in control and observation of one's own and other people's actions (Gentilucci et al., 1998).

¹ Address reprint requests to Maurizio Gentilucci, Institute of Human Physiology, University of Parma, via Volturno 39, I-43100 Parma, Italy. E-mail: [email protected].

Copyright © 1998 by Academic Press. All rights of reproduction in any form reserved.

INTUITION AND MOTOR REPRESENTATION

The aim of the present experiment was to determine whether right-handers use the same type of representation in covert visual analysis of hand gestures (hand-control-based analysis). Alternatively, covert analysis could be performed on hand–finger cues, that is, on hand–finger details and spatial relations among fingers (hand-cues-based analysis). In order to discriminate between the two hypotheses, an experimental paradigm was devised in which, as in previous experiments (Parsons, 1987, 1994; Gentilucci et al., 1998), right-handed subjects were presented with either right or left hands and forearms by means of slides. Hand and forearm were differently oriented. In addition, in order to emphasize either type of visual analysis, hand–finger configurations were varied in the richness of finger details and spatial relations among fingers (hand-cues-based analysis) as well as in the frequency of the assumed hand and finger postures (hand-control-based analysis). Subjects judged whether the right or the left hand was presented by pressing a key with the corresponding hand. Response times were measured. We verified whether richness of finger details and spatial relations among fingers, or frequency of assumed hand and finger postures, caused variation in the time of handedness recognition. In order to discriminate between the effects of finger posture on implicit visual analysis of the hand and on mental matching, a control experiment of physically matching the same presented hands was performed.

EXPERIMENT 1

Methods

Ten right-handed subjects (4 females and 6 males, age 21–35) participated in the present experiment. Hand preference was assessed according to the Edinburgh Inventory (Oldfield, 1971). All the subjects were naive as to the purpose of the experiment.

The subjects sat on a comfortable chair in front of a table. They placed their head on a head-and-chin rest and positioned their right and left index fingers upon two microswitches placed on the plane of the table, on the right and left of the subject's midline, respectively. The microswitches were placed 16 cm apart from each other and 47 cm away from the subject's frontal plane. Subjects' index fingers were extended, whereas the other fingers were flexed. A covering box prevented vision of their hands and forearms.

Stimuli were color slides of either right or left real hands and forearms. They were presented by means of a slide projector equipped with a tachistoscopic shutter. The slides were projected on a screen placed 134 cm from the viewer. The slides showed either backs or palms of hands. In addition, the forearm appeared either on the right or on the left part of the slide, originating from its left or right side (Fig. 1). Note that, in order to match the presented forearm position, subjects had either to flex or to extend their forearm at the elbow. For the right position of a presented right hand and for the left position of a presented left hand, they had to flex their forearm. For the left position of a presented right hand and for the right position of a presented left hand, they had to extend their forearm.

Finger posture of the presented hands was varied. Fingers could be extended or flexed. According to the extended fingers, the following configurations were presented (Fig. 1): no finger (fist), index finger, index–middle fingers, thumb–pinkie, thumb–index–middle fingers, index–middle–ring fingers, and all fingers (open hand). It should be noted that: (1) richness in details useful for hand analysis increases with increasing number of extended fingers; (2) spatial reference for handedness recognition is mainly provided by visibility of the extended thumb (thumb–pinkie, thumb–index–middle fingers, all fingers (open hand)); (3) no finger (fist), all fingers (open hand), index finger, and index–middle fingers are configurations showing common finger postures, whereas thumb–pinkie, thumb–index–middle fingers, and index–middle–ring fingers are configurations showing uncommon finger postures (Fig. 1).

Stimuli subtended approximately 20° of visual angle. Subjects were free to move their eyes, but not their head. The stimuli were presented for 3 s. The subjects were required to make a right–left judgment about the presented hand by pressing either the right or the left microswitch, respectively. Four seconds from slide presentation were allowed for the response. Response time, that is, the interval between stimulus presentation and pressing of the microswitch, was recorded. Response times longer than 4 s were discarded and the corresponding trials were repeated at the end of each block (see below). Trials in which the right–left hand judgment was wrong were also repeated at the end of each block (see below).

The experimental design included 4 factors: hand (right vs. left), hand side (palm vs. back), forearm position (right vs. left), and finger configuration (fist vs. index finger vs. index–middle fingers vs. thumb–pinkie vs. index–middle–ring fingers vs. thumb–index–middle fingers vs. all fingers). Four blocks of trials were run. In each block, one trial was presented for each combination of the levels of the 4 factors. That is, each block consisted of 56 trials.
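The factorial structure of a block can be made concrete with a short sketch (the label strings are ours, reconstructing the 2 × 2 × 2 × 7 design described above):

```python
from itertools import product

# Factor levels of experiment 1 (labels are illustrative reconstructions).
hands = ["right", "left"]
hand_sides = ["palm", "back"]
forearm_positions = ["right", "left"]
finger_configurations = [
    "fist", "index", "index-middle", "thumb-pinkie",
    "index-middle-ring", "thumb-index-middle", "open hand",
]

# One trial per combination of factor levels in each block.
block = list(product(hands, hand_sides, forearm_positions, finger_configurations))
print(len(block))  # 56 trials per block, as stated in the text
```

Crossing all four factors reproduces the stated block length of 56 trials (2 × 2 × 2 × 7).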
Response times were submitted to an ANOVA whose within-subjects factors were hand, hand side, forearm position, and finger configuration. The Newman–Keuls post hoc test was used. The correlation coefficient was calculated between mean response times in the various experimental conditions and the corresponding numbers of errors of right–left judgment.

Results

Subjects recognized right hands more quickly than left hands (F(1, 9) = 5.8, p < .04; 1444 vs. 1541 ms). Handedness recognition was influenced by hand side (F(1, 9) = 7.4, p < .02): it was faster for backs (1403 ms) than for palms (1582 ms). That forearm position also affected handedness recognition is indicated by its interaction with both hand and hand side (F(1, 9) = 13.9, p < .005). Response times were longer for palms of right hands whose forearm was placed on the left and for palms of left hands whose forearm was placed on the right (Fig. 2). These postures were the most awkward among the presented ones.

FIG. 1. Representation of the finger configurations presented to the subjects. Figures are drawn from color slides presenting real hands. In the right column, backs of right hands with forearms placed on the right are presented. In the left column, palms of left hands with forearms placed on the left are presented. Finger configurations are shown as a function of the frequency of the posture assumed by the fingers in everyday life. Common postures are shown in the upper four rows and uncommon postures in the lower three rows.


FIG. 2. Effect of back (filled symbols) and palm (open symbols) presentation on handedness recognition (experiment 1). Modulation of the effect of forearm position (x-axis) on the left hand (upper panel) and on the right hand (lower panel) is shown. Y-axis: response times. Whiskers are standard errors of means.

Finger configuration significantly influenced response times (F(6, 54) = 3.14, p < .01). However, the only significant difference was found between the index-middle-ring-fingers configuration and the index-finger and open-hand configurations (p = .06). Moreover, the regression line of the response times as a function of finger configurations, ordered according to frequency of assumed finger posture (Fig. 1), was not significant (Fig. 3; response time = 1454.1 + 0.47 × finger configuration, p = .29). Finger configuration significantly influenced response time when its interaction with hand side was taken into account (F(6, 54) = 3.4, p < .007). Figure 4 shows that responses to the common finger configurations were faster or slower, depending on whether backs or palms were presented, respectively. These variations were significant (p < .005). This effect was not observed for the less common finger configurations, for which no significant variation in handedness recognition was found after presentation of palms and backs. The regression lines of the response times as a function of finger configurations were significant when they were calculated separately for backs and palms. For palm presentation, response time linearly decreased with decreasing frequency of assumed finger posture (Fig. 4; response time = 1630.4 − 0.51 × finger configuration, p < .05), whereas it increased for back presentation (Fig. 4; response time = 1298.3 + 0.78 × finger configuration, p < .03).
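The reported regression lines are ordinary least-squares fits of mean response time against position on the frequency-ordered configuration axis. A minimal sketch of such a fit (pure Python; the response-time values below are hypothetical, not the original data, and coding the configuration axis as rank 1–7 is our assumption):

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b * x; returns (intercept a, slope b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx          # slope: ms of response time per configuration step
    a = my - b * mx        # intercept
    return a, b

# Hypothetical mean response times (ms) for the seven configurations,
# ordered from most to least common posture (illustrative values only).
ranks = list(range(1, 8))
rts = [1310, 1320, 1335, 1350, 1355, 1370, 1380]
intercept, slope = fit_line(ranks, rts)
```

In this scheme, the sign of the fitted slope is what distinguishes the back-presentation line (positive) from the palm-presentation line (negative) reported above.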


FIG. 3. Effect of finger configurations on handedness recognition (experiment 1). Y-axis: response times. Hand–finger configurations (x-axis) are ordered according to the frequency of the posture assumed by the fingers; common configurations are reported in the left part and uncommon configurations in the right part. The straight line is the regression line. Other conventions as in Fig. 2.

Errors of right–left judgment were found in 9.5% of trials. Frequency of error increased with response time, to which it was significantly correlated (r = .57, F(1, 54) = 26.5, p < .00004).

EXPERIMENT 2

FIG. 4. Effect of hand–finger configurations on handedness recognition (experiment 1). Modulation of the effect due to back (filled triangles) and palm (filled circles) presentation is shown. Dashed and continuous lines are the regression lines for back and palm, respectively. Other conventions as in Fig. 3.

Following Parsons (1994), we assumed that handedness recognition consists of a first phase of implicit visual analysis of the presented hand and of a second, confirmatory phase of mental rotation of one's hand until it matches the presented one. The aim of experiment 2 was to determine which results of experiment 1 were caused by mental rotation of the hand and, consequently, which results could likely be attributed to implicit hand analysis. In experiment 2, subjects were required to physically match the presented hand with their own hand. Physical matching of hand postures, rather than mental matching, allowed us to verify its correct execution. In addition, it is known that timing data of physically executed movements are comparable with those of the corresponding mentally executed movements (Jeannerod, 1994; Parsons, 1994).

Methods

Ten new right-handed subjects (8 females and 2 males), matched for age to those of experiment 1, participated in the experiment. The criteria of subject selection were the same as in experiment 1. Apparatus and stimuli were also the same as in experiment 1.

The experimental session consisted of two series of four blocks of trials. In each series, slides of only right or only left hands were presented. Order of handedness presentation was counterbalanced among the subjects. At the beginning of each series, subjects were informed about the handedness of the presented hands and were required to press a microswitch located on the plane of the table with the index finger of the same hand as that presented by the slides in that series. Subjects assumed the same initial hand posture as in experiment 1. They were also required to press a pedal with their right or left foot. Half of the subjects used the right foot; the other half used the left foot. The subjects were required to match the posture of the presented hand with their own hand after presentation of each slide and to release the pedal as soon as they were sure to have assumed the correct hand and finger posture. Each block consisted of 28 trials.
That is, one trial was run for each combination of the levels of the factors hand side, forearm position, and finger configuration. The times of release of the microswitch and of the pedal, measured from stimulus presentation, were recorded. The following parameters were analyzed: reaction time, movement time, and response time. Reaction time was the interval between stimulus presentation and release of the microswitch. Movement time was the interval between release of the microswitch and release of the pedal. Response time was the interval between stimulus presentation and release of the pedal; that is, it was the sum of reaction time and movement time.

The arm kinematics during matching execution was recorded by using the ELITE system (B.T.S., Milan, Italy). This system consists of two TV cameras detecting infrared-reflecting markers at a sampling rate of 50 Hz. Movement reconstruction and computation of the kinematic parameters are described elsewhere (Gentilucci, Daprati, Toni, Chieffi, & Saetti, 1995). In the present study, one marker placed on the wrist was used in order to study the kinematics of the arm. The kinematic parameter analyzed in the present experiment was arm movement time. Arm movement was considered to start and stop at those samples in which displacement of the wrist marker became, respectively, greater and smaller than 0.4 mm (the spatial accuracy of the ELITE system; Gentilucci et al., 1995).
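The start/stop criterion amounts to thresholding inter-sample marker displacement. A minimal sketch of that segmentation (1-D positions for simplicity, whereas the ELITE markers are tracked in 3-D; the function and variable names are ours):

```python
def movement_bounds(positions, threshold_mm=0.4):
    """Return (start, stop) sample indices of movement: movement is taken to
    begin at the first inter-sample displacement exceeding the threshold and
    to end after the last such displacement. Returns None if no movement.
    At 50 Hz, consecutive samples are 20 ms apart."""
    steps = [abs(b - a) for a, b in zip(positions, positions[1:])]
    moving = [i for i, d in enumerate(steps) if d > threshold_mm]
    if not moving:
        return None
    return moving[0], moving[-1] + 1

# Synthetic 1-D wrist trajectory (mm): rest, movement, rest.
traj = [0.0, 0.1, 0.2, 1.0, 3.0, 6.0, 8.0, 8.2, 8.3, 8.35]
start, stop = movement_bounds(traj)
# Arm movement time would then be (stop - start) * 20 ms at 50 Hz.
```

Small drifts at rest (below the 0.4 mm spatial accuracy of the system) are ignored, so only the overt displacement between the rest plateaus counts as movement.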


Separate ANOVAs with the same factors as in experiment 1 were performed on reaction time, movement time, and arm movement time. The correlation coefficient was calculated between mean response times in the various experimental conditions and the corresponding numbers of errors of matching.

Results

Unlike what was found in experiment 1, matching was not significantly affected by the factor hand; that is, no significant difference between the right and the left hand was found for reaction time (right hand 755.3 ms, left hand 761.9 ms), movement time (right hand 950.8 ms, left hand 1117.6 ms), or arm movement time (right hand 1456.8 ms, left hand 1500.5 ms). As was the case for handedness recognition in experiment 1, physical matching was significantly faster for backs (reaction time 741.0 ms, movement time 970.0 ms) than for palms (reaction time 775.8 ms, F(1, 9) = 5.98, p < .04; movement time 1098.4 ms, F(1, 9) = 12.09, p < .007). The interaction among hand, hand side, and forearm position was also significant, but only for movement time (F(1, 9) = 22.32, p < .001). Movement times were longer for palms of right hands whose forearm was placed on the left and of left hands whose forearm was placed on the right (Fig. 5), that is, for the most awkward hand postures. Finger configuration significantly affected both reaction time (F(6, 54) = 3.73, p < .004) and movement time (F(6, 54) = 9.45, p < .00001). The regression lines relating reaction times to finger configurations (reaction time = 705.6 + 0.80 × finger configuration, p < .01) and movement times to finger configurations (movement time = 804.7 + 0.89 × finger configuration, p < .006) were significant (Fig. 6). Note that, in experiment 1, the corresponding regression line was not significant (compare Fig. 6 with Fig. 3). The interaction between finger configuration and hand side reached significance for movement time (F(6, 54) = 3.25, p < .008).
However, unlike what was found in experiment 1, no significant difference between palm and back presentation was found for the fist and open-hand configurations, whereas a significant difference was found for the thumb-index-middle-fingers and index-middle-ring-fingers configurations. In addition, Fig. 7 shows that the regression lines of reaction times and movement times as a function of finger configuration were approximately parallel when they were calculated separately for back and palm. They were all significant (Fig. 7; palm presentation: reaction time = 718.0 + 0.77 × finger configuration, p < .00001; back presentation: reaction time = 693.0 + 0.83 × finger configuration, p < .02; palm presentation: movement time = 856.6 + 0.79 × finger configuration, p < .0003; back presentation: movement time = 797.1 + 0.80 × finger configuration, p < .00001). Note that, in experiment 1, the corresponding regression lines diverged with decreasing frequency of assumed finger posture, instead of being parallel as in the present experiment (compare Fig. 7 with Fig. 4).

As expected, arm movement time was influenced only by forearm position, that is, by flexion–extension forearm movements. Indeed, the interaction between hand and forearm position was significant (F(1, 9) = 47.94, p < .00007). As shown in Fig. 8, forearm flexion was faster than forearm extension. Arm movement time was longer than movement time. This apparently surprising finding was due to the fact that subjects released the pedal when, after the ballistic phase of the movement, they felt sure of reaching the correct posture. Thus, since movement time measured the elaboration of a motor program more than its execution, it could be used to estimate the time of the corresponding mentally executed movement (Jeannerod, 1994).

FIG. 5. Effect of back (filled symbols) and palm (open symbols) presentation on response times (upper panels), reaction times (lower left panels), and movement times (lower right panels) in the physical-matching task (experiment 2). Modulation of the effect of forearm position (x-axis) on the left hand and the right hand is shown. Other conventions as in Fig. 2.

FIG. 6. Effect of finger configurations on response times (upper panel), reaction times (lower left panel), and movement times (lower right panel) in the physical-matching task (experiment 2). Regression line: response time = 1508.1 + 0.95 × finger configuration, p < .00001. Other conventions as in Fig. 3.


Errors of matching were observed in 8.1% of trials. Frequency of error increased with response time, to which it was significantly correlated (r = .58, F(1, 54) = 26.9, p < .00003).

DISCUSSION

We assumed that handedness recognition consists of two successive phases (Parsons, 1994; Parsons et al., 1995; Gentilucci et al., 1998): a first phase of implicit visual analysis of the presented hand is followed by a second, confirmatory phase of mental matching of one's hand with the presented one. The main aim of the present study was to determine the type of hand representation used to recognize handedness during implicit hand analysis.

The finding that handedness recognition in right-handers was faster for presentation of right hands than of left hands is in accordance with the results of previous studies (Parsons, 1987, 1994; Gentilucci et al., 1998). However, subjects were not correspondingly faster when they physically matched postures of right hands. Taken together, these results suggest that implicit visual analysis, more than mental matching, was influenced by handedness of the presented hand, and they confirm that immediate recognition of the most commonly used hand is easier (Gentilucci et al., 1998).

Response times in handedness recognition were longer when palms were presented than when backs were presented. During the experimental session, subjects' hands were prone and the backs of their hands faced them. Consequently, mental rotation of the wrist was performed only after palm presentation, which increased response time. This explanation is confirmed by the results concerning the physical matching movements (experiment 2).

In handedness recognition, response times increased mainly for right palms whose forearm appeared on the left part of the slide and for left palms whose forearm appeared on the right. These positions corresponded to the most awkward hand postures among those presented. In order to be matched with one's hand, they require both forearm extension and wrist extrarotation, starting from the hand's working position. In experiment 2, these positions took the longest time to reproduce physically.
FIG. 7. Effect of finger configurations on response times (upper panel), reaction times (lower left panel), and movement times (lower right panel) in the physical-matching task (experiment 2). Modulation of the effect due to back (filled triangles) and palm (filled circles) presentation is shown. Regression lines: back presentation (dashed line), response time = 1444.6 + 0.94 × finger configuration, p < .001; palm presentation (continuous line), response time = 1573.1 + 0.89 × finger configuration, p < .02. Other conventions as in Fig. 4.

FIG. 8. Effect of forearm position (right, filled symbols; left, open symbols) on arm movement time in the physical-matching task (experiment 2). Modulation of the effect due to hand is shown. Other conventions as in Fig. 2.

Physical matching in experiment 2 was influenced by finger configuration. Both reaction times, related to movement preparation, and movement times, related to movement control, increased with decreasing frequency of assumed finger posture. The same increase was observed after both back and palm presentation. In contrast, handedness recognition in experiment 1 was affected differently by finger configuration according to whether the back or the palm was presented: for presentation of hand backs, response time increased with decreasing frequency of assumed finger posture, whereas, for palm presentation, it decreased. In other words, in handedness recognition, common hand–finger postures, such as fist, open hand (all fingers), pointing index finger, and index and middle fingers (postures used when smoking a cigarette or making a V-sign), speeded and delayed responses when backs and palms were presented, respectively. This effect was clearly greater than that observed during physical matching. Conversely, for uncommon postures (e.g., the thumb–pinkie configuration), no difference was found between palm and back presentation.

On the basis of these results, we can exclude the possibility that, in handedness recognition, finger posture affected the duration of the second, confirmatory stage of mental rotation. Rather, the data suggest that finger postures influenced handedness recognition in the first, unconscious stage of visual hand analysis.

The finding that visual analysis of commonly assumed finger postures facilitated recognition of hand backs and hindered recognition of hand palms can be explained in two ways. The first possibility is that hand backs were recognized more easily than hand palms because subjects, at the initial stage of handedness recognition, used a hand representation constructed from proprioceptive information about the actual position of their own hands. If this were true, the index-finger configuration, which was also the posture initially assumed by the subjects' hands during the experimental session, should have been facilitated with respect to the others. This was not the case (see Fig. 4). The second, more plausible possibility is that a strict relation exists between implicit visual analysis of the hand and visual control of hand gesture (hand-control-based analysis). When one's hand assumes a common posture, at rest or in interactions with other people, hand backs rather than hand palms are commonly presented to the viewer. This can make recognition of backs progressively easier.
In contrast, since the visual appearance of the palm is less familiar, the palm of one hand can be confused with the back of the other, because they share the same shape. An error in immediate recognition of handedness might then cause reanalysis of the presented hand after mental matching. Such a reanalysis was likely time-consuming.


The finding that no difference between back and palm presentation was found for uncommon postures can be explained by postulating that, for these configurations, implicit visual analysis was performed on hand details and on spatial relations among fingers (hand-cues-based analysis). The duration of this type of analysis likely did not differ between backs and palms. Note, however, that the primary strategy was to use hand-control-based representations. In fact, if hand-cues-based analysis had been performed primarily, the fist and open-hand configurations, which differ in richness of visual details, should have produced different effects, and the open-hand and thumb–pinkie configurations, which show similar spatial references for handedness recognition (i.e., the relative position of thumb and pinkie), should have produced similar effects. Neither was observed.

In conclusion, we propose that handedness recognition is primarily based on hand representations derived from control and/or observation of one's own and other people's hand gestures. Note that the common postures tested in the present experiment were also meaningful postures. Because a strict correspondence between common and meaningful postures generally exists, their effects on handedness recognition could not be dissociated from each other. Further experiments will be carried out in which meaningful, but uncommon, hand postures will be presented to subjects performing a task of handedness judgment.

The existence of pragmatic hand representations may be supported by neurophysiological studies.
Neurons recorded in the monkey ventral premotor cortex (diPellegrino, Fadiga, Fogassi, Gallese, & Rizzolatti, 1992; Gallese et al., 1996; Rizzolatti et al., 1996) and superior temporal sulcus (Perrett, Rolls, & Caan, 1982; Perrett, Harries, Bevan, Thomas, Benson, Mistlin, Chitty, Hietanen, & Ortega, 1989; Perrett, Mistlin, Harries, & Chitty, 1990) become active when the animal observes goal-directed hand actions performed by another individual. The premotor neurons ("mirror" neurons, diPellegrino et al., 1992) differ from the superior temporal sulcus neurons in that they also respond to the same action when it is actively performed by the animal. The activation of motor and temporal areas during observation of goal-directed hand actions has also been found in humans (Fadiga, Fogassi, Pavesi, & Rizzolatti, 1995; Rizzolatti, Fadiga, Matelli, Bettinardi, Paulesu, Perani, & Fazio, 1996). The properties of the neurons recorded in the two areas led to the suggestion that the superior temporal sulcus region may be involved in constructing a pictorial representation of hand action, whereas the ventral premotor area may be related to a pragmatic representation (Gallese et al., 1996).

In the present experiment, both types of hand representation (pictorial and pragmatic) could be used for handedness recognition. However, hand gestures, but not hand–object interactions (Gentilucci et al., 1998), were tested in the present experiment. Nevertheless, given that the neurons of the ventral premotor cortex and of the superior temporal sulcus region are likely involved in understanding the meaning of goal-directed hand actions (Gallese et al., 1996), areas of the human premotor and/or temporal cortices may also be involved in recognizing hand gestures. In support of this hypothesis, single-neuron recording studies have shown that some superior temporal sulcus neurons were activated when the animal was presented with the hand of another monkey in a given posture (Carey, Perrett, & Oram, 1997).
Finally, it is important to stress the distinction between implicit (and automatic) recognition of action (and of hand gesture) and motor imagery. Although the two processes can share some common neural substrates (Grafton, Arbib, Fadiga, & Rizzolatti, 1996; Decety, Grezes, Costes, Perani, Jeannerod, Procyk, Grassi, & Fazio, 1997), they are, in our opinion, functionally distinct. Automatic action recognition relies on "internal models" by which an action is represented. Internal models are constructed through motor experience (Wolpert, Ghahramani, & Jordan, 1995), probably developing from a biological and psychological state already present in newborns (De Vries, Visser, & Prechtl, 1982; Spelke, Breinlinger, Macomber, & Jacobson, 1992). Motor imagery is a process more closely related to actual, physically executed action. Automatic action recognition uses an abstraction of the action, that is, a representation that maintains in memory only the peculiar features of that action (e.g., its aim, the body part involved, the postures required, etc.). In contrast, motor imagery follows the same rules as the real movement and varies for a given action according to its actual constraints. Thus, immediate action recognition can be related to the mental processes that are grouped under the name of intuition, that is, the process of understanding situations or people's intentions immediately, without conscious reasoning. Understanding emerges from automatic comparison of the perceived external world with internal models. Conversely, mental imagery can be related to the confirmation processes by which conscious knowledge of the external world is reached, moving from what has been intuited, that is, immediately and unconsciously understood.

ACKNOWLEDGMENTS

We are grateful to Dr. F. Benuzzi and Dr. L. Bertolani for their help in carrying out experiment 2. We thank Dr. L. Fogassi and Dr. V. Gallese for comments on the manuscript.
The work was supported by grants from BIOMED (BMH4-CT95-0789) and CNR (Consiglio Nazionale delle Ricerche) to the Institute of Human Physiology of Parma, and from MURST (Ministero dell'Università e della Ricerca Scientifica e Tecnologica) to the Institute of Human Physiology of Parma and to M. Gentilucci.

REFERENCES

Carey, D. P., Perrett, D. I., & Oram, M. W. (1997). Recognizing, understanding and reproducing action. In F. Boller & J. Grafman (Eds.), Handbook of neuropsychology (Vol. 11, pp. 111–129). Amsterdam, The Netherlands: Elsevier.

Decety, J., Grezes, J., Costes, N., Perani, D., Jeannerod, M., Procyk, E., Grassi, F., & Fazio, F. (1997). Brain activity during observation of action. Influence of action content and subject's strategy. Brain, 120, 1763–1777.

De Vries, J. I. P., Visser, G. H. A., & Prechtl, H. F. R. (1982). The emergence of fetal behavior. I. Quantitative aspects. Early Human Development, 7, 176–180.

diPellegrino, G., Fadiga, L., Fogassi, L., Gallese, V., & Rizzolatti, G. (1992). Understanding motor events: A neurophysiological study. Experimental Brain Research, 91, 176–180.

Fadiga, L., Fogassi, L., Pavesi, G., & Rizzolatti, G. (1995). Motor facilitation during action observation: A magnetic stimulation study. Journal of Neurophysiology, 73, 2608–2611.

Gallese, V., Fadiga, L., Fogassi, L., & Rizzolatti, G. (1996). Action recognition in the premotor cortex. Brain, 119, 593–609.

Gentilucci, M., Daprati, E., & Gangitano, M. (1998). Right-handers and left-handers have different representations of their own hand. Cognitive Brain Research, 6, 185–192.
Gentilucci, M., Daprati, E., Toni, I., Chieffi, S., & Saetti, M. C. (1995). Unconscious updating of grasp motor program. Experimental Brain Research, 105, 291–303.

Grafton, S. T., Arbib, M. A., Fadiga, L., & Rizzolatti, G. (1996). Localization of grasp representations in humans by positron emission tomography. 2. Observation compared with imagination. Experimental Brain Research, 112, 103–111.

Jeannerod, M. (1994). The representing brain. Neural correlates of motor intention and imagery. Behavioral and Brain Sciences, 17, 187–245.

Oldfield, R. C. (1971). The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia, 9, 97–113.

Parsons, L. M. (1987). Imaged spatial transformations of one's hands and feet. Cognitive Psychology, 19, 178–241.

Parsons, L. M. (1994). Temporal and kinematic properties of motor behavior reflected in mentally simulated action. Journal of Experimental Psychology, 26, 709–730.

Parsons, L. M., Fox, P. T., Downs, J. H., Glass, T., Hirsch, T. B., Martin, C. C., Jerabek, P. A., & Lancaster, J. L. (1995). Use of implicit motor imagery for visual shape discrimination as revealed by PET. Nature, 375, 54–58.

Perrett, D. I., Harries, M. H., Bevan, R., Thomas, S., Benson, P. J., Mistlin, A. J., Chitty, A. K., Hietanen, J. K., & Ortega, J. E. (1989). Frameworks of analysis for the neural representation of animate objects and actions. Journal of Experimental Biology, 146, 87–113.

Perrett, D. I., Mistlin, A. J., Harries, M. H., & Chitty, A. J. (1990). Understanding the visual appearance and consequence of hand actions. In M. A. Goodale (Ed.), Vision and action: The control of grasping (pp. 163–180). Norwood, NJ: Ablex.

Perrett, D. I., Rolls, E. T., & Caan, W. (1982). Visual neurones responsive to faces in the monkey temporal cortex. Experimental Brain Research, 47, 329–342.

Rizzolatti, G., Fadiga, L., Gallese, V., & Fogassi, L. (1996). Premotor cortex and the recognition of motor actions. Cognitive Brain Research, 3, 131–141.
Rizzolatti, G., Fadiga, L., Matelli, M., Bettinardi, V., Paulesu, E., Perani, D., & Fazio, F. (1996). Localization of grasp representations in humans by PET. 1. Observation versus execution. Experimental Brain Research, 111, 246–252.

Spelke, E. S., Breinlinger, K., Macomber, J., & Jacobson, K. (1992). Origins of knowledge. Psychological Review, 99, 605–632.

Wolpert, D. M., Ghahramani, Z., & Jordan, M. I. (1995). An internal model for sensorimotor integration. Science, 269, 1880–1882.

Received March 31, 1998