BRAIN RESEARCH 1223 (2008) 59–64


Research Report

Evidence of beat perception via purely tactile stimulation

Renaud Brochard a,⁎, Pascale Touzalin b, Olivier Després b, André Dufour b

a Laboratoire SMPS, Université de Bourgogne, POLE AAFE, Esplanade Erasme, 21000 Dijon, France
b Laboratoire d'Imagerie et Neurosciences Cognitives, UMR 7191 CNRS, ULP, 21 rue Becquerel, 67087 Strasbourg, France
ARTICLE INFO

Article history:
Accepted 20 May 2008
Available online 28 May 2008

Keywords:
Auditory perception
Tactile perception
Sensorimotor coordination
Meter
Beat
Rhythm

ABSTRACT

Humans can easily tap in synchrony with an auditory beat but not with an equivalent visual rhythmic sequence, suggesting that the sensation of meter (i.e. of an underlying regular pulse) may be inherently auditory. We assessed whether the perception of meter could also be felt with tactile sensory inputs. We found that, when participants were presented with identical rhythmic sequences filled with either short tones or hand stimulations, they could more efficiently tap in synchrony with strongly rather than weakly metric sequences. These observations suggest that non-musician adults can extract the metric structure of purely tactile rhythms and use it to tap regularly with the beat induced by such sequences. This finding represents a challenge for present models of rhythm processing.

© 2008 Elsevier B.V. All rights reserved.
1. Introduction

From a very young age, most human beings will spontaneously tap their feet or move their bodies in synchrony with music (Trehub and Hannon, 2006; Drake et al., 2000a,b). This kind of motor behavior, described in all human cultures, reflects the processing of meter, which is regarded as a universal property of music and, more importantly, as a specific trait of human cognition (Drake, 1998; see London (2004) for a review). No other animal species seems to display this rhythmic ability, although many are able to produce regular movements (Patel, 2006). Perceptually, meter is experienced as the alternation of strong and weak beats according to an underlying regular pulse. More precisely, certain time positions within a metric musical sequence seem to be accented (strong beats) at regular time intervals. One accent every two beats leads to the sensation of a binary (or duple) meter, whereas one accent every three beats leads to a ternary (or triple) meter. Metric expectancies are automatically superimposed on any rhythmic sequence, depending on the meter inferred from the structure of the first
events of the sequence (Keller and Repp, 2005; Repp, 2005). Metric accents may arise from differences in the physical attributes of the adjacent sounding events (longer, higher or louder tones). However, as the accenting process is subjective in essence, it also occurs if all the sounds of a regular sequence are physically identical (Povel and Okkerman, 1981; Brochard et al., 2003), as well as in the absence of certain sounding events (leading to the sensation of syncopation, Snyder and Large, 2004; Jongsma et al., 2005). The cerebral response to perceptual changes, occurring on stronger beats, is greater than that to weaker metric positions (Abecasis et al., 2005; Zanto et al., 2006; Potter et al., in press). Recently, Patel et al. (2005) showed that it was almost impossible for human participants to extract the metric structure within the visual modality. When presented with sequences composed of short tones, participants had no difficulty synchronizing finger taps with clearly binary sound sequences, but they could not do so when presented with equivalent visual stimuli (flashes). These findings indicated that the processing of metric information may originate in the close and specific relationships between the auditory and sensorimotor

⁎ Corresponding author. E-mail address: [email protected] (R. Brochard).
doi:10.1016/j.brainres.2008.05.050
systems. Similar relationships between the visual and sensorimotor systems, however, were not apparent, at least in the time domain. The human inability to infer meter from visual information was recently confirmed in adult participants presented with short ambiguous rhythmic sequences whose components could potentially be subjectively accented in a binary or ternary way (Phillips-Silver and Trainor, 2007). When preceded by an auditory sequence strongly stating a specific meter (either binary or ternary), grouping accents could be disambiguated according to the meter of this auditory primer. This effect was also observed when participants were asked to bounce their legs to a specific meter. However, watching people move according to a binary or ternary meter had no effect on auditory accenting in motionless participants. Similar conclusions could be drawn from 7-month-old infants (Phillips-Silver and Trainor, 2005).

It is unclear, however, if the ability to process meter is restricted to the auditory modality or if humans can feel the beat (i.e. the metric structure) outside the sound domain, for example through a different sensory modality. To answer this question, we compared auditory with tactile stimulations using an adaptation of the experimental procedure described by Patel et al. (2005). Participants were asked to tap with their right index finger as regularly as possible in synchrony with strongly metric patterns (Strongly Metric Sequences or SMS, see Figs. 1 and 2), in which the beat was easy to follow, or with weakly metric patterns (Weakly Metric Sequences or WMS), in which the beat was hard to extract from the temporal structure of all the rhythmic events. These rhythmic sequences were composed of identical short tones (auditory stimuli) or quick ticklings of the tip of the left index finger (tactile stimuli).
If participants were able to extract metric information from a purely tactile stimulation, we expected their tapping to parallel the pattern of motor responses measured with auditory stimuli. In addition, two other types of sequences served as controls. The first (I-800 condition in Patel et al. (2005)) consisted of the presentation of a sequence of the same length as that of the rhythmic sequences but completely isochronous (Inter-Onset Interval = 800 ms). This condition was designed to compare sensorimotor performance with regular sequences in the auditory and tactile domains, since the relative superiority of sensorimotor coordination with an isochronous pace made of tactile versus auditory inputs is unclear (see, for example, Al-Attar et al., 1998, or Müller et al., 2008). The second control condition (absent from Patel et al. (2005)) consisted of an inducer segment composed of only 9 initial isochronous beats (IOI = 800 ms) followed by a period of silence, during which participants were asked to continue tapping at the same pace as that of the inducer (see Fig. 1). This condition was designed to estimate how well participants could keep a steady pace in the absence of any sensory stimulation. Pacing performance in silence was thus expected to be the poorest of all conditions, since the to-be-produced interval (800 ms) was delineated by existing event onsets in the three other conditions.

2. Results

We performed an ANOVA on inter-tap variability (standard deviation of inter-tap intervals), with the type of sequence (isochronous, SMS, WMS, silence) and the sensory modality (auditory vs. tactile) as within-subject factors (Fig. 3). This analysis showed that only the type of sequence had a significant main effect (F(3, 27) = 26.053, p < 0.001), whereas sensory modality did not (F(3, 27) = 0.37, p = 0.78), and the interaction between the two factors was not statistically significant (F(1, 9) = 3.12, p = 0.11). These findings are in good agreement with values of tapping accuracy measured under equivalent auditory conditions (Patel et al., 2005). Results using visual stimuli, however, did not resemble auditory stimulation (Patel et al., 2005): whereas tapping with the beat was more accurate with auditory SMS than with WMS, no statistical difference was found between the visual SMS and WMS. In our experiment, the motor performance measured with tactile stimulation parallels very closely the pattern of performance in the corresponding auditory conditions. Regardless of the sensory modality, tapping with the beat was significantly less variable when the meter could be clearly derived from the sequence (SMS) than when it was less clear (WMS) (Newman–Keuls post-hoc test, p < 0.05). This suggests that an easy-to-infer meter helped participants pace their finger movements at a regular time interval (800 ms). However, in agreement with previous findings (Patel et al., 2005), tapping accuracy in SMS (averaged over both modalities) was significantly poorer than with isochronous sequences (Newman–Keuls post-hoc test, p < 0.05). Finally, in the "silence" condition, in which participants were instructed to continue tapping at a regular pace after the induced beat was stopped, motor productions were the most variable, even though, as for the 3 other conditions, no significant difference was found between modalities. This suggests that the representation of a specific time interval to be produced was encoded with the same precision from sound as from tactile stimulations.

Fig. 1 – Representation of the four experimental conditions. Each vertical bar represents one sensory event (auditory or tactile stimuli). The dotted arrows represent the time interval between two beats (Inter-Onset Interval = 800 ms). For strongly and weakly metric sequences, the 10 segments following the inducer were randomly selected from two sets of 15 segments (see Fig. 2).

Fig. 2 – Representation of the 15 possible strongly (left panel) or weakly (right panel) metrical segments that could be randomly concatenated in rhythmic sequences. All the sensory events (bars) were identical and separated by silences (small squares symbolize missing events). The events occurring on the first beat of each measure (every 800 ms) are displayed in black for visual convenience only.
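The F ratio for the main effect of sequence type can be illustrated with a minimal one-way repeated-measures ANOVA computed by hand. This is an illustrative sketch, not the analysis code used in the study (the actual design crossed sequence type with modality), and the function name is ours:

```python
import numpy as np

def rm_anova_f(data):
    """One-way repeated-measures ANOVA F ratio.

    data: array of shape (n_subjects, n_conditions), one cell per
    subject x condition (e.g. the SD of inter-tap intervals).
    """
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between conditions
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_total = ((data - grand) ** 2).sum()
    ss_error = ss_total - ss_cond - ss_subj                  # subject x condition residual
    df_cond, df_error = k - 1, (n - 1) * (k - 1)
    f = (ss_cond / df_cond) / (ss_error / df_error)
    return f, df_cond, df_error
```

With 10 participants and 4 sequence types, the degrees of freedom come out as (3, 27), matching the F(3, 27) reported above.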

3. Discussion

Fig. 3 – Inter-tap variability (mean SD ± SEM) of auditory and tactile responses to the four types of sequences. The "silence" condition corresponds to tapping during a silent period following a 9-beat regular inducer sequence (see Experimental procedures for details).

The ability to process meter has long been thought to be restricted to the auditory modality, since a pulse sensation could not be experienced with visual stimuli (Patel et al., 2005; Phillips-Silver and Trainor, 2007). Here we investigated the potential existence of meter within the somesthetic domain. A group of participants presented with identical rhythmic sequences of events displayed via the auditory or tactile modality was asked to synchronize finger tapping to the inferred beat of each sequence. We found that participants were able to abstract the metric structure from tactile rhythmic sequences as efficiently as from equivalent auditory patterns. Indeed, during debriefing, almost every participant reported having felt the pulse of most tactile sequences. To our knowledge, this is the first experimental evidence of the ability of humans to extract more than one level of temporal periodicity out of purely tactile sequences.

Neither formal musical expertise nor explicit knowledge was needed to perform this task, since all participants were nonmusicians. Patel et al. (2005) reported that two participants could effectively tap in synchrony with metric visual patterns, but both were aware of the temporal structure of the stimuli and may have adapted their perceptual strategies to perform this task. In assessing the tapping performance of all our participants, we found that processing metric information from tactile inputs was as effortless and spontaneous as from auditory stimuli.

Similar to previous results (Patel et al., 2005), we found that our participants were not more accurate in the presence of strongly metric than isochronous sequences. However, our participants were not rhythm experts. Musicians have been found to be more sensitive to meter hierarchies than nonmusicians (Vuust et al., 2005; Drake et al., 2000b; Jongsma et al., 2005); therefore, nonmusicians tend to process time structure less hierarchically and more sequentially. Nevertheless, tapping to WMS was significantly less regular than to SMS, suggesting that metric information actually helps keep an accurate regular pace in the presence of rhythmic patterns.

Meter processing may be grounded within the close relationships established, early in life, between body movements and the auditory system (Fraisse, 1982; Trehub and Hannon, 2006). In addition, the vestibular system may mediate this relationship during the processing of metric information (Phillips-Silver and Trainor, 2005, 2007).
The latter finding, however, was inferred indirectly from differences in rhythm processing in the presence (or absence) of body movements during auditory presentations. In our study, participants were instructed to remain
as still as possible, with the only movement allowed being finger tapping, making direct stimulation of the vestibular system during the tactile trials unlikely. Therefore, if the vestibular system is implicated in meter processing, it very likely established the same types of relationships with the somesthetic system as with the auditory system. Sensorimotor synchronization was recently compared with a strictly isochronous pacing signal provided in the auditory modality (binaural sound stimuli) or applied on the hand or toe of the participant (Müller et al., 2008). Of the three dipole sources underlying magnetic brain activity measured during performance, two were common to the auditory and tactile modalities, suggesting that sensorimotor coordination may be partly realized via modality-independent cerebral networks. The third cerebral source, however, was situated in a more superior and anterior location in the auditory than in the somesthetic contexts, suggesting that modality-specific brain structures also may be involved in synchronization with isochronous sequences. Therefore, it would now be important to reconsider the functional commonalities and specificities of these neural networks in the light of the development of expectancies for higher temporal periodicities out of complex tactile and/or auditory metric sequences. In a recent brain imaging study, Grahn and Brett (2007) reported that several cerebral structures involved in motor processing were activated during meter perception. It was suggested that metric expectancies could result from specific interactions between sensory inputs and amodal processing of time intervals realized by the basal ganglia (Patel, 2006). Our finding, that meter can be extracted from a non-auditory sensory modality, is in agreement with this hypothesis. Several current cognitive models of rhythm perception are based on coupled oscillators entrained with the temporal structure of the sensory inputs (Large and Jones, 1999). 
Little is known, however, about the neurobiological nature of these oscillatory mechanisms. Moreover, these models are based almost exclusively on auditory research, although they were meant to be applied to other sensory modalities (see Jones, 1976; Jones and Boltz, 1989). Our demonstration of meter perception in response to purely tactile input needs to be incorporated into these models. It is still unclear whether meter is a cognitive trait exclusively found in humans (see Trehub and Hannon, 2006). The processing of metric information may be based on widespread synchronization mechanisms shared with non-human species (e.g. the flashing of fireflies, Buck, 1988, or birds; see Patel, 2006). Consequently, the absence in humans, in the visual domain, of such a general capacity for synchronization at multiple temporal hierarchies remains to be explained.
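As an intuition pump for such entrainment models, the sketch below implements a toy phase- and period-correction oscillator, in the spirit of, but far simpler than, the coupled-oscillator account of Large and Jones (1999). The correction rule, gains, and function name are illustrative assumptions, not the published model:

```python
def entrain(onsets, period=1000.0, alpha=0.5, beta=0.1):
    """Toy adaptive oscillator tracking a train of stimulus onsets (ms).

    alpha: phase-correction gain; beta: period-correction gain.
    Returns the oscillator period after each onset.
    """
    expected = onsets[0]           # anchor the first prediction on the first event
    history = []
    for t in onsets[1:]:
        expected += period         # predict the next beat
        error = t - expected       # positive if the stimulus arrived late
        expected += alpha * error  # partial phase correction
        period += beta * error     # slower period correction
        history.append(period)
    return history

# Driven by an isochronous 800 ms sequence, the period drifts from its
# initial 1000 ms toward the stimulus period of 800 ms.
periods = entrain([i * 800.0 for i in range(40)])
```

The two gains mirror the phase- and period-coupling terms of entrainment models; with these values the error decays geometrically, so the internal period locks onto the external beat within a few tens of events.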

4. Experimental procedures

4.1. Participants

The participants consisted of 10 right-handed volunteers (6 females, 4 males) who had not received formal music education, could not read music scores, and could not play an instrument. All participants reported normal hearing and none had previously participated in a psychology experiment.


4.2. Stimuli

4.2.1. Auditory stimulation

Participants were presented with tone sequences adapted from three of the seven types of auditory sequences used by Patel et al. (2005). Digitized sounds were played through headphones by means of a computer sound card. In all sequences, the tones consisted of identical sine waves of the same duration (50 ms, with 10 ms onset and offset ramps), frequency (440 Hz) and loudness level (70 dB SPL). All sequences started with an inducer, consisting of 9 beats (i.e. 9 tones separated by an 800 ms IOI), followed immediately by 10 randomly concatenated rhythmic segments chosen from among 15 distinct segments belonging to the same condition (see Figs. 1 and 2; Patel et al., 2005). Each segment consisted of four 800 ms "measures", each composed of one to four tones separated by multiples of 200 ms IOI. When absent, a tone was replaced with a silence of the same duration. The first tone of each measure represented the beat of the sequence (black bars in Fig. 2). The segments were strictly identical to those described by Patel et al. (2005), based on previous models (Povel and Essens, 1985).

In the first condition (SMS), the first tone of each measure (i.e. each beat position) was always physically present in each segment. Such sequences have been shown to induce a strong feeling of a beat occurring at a regular pace, i.e. every 800 ms (Povel and Essens, 1985); using this type of rhythmic auditory sequence, listeners tapped very accurately in synchrony with the beat (Patel et al., 2005). In the second condition (WMS), the first tone of the second or third measure (i.e. the second or third beat) was sometimes absent; using this type of rhythmic auditory sequence, listeners had greater difficulty tapping in synchrony with the induced auditory beat every 800 ms (Patel et al., 2005). In the third condition (isochronous), the 10 concatenated segments following the inducer were identical, each consisting of four tones separated by an 800 ms IOI. This completely isochronous sequence was used as a control for sensorimotor synchronization at a regular pace. In the fourth condition (silence), which was not present in the study of Patel et al. (2005), the sequence stopped after the ninth event (beat) of the inducer and the participants were asked to continue to tap regularly at the same pace as that of the inducer (800 ms IOI) until they heard a stop signal, which occurred 32 s after the inducer.
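The timing scaffold of these sequences, a 9-beat inducer followed by 10 four-measure segments on a 200 ms grid, can be sketched as follows. The segment generator here is a random placeholder (the study used the 15 fixed segments per condition shown in Fig. 2), and all function names are ours:

```python
import random

IOI = 800             # beat interval in ms
GRID = 200            # smallest subdivision in ms
SLOTS = IOI // GRID   # four 200 ms slots per 800 ms measure

def make_segment(strongly_metric, rng):
    """One segment = four measures of four boolean slots (True = tone).

    Placeholder: the real experiment drew from 15 fixed segments per
    condition (Fig. 2) rather than generating them at random.
    """
    measures = []
    for _ in range(4):
        slots = [rng.random() < 0.5 for _ in range(SLOTS)]
        if strongly_metric:
            slots[0] = True   # SMS: the tone on the beat is always present
        measures.append(slots)
    return measures

def onset_times(strongly_metric=True, seed=0):
    """Event onset times (ms) for a 9-beat inducer plus 10 segments."""
    rng = random.Random(seed)
    onsets = [i * IOI for i in range(9)]      # isochronous inducer
    t0 = 9 * IOI
    for _ in range(10):
        for measure in make_segment(strongly_metric, rng):
            for slot, tone in enumerate(measure):
                if tone:
                    onsets.append(t0 + slot * GRID)
            t0 += IOI
    return onsets
```

In an SMS sequence built this way, every one of the 40 post-inducer beat positions carries an event, which is what makes the 800 ms pulse easy to extract.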

4.2.2. Tactile stimulation

The tactile sequences were strictly identical to the auditory sequences described above, except that each tone was replaced by a tactile stimulation applied to the participant's left index finger. The tactile stimulator consisted of a metallic stem, 2 mm long and 1 mm in diameter, which rose from a flat metal surface (2 × 1 cm) for 50 ms. The stimulator was connected to a computer via an electronic interface. During the entire block, participants wore soundproof headphones, which prevented them from hearing their response tapping. The headphones that delivered the auditory stimuli were similarly isolated.

4.3. Experimental procedure

Participants were instructed to tap as regularly as possible in synchrony with the beat of the sequence, as prescribed by the
inducer. Each sequence was separated by a 30-second resting period. Except for finger tapping, all subjects were instructed to move their bodies as little as possible during sequence presentation. Each block consisted of only auditory or tactile stimuli, with each experimental condition presented 5 times in random order within each block, and there was a break of at least 20 min between the blocks. The order of the two same-modality blocks was counterbalanced within the group of participants.

4.4. Tapping measurement

Participants tapped with their right index fingers on the space bar of a computer keyboard. Each tap onset was recorded from the beginning of the sequence, and these tap times were compared with the expected metrical position (every 800 ms), whether or not a sensory event was physically present. Inter-tap intervals (ITIs) shorter than 400 ms (considered as double-taps) or longer than 1600 ms (considered as missing beats) were automatically excluded. Double taps and missing beats, however, were quite rare, each representing an average of 0.1% of the total motor productions under auditory and tactile conditions. To compare our results with those of Patel et al. (2005), we used the standard deviation of ITIs as the dependent variable.
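The dependent variable can be reproduced from raw tap times with a few lines of code. This is a minimal sketch of the exclusion rule and SD computation described above; the function name is ours:

```python
import statistics

def inter_tap_sd(tap_times_ms):
    """SD of inter-tap intervals (ITIs) after the exclusion rule above.

    ITIs shorter than 400 ms are treated as double-taps and ITIs
    longer than 1600 ms as missing beats; both are excluded.
    """
    itis = [b - a for a, b in zip(tap_times_ms, tap_times_ms[1:])]
    kept = [iti for iti in itis if 400 <= iti <= 1600]
    return statistics.stdev(kept)
```

Perfectly regular tapping at the 800 ms target thus yields an SD of 0, and any deviation from the beat inflates it, which is why the measure indexes synchronization accuracy.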

REFERENCES

Abecasis, D., Brochard, R., Granot, R., Drake, C., 2005. Differential brain responses to metrical accents in isochronous auditory sequences. Music Percept. 22 (3), 549–562.
Al-Attar, Z., O'Boyle, D.J., Cody, F.W.J., 1998. Effects of site of delivery of an electrical cutaneous metronome on the magnitude of the synchronization error during human temporal tracking. J. Physiol. 509, 181–182.
Brochard, R., Abecasis, D., Potter, D., Ragot, R., Drake, C., 2003. The "ticktock" of our internal clock: direct brain evidence of subjective accents in isochronous sequences. Psychol. Sci. 14 (4), 362–366.
Buck, J., 1988. Synchronous rhythmic flashing of fireflies. II. Quart. Rev. Biol. 63, 265–289.
Drake, C., 1998. Psychological processes involved in the temporal organization of complex auditory sequences: universal and acquired processes. Music Percept. 16, 11–26.
Drake, C., Jones, M.R., Baruch, C., 2000a. The development of rhythmic attending in auditory sequences: attunement, referent period, focal attending. Cognition 77 (3), 251–288.
Drake, C., Penel, A., Bigand, E., 2000b. Tapping in time with mechanically and expressively performed music. Music Percept. 18, 1–23.
Fraisse, P., 1982. Rhythm and tempo. In: Deutsch, D. (Ed.), The Psychology of Music. Academic Press, New York, pp. 149–180.
Grahn, J.A., Brett, M., 2007. Rhythm and beat perception in motor areas of the brain. J. Cogn. Neurosci. 19, 893–906.
Jones, M.R., 1976. Time, our lost dimension: toward a new theory of perception, attention, and memory. Psychol. Rev. 83 (5), 323–355.
Jones, M.R., Boltz, M., 1989. Dynamic attending and responses to time. Psychol. Rev. 96 (3), 459–491.
Jongsma, M.L., Eichele, T., Quian Quiroga, R., Jenks, K.M., Desain, P., Honing, H., et al., 2005. Expectancy effects on omission evoked potentials in musicians and non-musicians. Psychophysiology 42 (2), 191–201.
Keller, P.E., Repp, B.H., 2005. Staying offbeat: sensorimotor syncopation with structured and unstructured auditory sequences. Psychol. Res. 69 (4), 292–309.

Large, E.W., Jones, M.R., 1999. The dynamics of attending: how people track time-varying events. Psychol. Rev. 106, 119–159.
London, J., 2004. Hearing in Time: Psychological Aspects of Musical Meter. Oxford University Press, USA.
Müller, K., Aschersleben, G., Schmitz, F., Schnitzler, A., Freund, H.-J., Prinz, W., 2008. Inter- versus intramodal integration in sensorimotor synchronization: a combined behavioral and magnetoencephalographic study. Exp. Brain Res. 185, 309–318.
Patel, A.D., 2006. Musical rhythm, linguistic rhythm and human evolution. Music Percept. 24 (1), 99–104.
Patel, A.D., Iversen, J.R., Chen, Y., Repp, B.H., 2005. The influence of metricality and modality on synchronization with a beat. Exp. Brain Res. 163 (2), 226–238.
Phillips-Silver, J., Trainor, L.J., 2005. Feeling the beat: movement influences infant rhythm perception. Science 308 (5727), 1430.
Phillips-Silver, J., Trainor, L.J., 2007. Hearing what the body feels: auditory encoding of rhythmic movement. Cognition 105 (3), 533–546.
Povel, D.J., Okkerman, H., 1981. Accents in equitone sequences. Percept. Psychophys. 30 (6), 565–572.

Potter, D., Abecasis, D., Fenwick, M., Brochard, R., in press. Perceiving rhythm where none exists: event-related potential (ERP) correlates of subjective accenting. Cortex.
Povel, D.J., Essens, P., 1985. Perception of temporal patterns. Music Percept. 2, 411–440.
Repp, B.H., 2005. Sensorimotor synchronization: a review of the tapping literature. Psychonom. Bull. Rev. 12, 969–992.
Snyder, J.S., Large, E.W., 2004. Tempo dependence of middle- and long-latency auditory responses: power and phase modulation of the EEG at multiple time-scales. Clin. Neurophysiol. 115 (8), 1885–1895.
Trehub, S.E., Hannon, E.E., 2006. Infant music perception: domain-general or domain-specific mechanisms? Cognition 100 (1), 73–99.
Vuust, P., Pallesen, K.J., Bailey, C., van Zuijen, T.L., Gjedde, A., Roepstorff, A., et al., 2005. To musicians, the message is in the meter: pre-attentive neuronal responses to incongruent rhythm are left-lateralized in musicians. Neuroimage 24 (2), 560–564.
Zanto, T.P., Snyder, J.S., Large, E.W., 2006. Neural correlates of rhythmic expectancy. Adv. Cogn. Psychol. 2 (2–3), 221–231.