HEARES6883_proof ■ 5 January 2015 ■ 1/10
Hearing Research xxx (2015) 1–10
Research paper
Cochlear implant users move in time to the beat of drum music
Jessica Phillips-Silver a, b, *, Petri Toiviainen c, Nathalie Gosselin a, Christine Turgeon b, Franco Lepore b, Isabelle Peretz a, b

a International Laboratory for Brain, Music and Sound Research (BRAMS), Pavillon 1420 boul. Mont Royal, University of Montreal, Case Postale 6128, Station Centre-Ville, Montreal, Québec H3C 3J7, Canada
b Department of Psychology, University of Montreal, C.P. 6128, Succursale Centre-Ville, Montréal, Québec H3C 3J7, Canada
c Finnish Centre of Excellence in Interdisciplinary Music Research, Department of Music, P.O. Box 35, FI-40014, University of Jyväskylä, Jyväskylä, Finland
Article history: Received 26 June 2014 Received in revised form 18 December 2014 Accepted 22 December 2014 Available online xxx
Cochlear implant users show a profile of residual, yet poorly understood, musical abilities. An ability that has received little to no attention in this population is entrainment to a musical beat. We show for the first time that a heterogeneous group of cochlear implant users can find the beat and move their bodies in time to Latin Merengue music, especially when the music is presented in unpitched drum tones. These findings not only reveal a hidden capacity for feeling musical rhythm through the body in the deaf and hearing-impaired population, but also illuminate promising avenues for designing early childhood musical training that can engage implanted children in social musical activities, with benefits potentially extending to non-musical domains. © 2015 Published by Elsevier B.V.
1. Introduction

The history of musicians with impaired access to sound is of great renown: percussionist Dame Evelyn Glennie, British opera singer Janine Roebuck, Finnish rap artist Signmark (Marko Vuoriheimo), jazz vocalist Mandy Harvey, and composers Ludwig van Beethoven, William Boyce and Bedřich Smetana, whose deafness or hearing loss, whether occurring in childhood or in later life, did not prohibit their musical achievement. It is less well known that the deaf and hearing impaired can dance, and yet a few examples have brought public attention to this fact. One of the best-known examples is ballerina Nina Falaise, who was deaf from birth and was rejected from London's Royal Ballet only after the discovery of her total deafness, but who went on to dance with Ballet Rambert. Other examples include Broadway dancer Jason McDole (who hears from one ear), members of the China Disabled People's Performing Art Troupe, and Washington, D.C.'s National Deaf Dance Theater.
* Corresponding author. Georgetown University Medical Center, New Research Building, WP-19, 3970 Reservoir Road, N.W., Washington D.C. 20057, USA. Tel.: +1 202 486 8744.
E-mail addresses: [email protected] (J. Phillips-Silver), [email protected] (P. Toiviainen), [email protected] (N. Gosselin), [email protected] (C. Turgeon), [email protected] (F. Lepore), [email protected] (I. Peretz).
These individuals present an apparent paradox: they can dance to music that they have difficulty hearing, or even have never heard at all from birth. Yet the ability to dance to music despite hearing loss has received limited scientific attention. Indications of the key potential mechanisms for perception of the musical beat and rhythm in the deaf and hearing impaired can be found in the cues taught by deaf experts in dance, and in some empirical evidence. Among these are visual, motor imagery, respiratory, tactile and vibrotactile cues, as well as residual auditory cues in some individuals, including, for example, amplification of bass frequencies (e.g., Benari, 1995). There is evidence that the hearing impaired rely on vibrotactile cues when they are available, and that deaf populations show enhanced vibrotactile sensitivity (Levänen and Hamdorf, 2001) and vibrotactile activation of auditory cortex (Auer et al., 2007; Caetano and Jousmäki, 2006; Levänen et al., 1998). Deaf individuals show an advantage over hearing individuals in synchronizing movement with visual timing cues to an isochronous pulse (Iversen et al., 2015). In combination, these cues, when utilized, can result in a fine capacity for rhythmic timing and movement. According to expert deaf dancers and choreographers, the hearing impaired develop an internal sense of timing which can be sustained over the course of a piece, as with hearing musicians and dancers. Nevertheless, in the general population with hearing loss, who do not typically receive training in rhythm and movement, it is not known how these cues are utilized
http://dx.doi.org/10.1016/j.heares.2014.12.007 0378-5955/© 2015 Published by Elsevier B.V.
or even whether the combination of them results in the ability for musical timing and coordinated rhythmic movement, that is, musical entrainment. Thus the question of how deaf and hearing-impaired individuals dance is open to empirical investigation. The question extends to cochlear implant users, who likewise suffer from impaired hearing in music contexts, and who also tend to receive little training in music or musical movement. From the framework of entrainment, the first step towards understanding how those with hearing loss can dance is documenting with objective measures that they do indeed synchronize body movement in time to music with a regular beat, and comparing their performance to that of hearing individuals. The propensity to move in time to rhythmic percussive sounds or chants is manifest from an early age, as seen in children's spontaneous clapping games and nursery rhymes like "Pat-a-Cake", "Miss Mary Mack" and "Ring Around the Rosie", and in their impulsive body movement in response to music (Eerola et al., 2006; Zentner and Eerola, 2012). Infants and children produce bodily movements spontaneously to music, more so than to other complex auditory stimuli such as speech (Zentner and Eerola, 2010). The propensity to move to music is related to the positive affect observed in response to musical activity (Zentner and Eerola, 2010), which may explain why rhythmic movement and drumming in young children is strengthened in a social context and promotes prosocial behavior (Cirelli et al., 2014; Kirschner and Ilari, 2014; Kirschner and Tomasello, 2009, 2010). These communal movement activities share a common root ability: the perception of and synchronization with a musical beat. The musical beat refers to the perceived periodic pulse to which one entrains and synchronizes spontaneous movement, as in tapping or bobbing the head and body.
The abilities that underlie musical beat detection, as well as perception of metrical structure, begin to develop early in life (Winkler et al., 2009; Hannon and Trehub, 2005; Hannon and Johnson, 2005). Infants' rhythm perception is influenced not only by the sounds they hear, but by the concurrent movement that they feel (Phillips-Silver and Trainor, 2005), which likely relies on multiple sensory inputs from auditory, proprioceptive and vestibular systems (Phillips-Silver and Trainor, 2008; Trainor et al., 2009). Among the cues that hearing infants use to represent the beat structure of an unaccented acoustic rhythm stimulus, there is evidence that movement cues, but not visual cues, are necessary for establishing a subjective representation of strong beats (Phillips-Silver and Trainor, 2005). This early bias towards movement cues in rhythm perception points to the question of how movement cues to musical rhythm might be preserved in individuals in whom pitch perception is impaired. For example, the congenital amusic population is characterized by impoverished pitch perception and melody discrimination, yet amusics show an ability to extract the musical beat in order to synchronize dancelike movement to music, as measured by motion capture (Phillips-Silver et al., 2013). While their synchronization ability is not on par with that seen in a control group, they show potential for improvement given an appropriate stimulus and practice (Phillips-Silver et al., 2013). Individuals with severe to profound hearing loss also suffer from impoverished pitch resolution, and furthermore the quality and experience of music is diminished when a CI is introduced. CI devices bypass the outer and the middle ear and directly stimulate the fibers of the auditory nerve, restoring some sensation of auditory perception.
The primary goal of a cochlear implant is to permit speech perception in quiet everyday listening environments, and for the majority of cochlear implant users this goal is achieved (though the ability can be compromised in noisy environments). Cochlear implants generally yield an improvement in speech
perception, the results of which have far exceeded the expectations of early investigators (Holt and Svirsky, 2008; Oh et al., 2003; Peterson et al., 2010). With modern multi-electrode cochlear implants, scores can reach 70–80% for sentence recognition in quiet (Osberger et al., 2000; Garnham et al., 2002). However, auditory discrimination is challenging for cochlear implant (CI) users, which can lead to difficulty with speech perception in noise (Spahr and Dorman, 2004) as well as with music recognition and appreciation (Gfeller et al., 2007; Kong et al., 2004). In hearing individuals, pitch information is faithfully transmitted to the auditory system, but pitch transmission with a CI, based on electrical stimulation, is much less precise. In fact, electrical stimulation is quite good at delivering temporal information, but conveys impoverished spectral information and temporal fine structure that would help to define pitch (Drennan and Rubinstein, 2008) and timbre, with the exception of the temporal envelope, or log attack time (Kong et al., 2011). Cochlear implant users tend not to be as good as normal-hearing listeners at identifying familiar real-world melodies (Gfeller et al., 2002a,b; Vongpaisal et al., 2006; Kong et al., 2004; Drennan and Rubinstein, 2008). Consequently, one of the more important challenges in this field has been to identify ways to improve music perception, and to make music more accessible and enjoyable, for cochlear implant users. As noted above, the electrical stimulation of a CI provides more faithful temporal than spectral information, which favors the perception of rhythm over that of pitch. Recent studies of rhythm perception in CI users indicate that rhythm perception abilities are relatively good, and sometimes as good as in hearing individuals (Drennan and Rubinstein, 2008; Gfeller et al., 1997; Kong et al., 2004; Looi et al., 2008; McDermott, 2004).
Gfeller and colleagues (1997) used a battery called the adapted Primary Measures of Music Audiation (PMMA), which contains a test of rhythmic pattern discrimination, and a task in which listeners must detect a short inter-pulse interval in a six-pulse auditory pattern. Adult CI users scored similarly to normal-hearing participants on the rhythmic pattern discrimination, but were not as good as normal-hearing participants on the six-pulse task.1 Kong et al. (2004) found that tempo discrimination was near normal in cochlear implant users, and that complex rhythm discrimination was as good as in normal-hearing listeners. Schulz and Kerber (1994) tested CI users and a control group on a task of identification of musical rhythm patterns (such as a waltz or a tango), and a test of production in which listeners repeated 3- or 5-beat rhythm sequences by tapping. In both tasks, scores averaged 80% accuracy, but no statistical comparison between the two groups of subjects was provided. Despite indications that temporal processing is relatively spared in cochlear implant users, data are scarce on their perception of and synchronization to the beat in music, that is, their ability for musical entrainment. Yet great promise lies in the potential for rhythmic entrainment to capitalize on preserved abilities to perceive salient auditory beats (i.e., from bass and high frequencies, and increased amplitude at metrically strong beats), and perhaps even more importantly, to tap into somatosensory, motor, vestibular and/or proprioceptive mechanisms. Such mechanisms may either circumvent pitch-processing deficits or potentially even bolster impaired pitch processing. To this end we examine beat finding and bodily synchronization in the context of typical dance music with rich orchestration and melodic pitch variation, but also, critically for our hypothesis, in the context of music which evades the problematic area of impoverished pitch discrimination but which is
1 The PMMA was designed for children, though data on young CI users do not seem to be available.
nonetheless ecologically valid (and one that may even predate pitched music evolutionarily): drum music. In the present study, we provide an assessment of the ability to synchronize full-body motion to music. We measured dance-like movement to a musical excerpt from the popular Latin dance music repertoire, following previous studies by Phillips-Silver and colleagues (2011; 2013), as well as to piano and drum renditions of the same piece, in order to assess the role of interference from melodic pitch. We also tested performance in auditory and visual metronome conditions as a control for auditory-motor synchronization abilities.
2. Participants

2.1. Cochlear implant users

Nine cochlear implant users with progressive hearing loss participated in this study. All were native French speakers living in the province of Quebec. The clinical profile of each CI user is presented in Table 1. All had a unilateral cochlear implant which they had used consistently for a minimum of two years (mean = 8 years). The mean age of the cochlear implant users was 43 years (SD = 17). Hearing impairment started earlier in life for some cochlear implant users than for others (average length of auditory deprivation = 30.7 years, SD = 19.6), leading to different levels of hearing experience prior to their profound deafness (see Table 1). Most of the participants underwent an extensive period of progressive hearing loss leading to their deafness, and all of them suffered from severe to profound bilateral sensorineural hearing loss with poor speech perception (even while using hearing aids), which were the criteria in the province of Quebec for their cochlear implantation surgery. For most participants, the surgery occurred in adulthood. Pure-tone detection thresholds in our CI population were assessed using an adaptive method at 250 Hz, 500 Hz, 1000 Hz, 2000 Hz, and 4000 Hz in free field at a distance of one meter. The group presented detection thresholds that were generally above 40 dB HL for all frequencies tested (Table 1), corresponding to what is generally reported in the literature. Speech recognition was evaluated with a list of 50 phonetically balanced French words (Lebel and Picard, 1995). This was an open-set test (i.e., no cues offered) in which pre-recorded monosyllabic words were presented in free field at a comfortable level of 70 dB SPL, without any visual cues. Participants were required to repeat what they heard; performance varied considerably. Some participants whose progressive hearing loss did not become profound until later in life showed good speech perception abilities, due to their early exposure to oral language. All CI participants used oral language and lip reading as their primary mode of communication. All were native French speakers and had no prior experience in tests of musical abilities. Most (7/9) of the CI group reported dancing for pleasure at least occasionally (Table 1). Some CI participants had musical training (Table 1); for most of them, the training occurred after the onset of a certain degree of hearing loss. None of our CI participants had learning disabilities, vestibular or motor deficits, or any other known neurological or medical conditions.

2.2. Control participants

Nine control subjects matched for gender and age also participated in this study (Table 1). Controls were native French speakers living in the province of Quebec, with no known auditory or neurological impairments. All control subjects reported dancing for pleasure. All CI and control participants gave written informed consent in accordance with the Board of Ethics of the University of Montreal. Recruitment was made possible through collaboration with the Center for Interdisciplinary Research in Rehabilitation of Greater Montreal, the Raymond-Dewar Institute for Rehabilitation Specializing in Deafness and Communication, and the Le Bouclier Center for Rehabilitation of Physical Differences.
3. Method

The method employed was similar to that used by Phillips-Silver and colleagues (2013). Subjects bounced to three versions of the popular Merengue song Suavemente (by Elvis Crespo), chosen for its regular, binary beat structure. The song excerpt was 64 beats long and was played at two tempi: one faster (2.08 Hz, which corresponds to a period of 480 ms or 124 bpm) and one slower (1.93 Hz, which corresponds to a period of 520 ms or 116 bpm). These tempi were chosen because they are distinguishable but fall around the common preferred tempo of adult human motion and dance (Moelants, 2002). Tempi and beats of the audio files were derived with the algorithm of the MIR Toolbox (Burger and Toiviainen, 2013; Lartillot and Toiviainen, 2007). Two "simpler" versions of the Merengue stimulus were created using Sibelius software to transcribe the rhythmic structure of the song in four voices. The first version was meant to have pitch variations, and so it was transcribed onto a piano score in four parts
Table 1
Profile of cochlear implant users.

| Subject | Sex | Age (years) | Age at deafness (years) | Age at implantation (years) | Auditory deprivation (years)a | Years using CI | MPTb thresholds pre-implant (R, L) | Aided MPT thresholds with implantc | Speech recognition (%) | Musical training (years, type of experience) | Dancing for pleasured |
| S1 | F | 30 | 3 | 8 | 5 | 22 | >120, >120 | 40 | 8 | 0 | 1 |
| S2 | F | 36 | Birth | 30 | 30 | 6 | 107, >120 | 33 | 37 | 5 years piano | 3 |
| S3 | F | 63 | 16 | 52 | 36 | 11 | 110, 110 | 27 | 70 | 5 years singing | 2 |
| S4 | F | 45 | 3 | 40 | 37 | 5 | 95, 103 | 12 | 6 | 0 | 0 |
| S5 | F | 22 | 8 | 10 | 2 | 12 | >120, >120 | 27 | 0 | 1 year xylophone | 3 |
| S6 | F | 61 | 7 | 55 | 48 | 6 | 101, >120 | 27 | 70 | 0.5 year guitar | 1 |
| S7 | F | 68 | Birth | 59 | 59 | 9 | 103, 106 | 30 | 58 | 0 | 1 |
| S8 | F | 34 | 12 | 27 | 15 | 7 | 108, 103 | 32 | 82 | 0 | 2 |
| S9 | M | 55 | 8 | 52 | 44 | 3 | 68, 97 | 23 | 74 | 0 | 0 |
| Matched controls (SD) | 8F 1M | 47.6 (18.5) | – | – | – | – | – | – | – | 4 (3.9) | 1.8 (1) |

a Number of years from the beginning of hearing loss until implantation.
b Mean Pure Tone (MPT) detection threshold.
c Thresholds averaged across 500, 1000 and 2000 Hz.
d 0 = never, 1 = rarely, 2 = sometimes, 3 = often, 4 = very often.
(Fig. 1a). This simplified piano version extracted prominent vocal and instrumental lines of the music and transcribed them to five staves, each played in piano timbre: the vocal lines on the 1st and 2nd staves, saxophone on the 3rd staff, alto horn on the 4th staff, and bass guitar on the 5th staff. The second version was created to reduce pitch variations, transcribed onto a score for drum set in four parts, notated on three staves (Fig. 1b). The first voice (1st staff) was played on snare drum, the second and third voices (2nd staff) were played in tenor drum timbres, and the fourth voice (3rd staff) was played on bass drum (stimuli can be heard online at www.brams.umontreal.ca/short/beat/cochlear). Such percussion timbres are called "unpitched" or "untuned", because the sound of the instrument contains complex frequencies that do not have a well-defined fundamental frequency (Herrera et al., 2003). This is in contrast to the piano, a "pitched percussion" instrument whose sounds result in definite pitches. These two instrumental versions were recorded in Sibelius at two tempi: one faster (2 Hz) and one slower (1.86 Hz). All music stimuli were counterbalanced for condition (Merengue, piano and drum) and tempo (i.e., faster and slower) to minimize order effects, as follows: Merengue (fast), piano (slow), drum (fast), drum (slow), piano (fast), Merengue (slow). In two subsequent non-music control conditions, subjects followed either an auditory or a visual metronome cue at approximately the same tempi (faster: 2.07 Hz; slower: 1.86 Hz) as the music, as follows: auditory metronome (fast), auditory metronome (slow), visual metronome (fast), visual metronome (slow). The auditory metronome control, provided by an electronic online metronome, ensured that subjects could produce a bouncing motion to the tempo of a single isochronous beat.
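The three equivalent ways tempo is reported in this section (beat frequency in Hz, inter-beat period, and beats per minute) are related by simple arithmetic; the figures quoted in the text are approximations of the exact conversions. As an illustrative aside (not part of the authors' analysis pipeline):

```python
def period_ms(freq_hz):
    """Inter-beat interval in milliseconds for a beat frequency in Hz."""
    return 1000.0 / freq_hz

def bpm(freq_hz):
    """Beats per minute for a beat frequency in Hz."""
    return 60.0 * freq_hz

def duration_s(n_beats, freq_hz):
    """Duration in seconds of an excerpt of n_beats at a given beat frequency."""
    return n_beats / freq_hz

# the two Merengue tempi; the text's 480 ms / 124 bpm and 520 ms / 116 bpm
# figures are approximations of these exact values
for f in (2.08, 1.93):
    print(f"{f} Hz -> {period_ms(f):.1f} ms, {bpm(f):.1f} bpm, "
          f"64 beats last {duration_s(64, f):.1f} s")
```

At 1.93 Hz, for example, the 64-beat excerpt lasts about 33 s, which is why the synchronization analyses below can use a 5–30 s window on every trial.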
In the visual metronome condition, the experimenter bounced to the sound of the auditory metronome delivered over headphones, and subjects observed and followed, either with the implant turned off or, in the case of controls, while wearing noise-cancellation headphones to eliminate any ambient sound. The entire duration of testing was approximately 30 minutes. Subjects were tested in a large sound-attenuated studio, with the experimenter present but facing away from the subject (except in the visual metronome condition). The participants listened to the auditory stimuli presented in free field from Genelec speakers (at approximately 70 dB, though this was adjusted slightly on an individual basis depending on each subject's comfort level). In all
conditions, the subject wore the motion capture device described below. The experimenter wore a motion capture device in the visual metronome condition so that her movement could be measured as well.

4. Results

Bouncing motion was captured with the accelerometer contained in the remote control of the Nintendo Wii, which was strapped to the trunk of the body. This device measured acceleration of body movement (bouncing) with a temporal resolution of 100 frames per second (10 ms), from which the beat-by-beat period of the vertical movement was computed. The zero-crossings of the vertical acceleration marked the bounces (Toiviainen et al., 2010). We calculated the number of bounces for each subject on each 64-beat auditory stimulus. On average, the CI subjects produced 66 bounces for the Merengue, 64 for the piano, 66 for the drum, 62 for the auditory metronome, and 63 for the visual metronome. The matched controls produced on average 64 bounces for the Merengue, 63 for the piano, 63 for the drum, 60 for the auditory metronome, and 62 for the visual.

5. Measuring synchronization

5.1. Period-locking: proportion of synchronized power

We assessed synchronization in dancelike movement by measuring the tempo (rate) and phase of movements with respect to the musical beat. In accordance with a prior study (Phillips-Silver et al., 2011), data were submitted to a Fourier analysis to measure the overall proportion of power at the musical beat period, as well as at related frequencies (i.e., 0.5 and 2 times the musical beat level), within the range of 0–5 Hz. Fig. 2 shows the averaged power spectra for CI subjects and matched controls at two tempi in five conditions: Merengue, piano, drum, auditory metronome, and visual metronome. For each individual subject we calculated the root-mean-square amplitude of acceleration for all stimuli.
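The bounce extraction described in Section 4 (vertical acceleration sampled at 100 Hz, with zero-crossings marking the bounces) can be sketched in a few lines. This is an illustrative Python reconstruction, not the authors' code, and the synthetic sinusoidal signal stands in for real accelerometer data:

```python
import numpy as np

def count_bounces(vertical_accel):
    """Count bounces as downward zero-crossings of the vertical
    acceleration signal (one crossing per bounce cycle)."""
    a = np.asarray(vertical_accel, dtype=float)
    # a downward crossing: this sample positive, the next at or below zero
    return int(np.sum((a[:-1] > 0) & (a[1:] <= 0)))

# synthetic stand-in: 32 s of sinusoidal bouncing at 2 Hz, sampled at
# 100 frames per second as with the Wii remote accelerometer
fs = 100.0
t = np.arange(0, 32, 1 / fs)
accel = np.sin(2 * np.pi * 2.0 * t)
print(count_bounces(accel))  # 64: one bounce per beat of a 64-beat trial
```

A subject locked to the beat should thus produce roughly one bounce per stimulus beat, which is what the counts reported above show.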
The CI subjects showed a slightly lower amplitude of acceleration than controls, but differences in amplitude between groups were not significant. The CI subjects showed notable power at the one-beat level, and peaks at the two-beat level, for the metronome, especially the
Fig. 1. Music notation for the a) piano version and b) drum version of the Merengue stimulus.
Fig. 2. Power spectra for bouncing performance of CI subjects (red) and matched controls (black) as a function of stimulus. Vertical lines (blue) represent stimulus components at the half-beat, beat (quarter-note level), and twice-beat frequencies (in Hz, on the x-axis). The unit for acceleration (y-axis) is arbitrary. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)
visual but also the auditory, and for the drum stimulus (Fig. 2). In contrast, CIs did not show notable power at the musical beat period or related frequencies on any trial of the Merengue music or the piano music. Like CI subjects, the matched control subjects showed power at the one-beat level of the metronome (auditory and visual) and drum conditions, though with smaller peaks at the two-beat level (Fig. 2). Unlike CIs, however, the matched controls showed notable power at the beat frequency of the Merengue music and the piano condition, and peaks at the two-beat level. Thus, the bouncing movement of controls revealed synchronized power for the Merengue, piano and drum music, while the bouncing of CIs revealed synchronized power only for the drum music. For subsequent analyses, the strength of period-locking was operationalized as the proportion of power in the Fourier spectra at the 0.5, 1, and 2 beat levels, with a 5% tolerance. Hereafter this is referred to as the proportion of synchronized power. A Friedman test was performed on the proportion of synchronized power, with hearing status as a between-subject factor, and tempo and stimulus type as within-subject factors. A main effect was found for hearing status (χ²(1) = 34.78, p < 0.0001), with the controls showing a higher proportion of synchronized power (i.e., better synchronization) overall, and for stimulus type (χ²(4) = 39.81, p < 0.0001). No main effect of tempo was found, so all subsequent analyses are averaged across the faster and slower tempi.
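The proportion-of-synchronized-power measure can be sketched as follows. This is an illustrative Python reconstruction under stated assumptions, not the authors' MATLAB/MIR Toolbox code; in particular, the 5% tolerance is interpreted here as a band of ±5% around each target frequency:

```python
import numpy as np

def synchronized_power(accel, fs, beat_hz, tol=0.05, fmax=5.0):
    """Proportion of spectral power of the movement signal lying within
    +/- tol of the 0.5x, 1x, and 2x beat frequencies, relative to the
    total power in the 0..fmax Hz range."""
    a = np.asarray(accel, dtype=float)
    a = a - a.mean()                               # remove the DC offset
    power = np.abs(np.fft.rfft(a)) ** 2
    freqs = np.fft.rfftfreq(len(a), d=1.0 / fs)
    in_range = freqs <= fmax
    synced = np.zeros(freqs.shape, dtype=bool)
    for mult in (0.5, 1.0, 2.0):
        f0 = mult * beat_hz
        synced |= np.abs(freqs - f0) <= tol * f0
    return power[in_range & synced].sum() / power[in_range].sum()

# a perfectly period-locked bouncer gives a proportion near 1.0, while
# unstructured movement spreads power across the whole 0-5 Hz range
fs = 100.0
t = np.arange(0, 30, 1 / fs)
locked = np.sin(2 * np.pi * 2.0 * t)
print(synchronized_power(locked, fs, beat_hz=2.0))
```

Restricting the denominator to 0–5 Hz keeps the measure insensitive to high-frequency accelerometer noise, which is one plausible reason for the frequency cutoff in the analysis.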
Post hoc comparisons were performed using Friedman tests, with corrections for multiple comparisons by the Benjamini-Hochberg method at a false discovery rate of 0.05. First, post hoc comparisons were performed between participant groups for each stimulus type separately. The control group showed superior period-locking for all stimulus types, with significant differences only in the Merengue and auditory metronome conditions (Fig. 3). Next, post hoc comparisons between stimulus types were carried out for each participant group separately (Fig. 4). For the CI group, median values of synchronized power for the Merengue, piano, drum, auditory metronome, and visual metronome stimuli were 0.660, 0.566, 0.692, 0.788, and 0.866, respectively. Significant differences were found between the Merengue and auditory metronome (χ²(1) = 11.27, adjusted p = 0.002), Merengue and visual metronome (χ²(1) = 11.27, adjusted p = 0.002), piano and auditory metronome (χ²(1) = 15.00, adjusted p = 0.001), piano and visual metronome (χ²(1) = 11.27, adjusted p = 0.002), and drum and visual metronome (χ²(1) = 5.40, adjusted p = 0.040). For the control group, median values of synchronized power for the Merengue, piano, drum, auditory metronome, and visual metronome stimuli were 0.830, 0.858, 0.819, 0.918, and 0.890, respectively. Significant differences were found between Merengue and auditory metronome (χ²(1) = 11.27, adjusted p = 0.002), and piano and auditory metronome (χ²(1) = 13.07, adjusted p = 0.003).
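The Benjamini-Hochberg step-up procedure used for these corrections is simple to state: sort the m p-values, find the largest rank k with p(k) ≤ (k/m)·FDR, and reject the k smallest. A minimal sketch (illustrative only; the adjusted p-values in the text come from the authors' own analysis):

```python
def benjamini_hochberg(pvals, fdr=0.05):
    """Mark which p-values are significant under Benjamini-Hochberg
    control of the false discovery rate: find the largest rank k with
    p_(k) <= k/m * fdr and reject the k smallest p-values."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * fdr:
            k = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        reject[i] = rank <= k
    return reject

print(benjamini_hochberg([0.001, 0.02, 0.04, 0.3]))
# [True, True, False, False]: 0.04 fails its threshold of 3/4 * 0.05
```

Unlike a Bonferroni correction, this procedure controls the expected proportion of false discoveries rather than the family-wise error rate, so it retains more power across the many pairwise stimulus comparisons reported here.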
Fig. 3. Boxplots of the proportion of significant power (i.e., period-locking) at the beat and twice-beat frequency for cochlear implant (CI) and control (CTL) groups, in each stimulus condition. The significance values are corrected for multiple comparisons using a false discovery rate of 0.05.
5.2. Phase-locking: circular variances

We next measured whether the body movements were phase-locked to the auditory beat in each stimulus. As in Phillips-Silver et al. (2011), the vertical component of acceleration of movement was bandpass filtered using a zero-phase FFT filter with a Gaussian frequency response, with a center frequency equal to the beat frequency of each stimulus and a bandwidth of 20% of the respective center frequency. A Hilbert transform was used to estimate the instantaneous phase of the filtered signal, and the instantaneous phase was then sampled at the time points corresponding to every beat in the stimulus within the interval of 5–30 s from the start of the stimulus, yielding discrete phase values for each subject. The circular variance (a measure of phase-locking, with higher circular variance indicating worse performance) was then calculated from these phase values. A Friedman test was performed on the circular variances with hearing status as a between-subject factor, and tempo and stimulus type as within-subject factors. Main effects were found for hearing status (χ²(1) = 48.4, p < 0.0001), with the controls showing lower circular variance (i.e., better phase-locking) overall, and for stimulus type (χ²(1) = 16.9, p = 0.002). No main effect was found for tempo. Post hoc comparisons were performed using Friedman tests, with corrections for multiple comparisons by the Benjamini-Hochberg method at a false discovery rate of 0.05. First, post hoc comparisons were performed between participant groups for each stimulus type separately (Fig. 5).
The control group showed superior phase-locking for all stimulus types, with significant differences in the Merengue (χ²(1) = 13.07, adjusted p = 0.001), piano (χ²(1) = 9.60, adjusted p = 0.002), drum (χ²(1) = 11.27, adjusted p = 0.002), and auditory metronome (χ²(1) = 9.60, adjusted p = 0.002) conditions, but not the visual metronome condition (χ²(1) = 1.07, adjusted p = 0.302).
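For concreteness, the Benjamini–Hochberg step-up correction applied to these post hoc p-values can be sketched as below. This is a generic illustration of the procedure (the input p-values in the usage example are made up), not the authors' code.

```python
import numpy as np

def benjamini_hochberg(pvals, fdr=0.05):
    """Benjamini-Hochberg step-up procedure. Returns a boolean 'reject'
    mask and the BH-adjusted p-values, at the given false discovery rate."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    ranked = p[order]
    # adjusted p = p_(i) * m / i, made monotone from the largest rank down
    adj = np.minimum.accumulate((ranked * m / np.arange(1, m + 1))[::-1])[::-1]
    adj = np.minimum(adj, 1.0)
    reject = np.empty(m, dtype=bool)
    adjusted = np.empty(m)
    reject[order] = adj <= fdr
    adjusted[order] = adj
    return reject, adjusted
```

For example, `benjamini_hochberg([0.01, 0.5, 0.02, 0.03])` rejects the first, third, and fourth hypotheses at a false discovery rate of 0.05.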
Fig. 4. Boxplots of the proportion of synchronized power per stimulus type, for cochlear implant (CI) and control (CTL) groups. Within each group the pairwise significant differences between stimulus conditions (corrected for multiple comparisons using a false discovery rate of 0.05) are indicated by brackets.
Next, post hoc comparisons between stimulus types were carried out for each participant group separately (Fig. 6). For the CI group, the median circular variances for the Merengue, piano, drum, auditory metronome, and visual metronome stimulus types were 0.369, 0.512, 0.179, 0.140, and 0.134, respectively. Significant differences were found between Merengue and drum (χ²(1) = 5.40, adjusted p = 0.040), Merengue and auditory metronome (χ²(1) = 11.27, adjusted p = 0.008), Merengue and visual metronome (χ²(1) = 8.07, adjusted p = 0.011), piano and auditory metronome (χ²(1) = 8.07, adjusted p = 0.011), and piano and visual metronome (χ²(1) = 8.07, adjusted p = 0.011). While the difference between drum and piano did not reach significance (χ²(1) = 4.26, adjusted p = 0.065), the result that phase-locking to the drum music was significantly better than to the Merengue music (and not significantly different from either the auditory or visual metronomes) is notable and most illustrative of the predictions of the present study.
For the control subjects, the median circular variances for the Merengue, piano, drum, auditory metronome, and visual metronome stimulus types were 0.094, 0.095, 0.085, 0.079, and 0.106, respectively. No significant differences were found between stimulus types. The variance of the circular variances showed that the CI phase-locking data were more variable than those of the matched controls; a similar pattern of increased variability has been observed in tests of temporal rhythm perception and production in this population (Gfeller et al., 2008).
While the data so far confirm that CI subjects' synchronization to music is inferior to that of controls, they do not answer the question: do CI subjects show synchronization ability at all? To address this question, we move from comparing synchronization performance between the two groups, and between stimulus conditions, to comparing each group's performance against chance. If CI subjects have the ability for entrainment, then their bouncing movements should show phase-locking to the musical tempo; if they do not, their movements should be unrelated to the musical tempo. To this end, we tested the degree of phase-locking against chance level.
Fig. 5. Boxplots of the circular variance (i.e., phase-locking) for cochlear implant (CI) and control (CTL) groups, in each stimulus condition. The significance values are corrected for multiple comparisons using a false discovery rate of 0.05.
Fig. 6. Boxplots of the circular variances per stimulus type, for cochlear implant (CI) and control (CTL) groups. Within each group the pairwise significant differences between stimulus conditions (corrected for multiple comparisons using a false discovery rate of 0.05) are indicated by brackets.
Please cite this article in press as: Phillips-Silver, J., et al., Cochlear implant users move in time to the beat of drum music, Hearing Research (2015), http://dx.doi.org/10.1016/j.heares.2014.12.007
5.3. Measuring CI subjects' performance against chance
We used the phase values obtained from the Hilbert transform on the full motion capture recordings (5–30 s) to perform a non-directional Rayleigh test (Z-statistic) against a uniform distribution for each individual recording. This measure required a high degree of internal consistency of phase angle within each individual subject. The majority of subjects in both the CI and control groups showed significant phase-locking in each condition (see Table 2). The best phase-locking by CI subjects was in the visual metronome condition, followed by the auditory metronome and then the drum music condition. Most importantly, as indicated by the Friedman tests (Fig. 6), their phase-locking in the drum music condition was significantly greater than in the Merengue condition, but not significantly different from either the auditory or the visual metronome condition. In sum, CI subjects phase-locked to the drum music stimulus as well as they did to the metronome stimuli. At a minimum we can conclude from these data that the tempo of CI subjects' bouncing was driven by the beat period of the drum music stimuli, and not due to chance.

Table 2
Results of CI and control data tested against chance levels.

Group    N  Tempo   Merengue  Piano  Drum  Auditory metronome  Visual metronome
CI       9  Faster  5         6      7     9                   9
CI       9  Slower  8         5      8     9                   9
Control  9  Faster  9         9      9     9                   7
Control  9  Slower  9         9      9     9                   9

Notes: Calculated with Rayleigh tests. Entries show the number of subjects significantly synchronized (p = 0.05) in each condition. 'Faster' tempo refers to 2.08 Hz for Merengue, 2 Hz for piano and drum, and 2.07 Hz for the visual and auditory metronome stimuli. 'Slower' tempo refers to 1.93 Hz for Merengue, and 1.86 Hz for the piano, drum, auditory and visual metronome stimuli.

6. Discussion

The present findings show for the first time that a diverse group of cochlear implant users is able to synchronize dance-like body
movement to drum music, even as well as they entrain to visual and auditory metronomes. In comparison with a group of matched controls, however, the synchronization performance of CI users was worse across all conditions. In sum, CI users are able to move in time to the beat of music, if not as well as normal hearing controls. In the context of music, the drum condition elicited the best performance from the CI group. Since the drum version of the music stimulus was less complex than the Merengue music stimulus (i.e., it was a simplified transcription of the piece into just a few parts), we used the piano version as a comparison that is equivalent to it in rhythmic complexity. The fact that the phase-locking performance of the CI group improved for the drum music and worsened for the piano music suggests that the advantage of the drum music was not merely the reduction in complexity, but rather might be due in part to the absence of pitch variations. This is consistent with our prediction, based on previous evidence, that pitch variations would interfere with beat finding and synchronization. Other features could also contribute to the difference in CI users' entrainment between the drum and pitched music conditions, such as temporal envelope, spectral envelope, CI users' perception of attack time, and auditory stream segregation. These features will need to be studied systematically in future research, in particular with the increasing understanding of the contribution of the technical parameters of the cochlear implant device to music perception, including especially temporal fine structure (e.g., Smith et al., 2002; Limb and Roy, 2013). Controls' performance was consistently high, and their phase-locking was not significantly different between music and metronome conditions.
This contrasts with previous findings in the general population that synchronization can be easier with an auditory metronome than with this music (Phillips-Silver et al., 2011). More meaningful comparisons will come from studies that test bodily entrainment to real music stimuli reflecting a larger range of rhythmic complexity and demands on movement skill.
This study is the first to document the ability to find the beat and synchronize dance-like movement to drum music in cochlear implant users. The finding that our cochlear implant participants performed more poorly with music containing melodic pitch variations (and poorest with the music rendered by piano only, without any other accents on strong beats) is consistent with previous research showing that CI users are worse than hearing controls at music recognition and perception when pitch interferes with rhythmic cue perception (e.g., Kong et al., 2004). Yet the results illuminate several reasons to further pursue the avenue of dancing to percussive music in this population, not only in the context of empirical research, but also in the context of music training and rehabilitation, as well as everyday life experiences.
First, while the performance of this group of cochlear implant users did not reach the level of their hearing controls, they did
show significant beat finding and synchronization; in other words, they can feel the beat and entrain to it. This is a point not to be underestimated when considering the role of music in quality of life for cochlear implant users, as some have suggested that music perception is the second major complaint among CI users, after speech perception. In a 2000 study (Gfeller et al., 2000), 83% of adult CI users reported less enjoyment of music listening, possibly because familiar music no longer sounded the same. In a more recent study (Looi et al., 2007), adult CI users reported a quality of listening experience similar to that of hearing aid users, which is unsatisfactory especially considering that the air conduction of hearing aids can result in better perception of low frequencies. This might explain why some individuals report their most favorable music listening experiences when using both a CI and a hearing aid, or a hybrid implant (Gfeller et al., 2007). Yet there seems to be an untapped potential for individuals in this very population to respond bodily to the beat of dance music, particularly if given exposure to appropriate music forms, perhaps especially drum music. Future studies should aim to provide converging neurophysiological evidence of embodied beat perception and entrainment in CI users, as seen in the neuronal entrainment of normal hearing subjects (Nozaradan et al., 2011a,b; Nozaradan, Peretz and Mouraux, 2012). In a recent review, Gfeller's team (2010) noted that CIs tend to be deficient in transmitting certain structural features of music, namely pitch, melody and timbre, but argued that music perception and appraisal can improve with experience.
Rhythm and beat are conspicuously absent from this perspective, and yet they seem to be among the most important structural features of music. Consider that the most popular musical activities besides singing, ones that require no formal music training or expertise and do not rely on fine pitch discrimination, are moving to the beat of the music and the crowd in dance clubs, and participating in informal drum and dance circles. Driscoll and colleagues (2009) suggested that training programs that require identification rather than just discrimination have more potential to engage the listener and improve performance over time. In the research domain, understanding the effects of cochlear implants on musical behavior will benefit from studies focusing on those features of music that contribute to the efficacy and salience of rhythm and beat for CI users as well as for the normal hearing population, including gaps within impulse trains, envelope, and attack and decay characteristics (Limb and Roy, 2013). In addition, studies of more typical CI populations (i.e., those who are congenitally deaf and implanted early in life) will help refine the parameters for music training programs specific to their needs. Since for normal hearing subjects even short-term music training can improve rhythmic synchronization and induce cortical plasticity, with benefits not only for musical skills but for speech perception, spoken language, and reading as well (e.g., Bailey and Penhune, 2013; Kokal et al., 2011; Lappe et al., 2011; Moritz et al., 2013; Strait et al., 2011, 2013; Tierney and Kraus, 2015; Tierney et al., 2013), we speculate that for individuals with cochlear implants, musical exposure and training should support improvement in similar areas, especially when tailored to their needs.
Of particular interest to this population, music training benefits the perception of speech in noise (Parbery-Clark et al., 2009a; Strait et al., 2011; Zendel and Alain, 2012). Thus the prospect of using percussion music to enhance this effect has potential benefits for their auditory longevity, and seems promising in light of the current study. For this population, music training that centers on the musical scale (such as traditional piano lessons) might not be optimal for developing beat perception and synchronization. Some efforts to train musical skills in early childhood very soon after implantation
suggest promising results, most notably in the domain of rhythm, and should be replicated in a controlled study (Kosaner et al., 2012). Extending from the present study, a possibility to consider for music in early childhood and in music education is the inclusion of some exposure to, or training on, unpitched percussive instruments, as in African and Afro-Latino drumming traditions, or the practice of dance to percussion-based music (e.g., stepping, drill teams, and some forms of hip hop, tap dance, belly dance, and flamenco). Such musical experiences could enable cochlear implant users to practice the skill of beat finding without regard to pitch, and to hone the naturally associated skill of synchronizing body movement with the music and with others. This can certainly be done in combination with pitch training, which has been shown to benefit cochlear implant users in their music listening, with benefits possibly even extending to language perception (Kosaner et al., 2012; Gfeller et al., 2000, 2003; Moritz et al., 2013). We speculate that pitch training should also lead to improved recognition of emotion in music, which appears to some degree in children with CIs (Volkova et al., 2013), and even that musical training could lead to improved decoding of emotion in vocal expressions, as seen at a subcortical level in the hearing population (Strait et al., 2009). We cannot, however, overstate the fact that many musical traditions do not center on melodic and harmonic structure, yet offer not only a rich, multimodal music experience but also a highly sophisticated rhythmic structure and bodily skill set. Such skills include complex timing of movement; sensitivity to nuances of communication and synchronization, such as through gesture, breath and joint action in instrument playing; and haptic perception, leading, following and improvisation in dance.
Musical practices that center on groove (Janata et al., 2012) celebrate the tight link between musical movement and positive affect. Rhythm and beat alone, perceived through touch, vision, and any residual hearing, can provide a sense of individual musical pleasure, communal bonding, and a high degree of musical skill. In the words of Dame Evelyn Glennie, hearing is "basically a specialized form of touch" (Glennie, 1993). Pedagogical approaches that are amenable to a focus on rhythm and the body include the highly regarded techniques of Émile Jaques-Dalcroze (Eurhythmics) and of Carl Orff (Schulwerk; Salmon, 2010). Such techniques are used in early childhood, which is ideal for the large majority of congenitally deaf individuals implanted early in life, and as a part of music training and appreciation throughout the lifespan.

Acknowledgments
We wish to acknowledge the participants with cochlear implants for their time and dedication to the project. The work was supported by grants from the Fonds de Recherche en Santé du Québec to JPS, from the Academy of Finland to PT, from the Institute of Cognitive Science Research to FL, from the Canadian Institutes of Health Research to FL and IP, and from the Natural Sciences and Engineering Research Council of Canada, and a Canada Research Chair to IP.
References

Auer, E.T., Bernstein, L.E., Sungkarat, W., Singh, M., 2007. Vibrotactile stimulation of the auditory cortices in deaf versus hearing adults. Neuroreport 18 (7), 645–648.
Bailey, J.A., Penhune, V.B., 2013. The relationship between the age of onset of musical training and rhythm synchronization performance: validation of sensitive period effects. Front. Neurosci. 7. http://dx.doi.org/10.3389/fnins.2013.00227.
Benari, N., 1995. Inner Rhythm – Dance Training for the Deaf. Gordon and Breach Publishing Group.
Brown, S., 2003. Biomusicology, and three biological paradoxes about music. Bull. Psychol. Arts 4, 15–17.
Burger, B., Toiviainen, P., 2013. MoCap toolbox – a Matlab toolbox for computational analysis of movement data. In: Bresin, R. (Ed.), Proceedings of the 10th Sound and Music Computing Conference. KTH Royal Institute of Technology, Stockholm, Sweden.
Caetano, G., Jousmäki, V., 2006. Evidence of vibrotactile input to human auditory cortex. Neuroimage 29 (1), 15–28.
Cirelli, L.K., Einarson, K.M., Trainor, L.J., 2014. Interpersonal synchrony increases prosocial behavior in infants. Dev. Sci. http://dx.doi.org/10.1111/desc.12193.
Darwin, C., 1871. The Descent of Man and Selection in Relation to Sex. John Murray, London.
Drennan, W.R., Rubinstein, J.T., 2008. Music perception in cochlear implant users and its relationship with psychophysical capabilities. J. Rehabil. Res. Dev. 45, 779–789.
Eerola, T., Luck, G., Toiviainen, P., 2006. An investigation of preschoolers' corporeal synchronization with music. In: Baroni, M., Addessi, A.R., Caterina, R., Costa, M. (Eds.), Proceedings of the 9th International Conference on Music Perception and Cognition. The Society for Music Perception and Cognition and European Society for the Cognitive Sciences of Music, Bologna, pp. 472–476.
Freeman, M., 2002. The language of dance. Odyssey. Spring/Summer 2002.
Garnham, C., O'Driscoll, M., Ramsden, R., Saeed, S., 2002. Speech understanding in noise with a Med-El COMBI 40+ cochlear implant using reduced channel sets. Ear Hear. 23, 540–552.
Gfeller, K., Lansing, C.R., 1991. Melodic, rhythmic, and timbral perception of adult cochlear implant users. J. Speech Hear. Res. 34, 916–920.
Gfeller, K., Woodworth, G., Robin, D.A., Witt, S., Knutson, J.F., 1997. Perception of rhythmic and sequential pitch pattern by normally hearing and adult cochlear implant users. Ear Hear. 18, 252–260.
Gfeller, K., Turner, C., Mehr, M., et al., 2002a. Recognition of familiar melodies by adult cochlear implant recipients and normal-hearing adults. Cochlear Implant. Int. 3, 29–53.
Gfeller, K., Witt, S., Woodworth, G., et al., 2002b. Effects of frequency, instrumental family, and cochlear implant type on timbre recognition and appraisal. Ann. Otol. Rhinol. Laryngol. 111, 349–356.
Gfeller, K., Christ, A., Knutson, J., et al., 2003. The effects of familiarity and complexity on appraisal of complex songs by cochlear implant recipients and normal hearing adults. J. Music Ther. 40, 78–112.
Gfeller, K., Turner, C., Oleson, J., Zhang, X., Gantz, B., Froman, R., et al., 2007. Accuracy of cochlear implant recipients on pitch perception, melody recognition, and speech reception in noise. Ear Hear. 28, 412–423.
Gfeller, K., Oleson, J., Knutson, J.F., et al., 2008. Multivariate predictors of music perception and appraisal by adult cochlear implant users. J. Am. Acad. Audiol. 19 (2), 120–134.
Glennie, E., 1993. Hearing Essay. http://www.evelyn.co.uk/literature.html.
Hannon, E.E., Johnson, S.P., 2005. Infants use meter to categorize rhythms and melodies: implications for musical structure learning. Cogn. Psychol. 50, 354–377.
Hannon, E.E., Trehub, S.E., 2005. Tuning in to musical rhythms: infants learn more readily than adults. Proc. Natl. Acad. Sci. U. S. A. 102 (35), 12639–12643.
Hilbok, B., 1981. Silent Dancer. Simon & Schuster, New York.
Holt, R.F., Svirsky, M.S., 2008. An exploratory look at pediatric cochlear implantation: is earlier always best? Ear Hear. 29, 492–511.
Hottendorf, D., 1989. Mainstreaming deaf and hearing children in dance classes. J. Phys. Educ. Recreat. Danc. November/December, 1989.
Iversen, J.R., Patel, A.D., Nicodemus, B., Emmorey, K., 2015. Synchronization to auditory and visual rhythms in hearing and deaf individuals. Cognition 134, 232–244.
Janata, P., Tomic, S.T., Haberman, J., 2012. Sensorimotor coupling in music and the psychology of the groove. J. Exp. Psychol. Gen. 141 (1), 54–75.
Kirschner, S., Ilari, B., 2014. Joint drumming in Brazilian and German preschool children: cultural differences in rhythmic entrainment, but no prosocial effects. J. Cross-Cult. Psychol. 45, 137–166.
Kirschner, S., Tomasello, M., 2009. Joint drumming: social context facilitates synchronization in preschool children. J. Exp. Child Psychol. 102, 299–314.
Kirschner, S., Tomasello, M., 2010. Joint music making promotes prosocial behavior in 4-year-old children. Evol. Hum. Behav. 31 (5), 354–364.
Kokal, I., Engel, A., Kirschner, S., Keysers, C., 2011. Synchronized drumming enhances activity in the caudate and facilitates prosocial commitment – if the rhythm comes easily. PLoS ONE 6 (11), e27272. http://dx.doi.org/10.1371/journal.pone.0027272.
Kong, Y.-Y., Cruz, R., Jones, J.A., Zeng, F.-G., 2004. Music perception with temporal cues in acoustic and electric hearing. Ear Hear. 25, 173–185.
Kong, Y.-Y., Mullangi, A., Marozeau, J., Epstein, M., 2011. Temporal and spectral cues for musical timbre perception in electrical hearing. J. Speech Lang. Hear. Res. 54 (3), 981–994.
Kosaner, J., Kilinc, A., Deniz, M., 2012. Developing a music programme for preschool children with cochlear implants. Cochlear Implant. Int. 1–11.
Lappe, C., Trainor, L.J., Herholz, S., Pantev, C., 2011. Cortical plasticity induced by short-term multimodal musical rhythm training. PLoS ONE 6 (6), e21493. http://dx.doi.org/10.1371/journal.pone.0021493.
Lartillot, O., Toiviainen, P., 2007. MIR in Matlab (II): a toolbox for musical feature extraction from audio. In: Paper Presented at the 8th International Conference on Music Information Retrieval, Vienna.
Leal, M.C., Shin, Y.J., Laborde, M.-L., et al., 2003. Music perception in adult cochlear implant recipients. Acta Otolaryngol. 123, 826–835.
Levänen, S., Hamdorf, D., 2001. Feeling vibrations: enhanced tactile sensitivity in congenitally deaf humans. Neurosci. Lett. 301 (1), 75–77.
Levänen, S., Jousmäki, V., Hari, R., 1998. Vibration-induced auditory cortex activation in a congenitally deaf adult. Curr. Biol. 8, 869–872.
Limb, C.J., Roy, A.T., 2013. Technological, biological and acoustical constraints to music perception in cochlear implant users. Hear. Res. 308, 13–26.
Madison, G., 2006. Experiencing groove induced by music: consistency and phenomenology. Music Percept. 24 (2), 201–208.
Madison, G., Gouyon, F., Ullén, F., Hörnström, K., 2011. Modeling the tendency for music to induce movement in humans: first correlations with low-level audio descriptors across music genres. J. Exp. Psychol. Hum. Percept. Perform. 37 (5), 1578–1594.
McCord, K., Fitzgerald, M., 2006. Music Educ. J. 92 (4), 46–52.
McDermott, H.J., 2004. Music perception with cochlear implants: a review. Trends Amplif. 8, 49–82.
McNeill, W.H., 1995. Keeping Together in Time: Dance and Drill in Human History. Harvard University Press, Cambridge, MA.
Miller, G.F., 2000. Evolution of music through sexual selection. In: Wallin, N.L., Merker, B., Brown, S. (Eds.), The Origins of Music. The MIT Press, Cambridge, MA, pp. 329–360.
Moelants, D., 2002. Preferred tempo reconsidered. In: Stevens, C., Burnham, D., McPherson, G., Schubert, E., Renwick, J. (Eds.), Proceedings of the 7th International Conference on Music Perception and Cognition. Causal Productions, Adelaide, pp. 580–583.
Moritz, C., Yampolsky, S., Papadelis, G., Thompson, J., Wolf, M., 2013. Links between early rhythm skills, music training, and phonological awareness. Read. Writ. 26, 739–769.
Nketia, J.H.K., 1974. The Music of Africa. W. W. Norton & Co, New York.
Nozaradan, S., Peretz, I., Missal, M., Mouraux, A., 2011a. Tagging the neuronal entrainment to beat and meter. J. Neurosci. 31 (28), 10234–10240.
Nozaradan, S., Peretz, I., Mouraux, A., 2011b. Selective neuronal entrainment to the beat and meter embedded in a musical rhythm. J. Neurosci. 32 (49), 17572–17581.
Oh, S.H., Kim, C.S., Kang, E.J., Lee, D.S., Lee, H.J., Chang, S.O., et al., 2003. Speech perception after cochlear implantation over a 4-year time period. Acta Otolaryngol. 123, 148–153.
Osberger, M.J., Fisher, L., Kalberer, A., 2000. Speech perception results in adults implanted with the CLARION multi-strategy cochlear implants. Adv. Otorhinolaryngol. 57, 421–424.
Patel, A.D., 2011. Why would musical training benefit the neural encoding of speech? The OPERA hypothesis. Front. Psychol. 2, 142. http://dx.doi.org/10.3389/fpsyg.2011.00142.
Peterson, N.R., Pisoni, D.B., Miyamoto, R.T., 2010. Cochlear implant and spoken language processing abilities: review and assessment of the literature. Restor. Neurol. Neurosci. 28, 237–250.
Phillips-Silver, J., Trainor, L.J., 2005. Feeling the beat: movement influences infants' rhythm perception. Science 308, 1430.
Phillips-Silver, J., Trainor, L.J., 2007. Hearing what the body feels: auditory encoding of rhythmic movement. Cognition 105, 533–546.
Phillips-Silver, J., Trainor, L.J., 2008. Vestibular influence on auditory metrical interpretation. Brain Cogn. 67 (1), 94–102.
Phillips-Silver, J., Toiviainen, P., Gosselin, N., Piché, O., Nozaradan, S., Palmer, C., Peretz, I., 2011. Born to dance but beat deaf: a new form of congenital amusia. Neuropsychologia 49, 961–969.
Phillips-Silver, J., Toiviainen, P., Gosselin, N., Peretz, I., 2013. Amusic does not mean unmusical: beat perception and synchronization ability despite pitch deafness. Cogn. Neuropsychol. 30 (5), 311–331.
Salmon, S., 2010. Listening – feeling – playing music and movement for children with hearing loss [online]. Music. J. Aust. Counc. Orff Schulwerk 15 (1), 17–21. ISSN: 1320-078X. Availability: http://search.informit.com.au/documentSummary;dn=443316633789115;res=IELHSS [cited 13 Sep 12].
Smith, Z.M., Delgutte, B., Oxenham, A.J., 2002. Chimaeric sounds reveal dichotomies in auditory perception. Nature 416 (6876), 87–90.
Spahr, A.J., Dorman, M.F., 2004. Performance of subjects fit with the Advanced Bionics CII and Nucleus 3G cochlear implant devices. Arch. Otolaryngol. Head Neck Surg. 130 (5), 624–628.
Strait, D.L., Kraus, N., Skoe, E., Ashley, R., 2009. Musical experience and neural efficiency: effects of training on subcortical processing of vocal expressions of emotion. Eur. J. Neurosci. 29, 661–668.
Strait, D.L., Parbery-Clark, A., Hittner, E., Kraus, N., 2011. Musical training during early childhood enhances the neural encoding of speech in noise. Brain Lang. 123, 191–201.
Strait, D.L., O'Connell, S., Parbery-Clark, A., Kraus, N., 2013. Musicians' enhanced neural differentiation of speech sounds arises early in life: developmental evidence from ages 3 to 30. Cereb. Cortex. http://dx.doi.org/10.1093/cercor/bht103.
Tierney, A., Kraus, N., 2015. Synchronization and phonological timing skills: the precise auditory timing hypothesis (PATH). Front. Hum. Neurosci., in press.
Tierney, A., Krizman, J., Skoe, E., Johnston, K., Kraus, N., 2013. High school music classes enhance the neural processing of speech. Front. Psychol. 4, 855. http://dx.doi.org/10.3389/fpsyg.2013.00855.
Trainor, L.J., Gao, X., Lei, J., Lehotovaara, K., Harris, L.R., 2009. The primal role of the vestibular system in determining music rhythm. Cortex 45, 35–43. http://dx.doi.org/10.1016/j.cortex.2007.10.014.
Tranchant, P., Shiell, M.M., Peretz, I., Zatorre, R., 2013. Dancing in the deaf: beat entrainment through vibrotactile stimuli. In: Poster Presented at the Society for Music Perception and Cognition.
Volkova, A., Trehub, S.E., Schellenberg, G.E., Papsin, B.C., Gordon, K.A., 2013. Children with bilateral cochlear implants identify emotion in speech and music. Cochlear Implant. Int. 14 (2), 80–91.
Vongpaisal, T., Trehub, S.E., Schellenberg, E.G., 2006. Song recognition by children and adolescents with cochlear implants. J. Speech Lang. Hear. Res. 49, 1091–1103.
Wallin, N.L., 1991. Biomusicology: Neurophysiological, Neuropsychological, and Evolutionary Perspectives on the Origins and Purposes of Music. Pendragon, Stuyvesant.
Wiltermuth, S.S., Heath, C., 2009. Synchrony and cooperation. Psychol. Sci. 20 (1), 1–5.
Winkler, I., Háden, G.P., Ladinig, O., Sziller, I., Honing, H., 2009. Newborn infants detect the beat in music. Proc. Natl. Acad. Sci. U. S. A. 106, 2468–2471.
Zentner, M., Eerola, T., 2010. Rhythmic engagement with music in infancy. Proc. Natl. Acad. Sci. U. S. A. 107 (13), 5768–5773.