The perception of musical rhythm is extremely complex, involving far more than just the auditory cortex. Using neuroimaging techniques and behavioural testing, researchers have begun to examine many facets of rhythm processing, including the neural correlates of rhythm perception, the developmentally and culturally driven changes in rhythm perception, and the limitations and errors that can occur during rhythmic processing. Functional neuroimaging has identified the cerebellum, the olivary nuclei of the brainstem, and the prefrontal and parietal cortices as the main brain regions associated with rhythm perception[1]. Behavioural studies in infants show that the ability to perceive rhythmic patterns appears extremely early, although infants appear to differ fundamentally from adults in the cognitive strategies employed during musical rhythm processing[2,3]. The early emergence of rhythm perception in infants fuels the controversial question of whether musical rhythm perception is an evolved trait (e.g. a byproduct of a more critical process, such as the circadian clock or linguistic rhythmicity) or simply a neural fluke[4]. Much of the scientific evidence suggests that musical rhythm processing is indeed part of a much larger and more crucial process, namely language processing and generation. Thus, the perception of rhythm is not merely a component of musicality that enables us to enjoy music; an inability to perceive rhythm may have consequences in the linguistic domain as well.
1. The Neural Basis of Musical Rhythm Perception
Many brain regions are involved in the perception of musical rhythm, each engaged under different conditions and for different types of rhythms. For instance, different neural pathways are activated for rhythms of differing complexity[4]. The perception of musical rhythm also elicits characteristic gamma-band brain-wave activity, which is used to study the neuroelectric correlates of rhythm perception and anticipation[5].
1.1 The Olivocerebellar Pathway
[Figure: fMRI of olivocerebellar activation during isochronous rhythm perception. Source: Teki, Grube, Kumar, & Griffiths, 2011]
The olivocerebellar system is involved in the perception, but not the motor performance, of temporal sequences such as musical rhythms[4]. Since the mid-to-late twentieth century, it has been hypothesized that the cerebellum and the inferior olive (the sole source of climbing-fibre input to the Purkinje cells) may function together as an internal timekeeping device[6,7,8]. Recordings from Purkinje cells in the rodent cerebellum show that the olivocerebellar system exhibits an intrinsic rhythmicity that can be reset by strong extrinsic stimuli through the activity of the inferior olive[7]. Neuroimaging studies suggest that this timekeeping system may be co-opted for musical rhythm processing. For example, functional magnetic resonance imaging (fMRI) has shown that areas typically important in motor control, such as the cerebellum, are involved in both the processing/encoding and the retrieval of auditory rhythms[4,9,10]. Behaviourally, the involvement of the motor system is best illustrated by our urge to move our bodies to music, especially when a song has a strong beat. From these findings, it has been suggested that musical rhythm may actually be processed and encoded as motor movements in a supra-modal motor brain system[9]. Overall, the olivocerebellar pathway has been shown to be activated by simple steady-beat rhythms, also known as isometric or isochronous rhythms[4].
1.2 The Striato-thalamo-cortical Pathway
Functional neuroimaging has shown that the striato-thalamo-cortical pathway is recruited for more complex rhythms, such as those you would typically hear in music[4]. Studies have also shown activation in the basal ganglia during perception of an isometric beat within a sequence of temporal events[4,11,12]. Specifically, the striatum seems to be most active when the brain is predicting the timing of the next temporal event on the basis of past regular beats, but it contributes only minimally to the initial detection/establishment of rhythmicity[11]. The importance of the basal ganglia in musical rhythm perception is further illustrated by the impaired rhythm discrimination of patients with Parkinson’s disease (and thus basal ganglia dysfunction)[13]. The thalamus is involved in the very early stages of rhythm perception, as the gateway for auditory signals travelling from the cochlea to the cortex[14]. It is at the level of the thalamus, along with the midbrain superior colliculi, that pre-processing of musical properties begins; only after the auditory features have been extracted can segregation of melodic, rhythmic, timbral, and spatial components occur[14]. Of the cortical activations, the most distinct and consistent is in the premotor cortex[4,9,10]. There is some evidence of differential activation of the premotor cortex depending on whether subjects are consciously encoding a musical rhythm. Chen and colleagues found that the ventral premotor cortex was activated only when subjects were told before the task to tap along to the rhythm (i.e. listening with anticipation and concurrent motor reproduction), while the dorsal premotor cortex was active during movement synchronization as well as in response to the more complex components of the rhythmic stimuli, such as meter[10].
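As a purely conceptual illustration of the kind of prediction described above (a toy sketch, not a model of striatal computation and not taken from the cited studies), the snippet below predicts when the next beat should occur by extrapolating the average of the past inter-onset intervals. The function name and example values are hypothetical.

```python
import numpy as np

def predict_next_onset(onsets):
    """Toy beat prediction: given the onset times (in seconds) of past
    regular beats, predict the next onset by extrapolating the mean
    inter-onset interval (the established "tempo")."""
    onsets = np.asarray(onsets, dtype=float)
    intervals = np.diff(onsets)        # time between successive beats
    return onsets[-1] + intervals.mean()

# Four beats of a steady 120 bpm pulse (0.5 s apart);
# the fifth beat is expected at 2.0 s.
print(predict_next_onset([0.0, 0.5, 1.0, 1.5]))  # -> 2.0
```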
1.3 Brainwave Activation: Evoked Gamma Activity
[Figure: Method of calculating gamma-band activity for the evoked and induced conditions. Source: Snyder & Large, 2005]
Studies using electroencephalography (EEG) and magnetoencephalography (MEG) have identified different event-related potentials (ERPs) with short, middle, and long latencies that correspond to the onset of auditory stimuli[15]. In particular, rhythmic patterns elicit high-frequency gamma-band (20-60 Hz) activity in the auditory cortex[16]. The elicited gamma-band activity (GBA) can be either evoked or induced; these two types of oscillatory potentials differ in their phase relationship to the auditory stimulus[5]. Evoked GBA is considered to be phase-locked, meaning that it is temporally fixed to the oscillatory rhythm of the stimulus, independent of its tempo[17]. Evoked GBA is therefore associated with the perception of rhythmic events as they happen, and the omission of an expected tone greatly diminishes it[15]. Based on their common origin and similar peak latencies, evoked GBA may also be associated with the middle-latency component of rhythmic ERPs[5]. Induced GBA is phase-independent of stimulus onset, meaning it is more temporally variable and may be associated with the less stimulus-driven components of rhythmic processing, such as anticipation and expectation of the subsequent stimulus onset[5]. Thus, the omission of a predicted tone from a rhythmic sequence does not diminish induced GBA[15].
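To make the evoked/induced distinction concrete, here is a minimal Python sketch of one standard way of separating phase-locked (evoked) from non-phase-locked (induced) gamma-band power across trials: evoked power survives averaging across trials, while induced power is what remains after the evoked component is removed. The function name, array shapes, and filter settings are illustrative assumptions, not the exact analysis pipeline of the cited studies.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def gamma_power(trials, fs, band=(20.0, 60.0)):
    """Separate evoked from induced gamma-band activity.

    trials : array of shape (n_trials, n_samples), epochs aligned to tone onset
    fs     : sampling rate in Hz
    band   : gamma band of interest (20-60 Hz, as in the text)

    Returns (evoked_power, induced_power), each of shape (n_samples,).
    """
    # Band-pass filter each trial in the gamma range.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=-1)

    # Evoked GBA: phase-locked activity survives averaging across trials,
    # so take the power envelope of the trial-averaged signal.
    evoked = np.abs(hilbert(filtered.mean(axis=0))) ** 2

    # Total single-trial power: envelope per trial, then average across trials.
    total = (np.abs(hilbert(filtered, axis=-1)) ** 2).mean(axis=0)

    # Induced GBA: the non-phase-locked power that remains
    # after removing the evoked component.
    induced = total - evoked
    return evoked, induced
```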
2. Musical Rhythm Perception during Early Development
The ability to detect and respond to musical rhythm emerges quite early in child development. The coupling of motor movement to a musical rhythm has been shown to emerge in preverbal infants as young as 5 months old[13,18]. These motor responses are specific to simple musical rhythms, including isochronous drumbeats, but do not extend to spoken language (even though speech has a subtle rhythmicity of its own), suggesting that the ability to detect rhythms in early infancy is limited to more regular, steady beats[18]. More consistent motor responses, and the ability to change movement speed with a perceived change in the tempo of a song, do not emerge until the baby is around 9 months old[18]. It is important to note that the motor responses discussed in these studies do not refer to movement that is synchronized to the stimulus (i.e. the baby is not moving to the beat, although the type of movement is rhythmic), as the degree of motor control needed is not achieved until the child is in kindergarten[18]. The early emergence of motor coordination observed with rhythm perception coincides with the early development of the vestibular system[18]. Since the vestibular system is also involved in the transfer of rhythmic auditory input to locomotive output, the observed motor response may be indicative of a vestibular-auditory interaction that needs to be established early for the development of proper musical perception[2].
2.1 Rhythm and Motor Activity: Innate vs. Entrained Response
Whether the motor engagement observed with musical rhythm is innate or a socially/behaviourally entrained response is a topic of debate. A study in preschool children has shown that a social context encouraging the perception and reproduction of musical rhythms increases children’s motor coordination in a drumming task[19]. Also, engagement in musically rhythmic tasks is associated with positive emotional affect, which may motivate and thus promote future motor coordination to metrically organized patterns[18]. However, there is also evidence suggesting that the ability to perceive auditory rhythms is innate. EEG recordings show that neonates only 2-3 days old exhibit expectancy of succeeding rhythmic cycles after exposure to an isochronous rhythm, without the need for cues such as the accenting or stressing of certain beats[20]. Also, given the early development of the vestibular system and the early emergence of locomotive rhythmic engagement, it is likely that the motor activity is automatic rather than acquired through effortful entrainment processes, which would take much longer to establish[2,18]. Finally, the amount of previous musical exposure does not correlate with rhythmic motor responses in infancy[18], which suggests that the ability to detect rhythm is innate and that responses are uncoordinated only because of the limited muscle control of infancy.
2.2 Constraints on Rhythm Perception: The Effects of ‘Enculturation’
A person’s musical perception is shaped by the interaction of intrinsic constraints that outline musical preference[21,22] and extrinsic culture-specific exposure that determines cognitive schemas of musical structure[23]. The acquisition of musical rhythm perception, like that of musical pitch perception, is likewise a product of these two factors: the perception of the basic components of rhythm and meter may be innate or developed early in life, but the responses these rhythmic structures elicit depend on past experience[3]. Young infants and even newborns exhibit a preference for the musical meters commonly used in their own culture’s music[24]. This preference may be acquired in utero, since the fetus is exposed to auditory stimulation even prior to birth[25].
Music from different cultures differs in its underlying interval ratio complexity, which is the ratio of long to short drumbeats per time interval[3]. Western music typically contains nearly isochronous beats and thus has a low complexity ratio (e.g. a 4/4 time signature, or a 2:1 interval ratio). Music from other cultures, such as Bulgarian or Macedonian music, has a medium interval ratio complexity (e.g. a 7/8 time signature, or a 3:2 interval ratio)[3]. At six months, Western infants can still discriminate rhythmic disruptions in either a 3:2 or a 2:1 context, but by the time they reach one year of age they can discriminate only rhythms with simple 2:1 ratios, suggesting that cultural effects on metrical perception begin to emerge when the child is between 6 and 12 months old[26,27]. However, prior to the emergence of these enculturation effects, when the effect of temporal interval ratio complexity on the ability to discriminate rhythmic patterns was tested in 5- and 7-month-old infants, increased complexity of the musical rhythm correlated with decreased accuracy in the discrimination task[3]. This finding contrasts with adults, for whom task accuracy is determined by cultural background (i.e. by the amount of exposure to the rhythm), and suggests that rhythm complexity constrains accurate rhythm perception before enculturation biases develop[3].
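As a small worked example of how time signatures map onto the long:short interval ratios mentioned above (the specific eighth-note groupings are assumed for illustration and are not taken from the cited studies):

```python
# Illustrative only: a 4/4 bar grouped as 2+2+4 eighth notes gives a 2:1
# long:short ratio (low complexity); a 7/8 bar grouped as 2+2+3 eighth notes
# gives a 3:2 ratio (medium complexity).
from fractions import Fraction

def interval_ratio(groupings):
    """Ratio of the longest to the shortest inter-beat interval in a bar,
    where the bar is described as groups of equal subdivisions."""
    return Fraction(max(groupings), min(groupings))

print(interval_ratio([2, 2, 4]))  # -> 2    (i.e. 2:1, typical Western meter)
print(interval_ratio([2, 2, 3]))  # -> 3/2  (i.e. 3:2, e.g. Bulgarian 7/8)
```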
3. Beat Deafness
Beat deafness is a form of congenital amusia characterized by an inability to perceive or produce musical rhythm despite seemingly normal hearing, cognitive, and motor function[28,29]. The first documented case study of such an individual, under the alias “Mathieu”, reported that he was unable to synchronize his movements to the beat of a range of musical selections. However, he was able to coordinate his motor activity with a metronome, suggesting that the “deafness” is actually a difficulty in detecting the rhythm beneath the more complex texture of music[28]. His beat deafness does not interfere with his normal cognitive functioning or his linguistic abilities. The discovery of this pathological condition is very recent, and the etiology of beat deafness is not yet understood.
4. The Evolutionary Origin of Rhythmicity and its Significance
The ability to process rhythm in our surroundings is critical, especially for the acquisition of language and for musical comprehension and production. Both domains require the organization of sensory information into structured rhythmic sequences, and the retention of this rhythmic information until it is used to reproduce or mimic the same rhythm later[9]. Evolutionarily speaking, it is still unclear how the perception of musical rhythm came to be. One possibility is that the neural wiring underlying rhythm perception emerged as a direct target of natural selection for music; another is that it evolved as a by-product of some other function, such as language or the circadian clock, and simply happens to be relevant for music processing[18,30]. The link between language and music processing is well established (see the related Wikidot page, The Association between Language and Music Processing, for more details), though the lack of linguistic deficits seen in beat-deaf individuals shows that the link is likely not straightforward.
Yes! I found you! As always your work is fabulous. I wanted to give you kudos on mentioning Isabelle Peretz's work. I met her a few weeks ago and her work is reaaaallly cool. She just started using whole-genome sequencing to look at possible genetic markers for amusia and using DTI to map white matter tract differences. I asked her a question about rap music (since verbal fluency is tied to beat perception more strongly in rap). Anyway, it would be cool to expand on the section a bit if you have time - seeing what happens when comprehension doesn't happen properly gives you insight into the mechanism when it works well. Maybe just adding a video of Mathieu doing a task - I remember being blown away when I first saw it because it really makes you understand what's happening. Cheeeeers!
Hey there groupmate :p
Other than the last image not showing, this looks awesome!! Good job! :)