Event Dates

Tuesday, October 26, 2021 4pm to 5pm

Virtual Event
The perception of rhythmic patterns in auditory sequences is important for many species, from crickets and birds recognizing conspecific songs to humans perceiving phrase boundaries in speech or beat patterns in music. Humans can recognize the cadence of a friend's voice or the rhythm of a familiar song across a wide range of tempi, which shows that our perception of temporal patterns relies strongly on the relative timing of events rather than on specific absolute durations. This tendency is foundational to speech and music perception, but to what extent is it shared by other species? Growing evidence suggests that human rhythm perception relies on auditory–motor interactions even in the absence of movement. Given that vocal learning species have evolved neural adaptations for auditory–motor processing and communicate using acoustic sequences that are often rhythmically patterned, we hypothesize that such species are advantaged for flexible auditory rhythm pattern perception. Consistent with this "vocal learning and rhythmic pattern perception" hypothesis, we show that a vocal learning songbird robustly recognizes a basic rhythmic pattern (isochrony) independent of rate, far outperforming vocal non-learning rats tested in analogous prior research. Our findings pave the way for neurobiological studies to identify how the brain represents and perceives the temporal structure of auditory sequences.
