According to a new study, speakers of tonal and non-tonal languages may perceive music differently.
Research conducted by The Music Lab – a collaboration between the University of Auckland and Yale University – revealed that tonal language speakers are better at differentiating subtly contrasting melodies, while non-tonal speakers are more attuned to rhythmic differences.
The study compared almost half a million people, speaking 54 different languages across 203 countries, and concluded that the advantage experienced by each group – melodic perception for tonal speakers and rhythmic perception for non-tonal speakers – was equivalent to approximately half the cognitive boost gained from music lessons.
Dr Courtney Hilton, a cognitive scientist at Waipapa Taumata Rau (University of Auckland) who co-led the research, says: “We grow up speaking and hearing one or more languages, and we think that experience not only tunes our mind into hearing the sounds of those languages but might also influence how we perceive musical sounds like melodies and rhythms.”
Focusing on music and psychology, The Music Lab is a University of Auckland and Yale University collaboration investigating how the human mind creates and perceives music.
The researchers first investigated pitch and inflection. Non-tonal languages like English often use pitch to inflect emotion or to signify a question; however, raising or lowering pitch never changes the meaning of a word. By contrast, tonal languages like Mandarin use pitch patterns to distinguish syllables and words.
Participants included speakers of Mandarin, Cantonese, Thai and Vietnamese (tonal) and speakers of English (non-tonal), who were given musical tasks of varying difficulty requiring them to distinguish differences in melody and rhythm. The musical examples included some mismatched beats and mis-tuned vocals.
Results suggested that the type of language spoken impacted melodic and rhythmic perception, but did not affect people’s capacity to recognise whether vocals were in tune.
Research co-leader Jingxuan Liu explained: “Native speakers across our 19 tonal languages were better on average at discriminating between melodies than speakers of non-tonal languages, and similarly, all 19 were worse at the beat-based task.”
The evidence that tonal speakers are at a slight disadvantage when discerning rhythm came as a surprise to the researchers, but it could be explained by a heightened sensitivity to pitch.
“It’s potentially the case that tonal speakers pay less attention to rhythm and more to pitch, because pitch patterns are more important to communication when you speak a tonal language,” says Hilton.
Questions remain as to whether different languages inherently affect musicality, and the team conclude that environmental and cultural variables must be taken into account.
“Prior studies mostly just compared speakers of one language to another, usually English versus Mandarin or Cantonese,” says Liu. “English and Chinese speakers also differ in their cultural background, and possibly their music exposure and training in school, so it’s very difficult to rule out those cultural factors if you’re just comparing those two groups.”
“We still find this effect even with a wide range of different languages and with speakers who vary a lot in their culture and background, which really supports the idea that the difference in musical processing in tonal language speakers is driven by their common tonal language experience rather than cultural differences,” she adds.
Despite promising evidence of variation in musical processing ability between speakers of tonal and non-tonal languages, the researchers maintain that there is scope for further study of these smaller-scale patterns and environmental variables. Similarly, more research is needed to understand the cognitive mechanisms and developmental pathways behind the differences.