U of T-led study looks at differences in brainwaves when people speak in their mother tongue or a second language
When two people converse, their brainwaves get in sync.
According to a new U of T-led study, the same happens when people speak their non-native tongue, just in different areas of the brain.
“You can tell two people are having a conversation just by looking at their brainwaves, because they align with each other,” says Alejandro Pérez, a postdoctoral researcher in the Centre for French and Linguistics and the department of psychology at U of T Scarborough.
The study adds to a growing body of research on interbrain neural coupling, which is essentially how two people’s brainwaves synchronize when they hold a conversation.
Past research co-authored by Pérez found that when two people are talking, neurons firing in specific areas of the speaker’s brain show activity patterns that correspond to those of neurons in specific areas of the listener’s brain. Not only are specific areas of the brain activated, the timing of that activation also follows a shared pattern.
For this particular study, an international team of researchers led by Pérez wanted to see if and where in the brain this synchronization takes place when two people are speaking in their non-native language.
To do this, researchers recorded the brain activity of 60 participants (all native Spanish speakers with some proficiency in English) using electroencephalography (EEG) while conversations took place. Half of the conversations took place entirely in Spanish, with the other half entirely in English.
When participants talked in their second language, the activations took place in different areas of the brain, researchers found.
“The language used influenced the alignment of brainwaves between those having the conversation, and this suggests that effective communication could be based on this interbrain neural coupling,” says Pérez.
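The article does not detail how the researchers quantified this alignment, but a common measure of interbrain coupling in EEG work is the phase-locking value (PLV): it is close to 1 when two signals keep a constant phase relationship and near 0 when they are unrelated. A minimal sketch with synthetic signals (the signal names, sampling rate, and frequencies below are illustrative, not taken from the study):

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value between two equal-length signals.

    Near 1 when the instantaneous phase difference stays constant
    (strong coupling); near 0 for unrelated signals.
    """
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Two toy "EEG" traces: a shared 10 Hz alpha-band rhythm plus independent noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 250)                      # 10 s sampled at 250 Hz
shared = np.sin(2 * np.pi * 10 * t)
speaker = shared + 0.5 * rng.standard_normal(t.size)
listener = shared + 0.5 * rng.standard_normal(t.size)
unrelated = rng.standard_normal(t.size)

print(phase_locking_value(speaker, listener))   # high: the traces are coupled
print(phase_locking_value(speaker, unrelated))  # low: no shared rhythm
```

In a real experiment the comparison would be made per frequency band and per electrode pair, which is how a study can localize coupling to different brain areas for native versus non-native conversation.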
Pérez says the difference in pattern could come down to attention. People conversing in a non-native language attend to the message in smaller chunks, since they are working harder to precisely understand and produce every single word.
He adds that emotional attachment to certain words may also differ between native and non-native speakers, or that the capacity to form a mental image of words may differ between someone conversing in their first language and someone conversing in their second. Pérez says non-native speakers also learn the language later in life, so they never quite master its structures the way someone who grew up with it does.
“This work gets at a foundational issue, which is pretty remarkable, and that is how our brains seem to synchronize when we hold a conversation,” says Pérez’s co-supervisor Philip J. Monahan, assistant professor at the Centre for French and Linguistics at U of T Scarborough.
“In the past we mostly studied production and perception of speech independently of each other, often using very simple syllables or sentences. Rarely did we look at the link between the two, especially in a naturalistic way using entire conversations.”
Monahan, who is an expert on psycho- and neurolinguistics, says the research could be particularly important in multicultural societies, where many people speak more than one language or switch between one language at home and another in public.
“Being able to understand how the brain goes between different linguistic structures and copes with the differences could really help us understand how the brain supports multiple linguistic systems in the first place,” he says.
While this study looked at two languages that are relatively similar, Monahan says it would be interesting to see how the brain switches between vastly different languages, say English and Tamil.
Pérez says it’s conceivable that measuring brain activity could one day offer an objective measure of the quality of a conversation. It may also be able to tell us whether both parties fully understood each other based on how their brainwaves synchronize during the conversation.
“Imagine two people talking over Skype and they’re hooked up to a simple EEG device,” he says. “The data gathered may offer some indication about the quality of the communication between two people depending on the alignment of their brain activity.”
The research was supported by the Natural Sciences and Engineering Research Council (NSERC) and the Social Sciences and Humanities Research Council (SSHRC).