It’s remarkable that, amid the constant onslaught of incoming and outgoing information, the brain can process and differentiate all these stimuli into coherent pieces of the puzzle we call life. It can’t always multitask and has definite limitations, but have you ever wondered how it keeps up? An exciting study shows how the brain responds to the tiny differences in verbal communication that make all the difference!
Researchers from the Max Planck Institute for Psycholinguistics partnered with Radboud University to understand what makes the brain receive one sentence differently from a very similar one. Until now, this has largely remained a mystery; with so few external cues in speech, scientists weren’t sure how the brain could work with such abstractions.
Sentences were an ideal way to test how the brain processes speech because many sentences are built from the same smaller segments, such as words and phrases, yet the way those words are arranged makes a huge difference to a sentence’s actual meaning. To discover how the brain manages such a vast network of speech integration, the researchers recorded EEG activity while simple sentences were spoken.
These simple sentences had an identical number of syllables and often the same meaning, to mimic what occurs in everyday speech. For instance, “the vase is red” is similar to “the red vase.” After hearing these sentence configurations, adult participants pushed buttons in response to tasks: one asked whether they had heard a complete sentence or a phrase, and another asked whether the image shown next correctly matched the color or the object in the spoken sentence.
The EEG recordings showed that the brain responded differently to phrases and sentences: neurons fired in distinct patterns for these two types of word configurations, with differences in neural connectivity and timing.
Knowing that timing plays an integral role in understanding speech is an important discovery. Despite our ever-growing reliance on computer interfaces, they still can’t comprehend spoken language as well as the human brain! The research team hopes to build on these findings in future studies to further unravel the complexities of spoken language.
Fan Bai, Antje S. Meyer, Andrea E. Martin. Neural dynamics differentially encode phrases and sentences during spoken language comprehension. PLOS Biology, 2022; 20(7): e3001713. DOI: 10.1371/journal.pbio.3001713