Electrophysiological Correlates of Semantic Dissimilarity Reflect the Comprehension of Natural, Narrative Speech
- Journal: Current Biology
- DOI: 10.1016/j.cub.2018.01.080
- Affiliations: 10
- Authors: 5
Research Highlight
Spotting light-bulb moments in the brain
A brain signal that reveals when a person has understood natural speech could provide clues to consciousness in coma patients and allow early detection of dementia.
People speak at rates of 120 to 200 words per minute. We barely notice our brains computing this deluge of dialogue, instantly interpreting syllables and sentences so we can understand. But studying how our brains process the meaning of words in context is tricky.
A team led by researchers from Trinity College Dublin combined brain monitoring technology with machine learning to help identify key moments of understanding when listening to narrative speech.
Electroencephalography (EEG) monitors brain activity through electrodes placed on the scalp that detect electrical signals as the brain responds to stimuli such as sounds, sights and smells. It has been used to study the brain’s response to incongruous words in a sentence, but not to track responses to continuous natural speech.
The researchers used a computer model to evaluate the significance of words in audiobooks based on how each word related to previous ones in a sentence or paragraph. Participants then listened to the audiobooks while their brain activity was monitored.
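The idea behind the model can be sketched as a semantic-dissimilarity score: a word that fits its context has a vector close to the average of the preceding words' vectors, while an incongruous word does not. The snippet below is a minimal illustration assuming word2vec-style embeddings; the toy vectors, vocabulary, and the exact similarity measure (cosine) are invented for demonstration and are not taken from the study itself.

```python
import math

# Toy stand-ins for learned word embeddings (illustrative values only,
# not from the actual word2vec model used in the study).
EMBEDDINGS = {
    "cat":   [0.9, 0.1, 0.0],
    "sat":   [0.1, 0.9, 0.2],
    "dog":   [0.8, 0.2, 0.1],
    "piano": [0.1, 0.1, 0.95],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def semantic_dissimilarity(word, context):
    """1 minus the similarity between a word's vector and the
    average vector of the words that preceded it."""
    vectors = [EMBEDDINGS[w] for w in context]
    avg = [sum(vals) / len(vectors) for vals in zip(*vectors)]
    return 1.0 - cosine(EMBEDDINGS[word], avg)

# A word that fits the context ("dog" after "cat sat") scores lower
# than an incongruous one ("piano" after "cat sat").
context = ["cat", "sat"]
print(semantic_dissimilarity("dog", context)
      < semantic_dissimilarity("piano", context))
```

Words with high dissimilarity scores are the ones predicted to evoke the strongest brain responses in the experiment described below.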
Words that differ most in meaning from their preceding context should evoke the strongest EEG response, and indeed large spikes occurred within a few hundred milliseconds of the words that the model flagged. This suggests that our brains almost instantaneously calculate each word’s semantic similarity to previous words. The strongest signals appeared over centro-parietal electrode sites, above a brain region that processes sensory information.
The signals vanished when participants listened to the audiobook in reverse so that it sounded nonsensical. The team also took EEG readings while background noise made listening difficult and when the listener had to distinguish between two competing speakers, simulating real-world situations such as a noisy bar. In both cases, weaker signals mirrored the listeners’ impaired comprehension.
The researchers note that the signal is highly sensitive to “whether subjects understand the speech they hear and whether they are paying attention to that speech,” adding that, “the time taken to semantically process different words necessarily depends on the words themselves, their context and listening conditions.”
This relatively cheap and simple test could help doctors detect early signs of dementia and provide vital information to families and caregivers. Ed Lalor of Trinity College Dublin said, “Potential applications include testing language development in infants, or determining the level of brain function in patients in a reduced state of consciousness.”
Envisaging a high-tech future, Lalor believes that wearable EEG devices could instantly confirm whether a person in a particularly dangerous job, such as a soldier or pilot, has understood instructions.
References
- Current Biology 28, 803–809 (2018). doi: 10.1016/j.cub.2018.01.080