Scientists are discovering how our brains transform language into meaning. By developing new technology for visualising how the semantic system is distributed across the cerebral cortex, researchers at the Gallant Lab hope to help future patients with conditions such as ALS (which can lead to locked-in syndrome), dyslexia and autism.
A team of scientists from the Gallant Lab at the University of California, Berkeley, are aiming to create functional ‘semantic maps’ of our brains. The project’s lead scientist, Professor Jack Gallant, explains that their “goal was to build a giant atlas that shows how one specific aspect of language is represented in the brain, in this case semantics, or the meanings of words”. By mapping our semantic system, the scientists hoped to find out more about its functional and anatomical organisation – or what’s really going on inside our heads when we tune into the radio.
Seven English-speaking subjects were asked to listen to autobiographical stories from The Moth Radio Hour, a popular US radio show. Whilst the subjects happily eavesdropped on personal anecdotes, the scientists used functional magnetic resonance imaging (fMRI) to indirectly measure the subjects’ brain activity. By detecting changes in blood flow, oxygenation and volume, the scientists were able to work out which parts of the cerebral cortex were activated by different words.
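The core idea – learning how word features predict activity at each point in the cortex – can be illustrated with a toy ‘encoding model’. The sketch below is purely hypothetical (synthetic data, made-up dimensions, simple ridge regression), not the Gallant Lab’s actual analysis pipeline, but it shows the flavour of fitting one set of weights per voxel:

```python
# Illustrative sketch only: a voxel-wise encoding model on synthetic data,
# not the Gallant Lab's actual pipeline. Each "voxel" learns how a set of
# word features predicts its simulated fMRI response.
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_features, n_voxels = 200, 10, 50

# Hypothetical semantic features of the words heard at each timepoint
X = rng.normal(size=(n_timepoints, n_features))

# Ground-truth weights: each voxel responds to its own mix of features
W_true = rng.normal(size=(n_features, n_voxels))

# Simulated fMRI responses = features @ weights + measurement noise
Y = X @ W_true + 0.1 * rng.normal(size=(n_timepoints, n_voxels))

def fit_ridge(X, Y, alpha=1.0):
    """Closed-form ridge regression: one weight vector per voxel."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(k), X.T @ Y)

W_hat = fit_ridge(X, Y)

# With low noise, the recovered weights track the true ones closely
r = np.corrcoef(W_hat.ravel(), W_true.ravel())[0, 1]
print(round(r, 2))
```

Plotting each voxel’s fitted weights back onto the cortical surface is, in essence, how a ‘semantic map’ is drawn.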
Peering into the hidden recesses of the brain, the researchers uncovered some intriguing results. For example, the word ‘victim’ lights up the same part of the brain as ‘murdered’. The scientists also discovered that the subjects had ‘remarkably similar semantic maps’, probably due to their shared English language. As a result, the researchers are now planning further studies to find out whether speakers of other languages have different semantic maps, owing to differences in language and culture. Finally, and rather surprisingly, the study suggests that language is processed in both sides of the brain – not just the left hemisphere, which is often touted as solely responsible for it.
Sounding like something straight out of science fiction, scientist Alexander Huth believes their new “approach could be used to decode information about what words a person is hearing, reading, or possibly even thinking”. Whilst the team still have far to go before a general-purpose language decoder hits the shelves, the research could help improve our understanding of communication disorders such as dyslexia and autism, as well as inform new treatments to help patients recover from brain injuries and strokes. Jack Gallant and his team are planning more research to verify their semantic atlas. They are also keen to apply the approach to mapping other kinds of language information, such as phonemic information, syntactic information and narrative.
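Decoding of the kind Huth describes can be pictured as running an encoding model in reverse: given an observed brain response, ask which candidate word’s predicted response matches it best. The toy below is a hypothetical illustration of that idea (random weights, a four-word vocabulary), not the published decoder:

```python
# Illustrative sketch only: "decoding" as model inversion, with made-up
# weights and a tiny vocabulary. Not the published decoding method.
import numpy as np

rng = np.random.default_rng(1)
n_features, n_voxels = 10, 50

# Pretend these encoding weights were already learned from training data
W = rng.normal(size=(n_features, n_voxels))

# Hypothetical feature vectors for a small candidate vocabulary
vocab = {word: rng.normal(size=n_features)
         for word in ["victim", "murdered", "radio", "story"]}

def decode(response, vocab, W):
    """Return the word whose predicted brain response best matches."""
    def score(features):
        predicted = features @ W
        return np.corrcoef(predicted, response)[0, 1]
    return max(vocab, key=lambda word: score(vocab[word]))

# Simulate the noisy brain response to hearing "radio", then decode it
observed = vocab["radio"] @ W + 0.1 * rng.normal(size=n_voxels)
print(decode(observed, vocab, W))
```

Scaling this from four candidate words to open-ended natural language is, of course, a large part of why a practical decoder remains some way off.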
Learning how to read the brain is certainly an exciting prospect, and the basic building blocks of language seem the perfect place to start.