
Podcasts, audiobooks, and digital storytelling in general have surged in popularity over the last few years, making commutes more tolerable and offering an escape even when we don’t have a book or movie at hand.
Researchers at the University of California, Berkeley, were curious about what happens in our brains when we listen to podcasts, so they had volunteers listen to stories from “The Moth Radio Hour” while their brain activity was recorded by an fMRI machine.
This experiment has helped them gain insight into the so-called “semantic system” in the human brain. The results could one day be translated into treatments for injuries that affect the ability to speak.
Alex Huth, the study’s lead author and one of the volunteers, listened to more than two hours’ worth of podcast episodes while positioned inside an fMRI machine. This helped him and his team create a map of sorts that shows how words are interpreted in the brain.
“Our subjects love to be in this experiment because they can just lie there and listen to these really interesting stories,” explained Huth.
However, because the subjects kept laughing and spoiling the fMRI data with their movement, the researchers had to 3-D-print personalized “head cases” for each one to hold their heads still.
The brain’s outer layer of tissue – the cerebral cortex – is known for its role in some of our higher functions, such as language abilities. Thanks to the fMRI recordings, researchers were able to see how this brain area reacted to the storytelling.
But instead of just observing which parts of the brain lit up, they tried to match the active region to particular words that played when that part fired up.
This process allowed the researchers to create a map of word clusters linked to activity in various parts of the brain. Contrary to previous beliefs, they found that both halves of the brain were involved in the semantic system, with more than 100 areas of the cerebral cortex activating during the experiments.
If a word had multiple meanings or was tied to a network of memories, more parts of the brain flared up. Take the word “dog,” for example: when we hear it, we automatically picture one, recalling its smell and the feel of its fur. But we may also think about our childhood dog and the memories associated with it.
Beyond showing which aspects of language map to which regions of the brain, the study allowed researchers to zero in on a particular area or category of words to see how it is processed by the brain.