What’s Going On In Your Child’s Brain When You Read Them A Story?


From National Public Radio:

“I want The Three Bears!”

These days parents, caregivers and teachers have lots of options when it comes to fulfilling that request. You can read a picture book, put on a cartoon, play an audiobook, or even ask Alexa.

A newly published study gives some insight into what may be happening inside young children’s brains in each of those situations. And, says lead author Dr. John Hutton, there is an apparent “Goldilocks effect” — some kinds of storytelling may be “too cold” for children, while others are “too hot.” And, of course, some are “just right.”

Hutton is a researcher and pediatrician at Cincinnati Children’s Hospital with a special interest in “emergent literacy” — the process of learning to read.

. . . .

While the children paid attention to the stories, the MRI machine scanned for activation within certain brain networks, and for connectivity between those networks.

“We went into it with an idea in mind of what brain networks were likely to be influenced by the story,” Hutton explains. One was language. One was visual perception. The third was visual imagery. The fourth was the default mode network, which Hutton calls “the seat of the soul, internal reflection — how something matters to you.”

. . . .

In the audio-only condition (too cold): language networks were activated, but there was less connectivity overall. “There was more evidence the children were straining to understand.”

In the animation condition (too hot): there was a lot of activity in the audio and visual perception networks, but not a lot of connectivity among the various brain networks. “The language network was working to keep up with the story,” says Hutton. “Our interpretation was that the animation was doing all the work for the child. They were expending the most energy just figuring out what it means.” The children’s comprehension of the story was the worst in this condition.

The illustration condition was what Hutton called “just right.”

When children could see illustrations, language-network activity dropped a bit compared to the audio condition. Instead of only paying attention to the words, Hutton says, the children’s understanding of the story was “scaffolded” by having the images as clues.

“Give them a picture and they have a cookie to work with,” he explains. “With animation it’s all dumped on them all at once and they don’t have to do any of the work.”

Link to the rest at NPR