Early Childhood Q&A

Adjectives, Social Cues, Screens and More: What Scientists Know About Baby Brains

By Lillian Mongeau — March 13, 2015

Sarah Roseberry Lytle, the director of outreach at the Institute for Learning and Brain Sciences at the University of Washington, hosted a popular session on early cognitive development at SXSWedu in Austin last week.

The national education conference is known for drawing people interested in science and technology, so Roseberry Lytle’s focus on brain science attracted a crowd. Twitter users, who were legion at the conference, used the hashtag #babybrains to comment on her session.

“The type of language we use (quality) is just as important as the quantity to infant language acquisition,” tweeted attendee @LanetteW.

“Such a great panel—amazing differences between mono & bilingual cognitive learning,” tweeted another under the handle @TurnAround.

Roseberry Lytle earned her Ph.D. in Developmental Psychology at Temple University, where she worked with Kathy Hirsh-Pasek. Her research focused on the role of social interactions in infants’ and toddlers’ language learning and how social cues might help toddlers learn from screen media. We caught up with her after her session to ask her some more questions about the research coming out of the University of Washington institute.

The interview below has been edited for length and clarity.

There’s been a lot of discussion recently about the importance of talking to kids. Can you tell me, in a nutshell, what the research says about small children and language?

Well, language is one of those huge cognitive advances that children acquire during the first few years of life, and so we know that language is very important. Children learn language by hearing language and by participating in conversations with caregivers and their families. It’s not just the quantity of language that children are hearing that matters, though. What research is telling us now is that the quality of the language children hear is incredibly important, perhaps even more important than the quantity.

When we talk about quality of language, we’re talking about something we call “infant-directed speech” or “parentese” or “motherese.” It’s that sing-song tone of voice that we tend to use when we talk to infants and children, and we know it really helps kids learn language. In fact, it’s not just this funny speech signal that we’re using. Kids actually learn more words when they hear it, and they babble more, and they show larger vocabularies later on in life.

And when you think about the quality of language, you’re also thinking about the kinds of words that we’re using. It’s important not just that we use nouns and object words, for example. We don’t want to just label the things around us. We want to talk about them. So instead of the table and chair, we want to talk about the round, brown table and the tall chair that’s very sturdy.

There’s been a lot of heartburn over whether or not low-income parents are doing this, or are able to do this. And if we’re saying they should do it, are we imposing middle-class values on a group of people who don’t want those values? Where do you come down on that from a scientific standpoint?

Parents and caregivers have all the tools that they need to do this. It’s not the case that you have to go out and buy any special programs or tools to really help you do this; it’s simply talking about things in your environment and really providing a little bit of narration for kids. Talk about the sounds you hear. So the bus goes “vroom, vroom.” Point out the road signs you see along the way, the things that you’re passing. That’s a building. That’s a bicycle. Where does the bicycle go? Have you ever ridden a bicycle? Do you remember the last time you rode a bicycle? Where do you think the man on the bicycle is going? Asking kids to make predictions like that really will help their cognitive development in addition to the language development.

Babies can’t talk until they’re about 1 1/2 years old. Is it still important that their parents talk to them before then?

Absolutely. We know that kids are starting to pick up on language even before day one. We know that hearing begins in the third trimester of pregnancy, and if you test kids even hours after birth they can distinguish their mom’s native language from foreign languages.

What that tells us is that experiences kids have had, even when they’re in the womb, really do start to develop their later language skills. So even though kids aren’t able to talk until they’re a little bit farther down the road, it’s still important that we’re giving them all of those rich experiences.

What tools are available for teachers and parents who are unsure of how to get started with this, or feeling a little silly about talking to a baby that’s not responding to them?

At ILABS, one of the things we have done is create an online library of modules. They’re free to access, and each module offers a roughly 20-minute deep dive into a particular area of research. The idea is that we really want to communicate the latest research in child development to caregivers, providers, parents, and families.

Another topic that comes up a lot with this language question is, “My kid’s watching ‘Sesame Street,’ which is for children. What’s wrong with that? Sure, I’m not talking to them, but they’re hearing language.” What do we know about the difference between hearing language on TV or from a computer program and hearing it in person?

There’s been a lot of research lately on how kids learn from screens, when they learn from screens, and the contexts in which they learn from screens. In general, we know that kids don’t seem to learn much from screen media until they’re about 2 1/2 years old. That depends on the type of programming and whether an adult is watching with the child, but in general that seems to be the rule.

After about 2 1/2 years of age, kids seem to be able to learn from screen media, though they still learn better from a live person. Certainly under the age of 2 1/2, kids learn better from natural, live human interaction. Kids really seem to need that live social support in order to learn.

We know that when kids get a little older they really can learn from screen media. We also find that the more interactive the screen media, the better kids are able to learn from it, and the more it encourages social interaction, the better.

For example, there’s been some research looking at kids’ ability to learn from Skype interaction. [As a video conferencing service,] Skype is a screen medium, but it provides that opportunity for a live social interaction. And it turns out that kids are able to learn from video chat technology almost the same way that they are from live interactions. That tells us it’s not necessarily the screen per se that’s preventing kids’ learning; it’s that lack of back-and-forth interaction that kids are able to have in a live human situation.

Is there anything I left out or something new you really want to make sure we know?

A lot of the [brain development] research we’re doing at ILABS points, over and over again, to the importance of early experiences. We know now, more than ever, that early experiences really do shape the physical architecture of a child’s brain. So having quality early experiences is very important.

Photo: Sarah Roseberry Lytle

A version of this news article first appeared in the Early Years blog.