WASHINGTON (AP) - Babies don’t learn to talk just from hearing sounds. New research suggests they’re lip-readers too.
It happens during that magical stage when a baby’s babbling gradually changes from gibberish into syllables and eventually into that first “mama” or “dada.”
Florida scientists discovered that starting around age 6 months, babies begin shifting from the intent eye gaze of early infancy to studying mouths when people talk to them.
“The baby in order to imitate you has to figure out how to shape their lips to make that particular sound they’re hearing,” explains developmental psychologist David Lewkowicz of Florida Atlantic University, who led the study being published Monday. “It’s an incredibly complex process.”
Apparently it doesn’t take them too long to absorb the movements that match basic sounds. By their first birthdays, babies start shifting back to look you in the eye again - unless they hear the unfamiliar sounds of a foreign language. Then, they stick with lip-reading a bit longer.
“It’s a pretty intriguing finding,” says University of Iowa psychology professor Bob McMurray, who also studies speech development. The babies “know what they need to know about, and they’re able to deploy their attention to what’s important at that point in development.”
The new research appears in this week’s issue of the Proceedings of the National Academy of Sciences. It offers more evidence that quality face-time with your tot is very important for speech development - more than, say, turning on the latest baby DVD.
It also raises the question of whether babies who turn out to have developmental disorders, including autism, learn to speak the same way, or whether they show differences that just might provide an early warning sign.
Unraveling how babies learn to speak isn’t merely a curiosity. Neuroscientists want to know how to encourage that process, especially if it doesn’t seem to be happening on time. Plus, it helps them understand how the brain wires itself early in life for learning all kinds of things.
Those coos of early infancy start changing around age 6 months, growing into the syllables of the baby’s native language until the first word emerges, usually just before age 1.
A lot of research has centered on the audio side. That sing-song speech that parents intuitively use? Scientists know the pitch attracts babies’ attention, and the rhythm exaggerates key sounds. Other studies have shown that babies who are best at distinguishing between vowel sounds like “ah” and “ee” shortly before their first birthday wind up with better vocabularies and pre-reading skills by kindergarten.
But scientists have long known that babies also look to speakers’ faces for important social cues about what they’re hearing. Just like adults, they’re drawn to the eyes, which convey important nonverbal messages like the emotion connected to words and where to direct attention.
Lewkowicz went a step further, wondering whether babies look to the lips for cues as well, sort of like how adults lip-read to decipher what someone’s saying at a noisy party.
So he and doctoral student Amy Hansen-Tift tested nearly 180 babies, groups of them at ages 4, 6, 8, 10 and 12 months.
How? They showed videos of a woman speaking in English or Spanish to babies of English speakers. A gadget mounted on a soft headband tracked where each baby was focusing his or her gaze and for how long.