Research Roundup: New information on infant language acquisition


Researchers calculate how much data humans must store to learn English

To learn to speak English, humans must absorb approximately 12.5 million bits of information from infancy to young adulthood, according to calculations in a new study.

The researchers used a branch of mathematics called information theory to calculate how much data is associated with each component of linguistic knowledge, from the meanings of individual words to the syntax of sentences.

“Ours is the first study to put a number on the amount you have to learn to acquire language,” said study senior author Steven Piantadosi, an assistant professor of psychology at UC Berkeley. “It highlights that children and teens are remarkable learners, absorbing upwards of 1,000 bits of information each day.”
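As a rough back-of-the-envelope check (not the authors' actual model), the daily figure follows from simple arithmetic, and information theory gives a feel for where the bits come from: picking out one item among N equally likely possibilities carries log2(N) bits. The sketch below assumes an 18-year learning window and a 40,000-word vocabulary purely for illustration.

```python
import math

# Back-of-the-envelope check of the figures quoted above (illustrative only;
# the study's actual accounting is far more detailed).
total_bits = 12.5e6          # total linguistic knowledge, per the study
years = 18                   # infancy through young adulthood (assumed span)
days = years * 365

bits_per_day = total_bits / days
print(f"~{bits_per_day:.0f} bits per day")   # roughly 1,900 -- "upwards of 1,000"

# Identifying one word form in a hypothetical 40,000-word vocabulary takes
# about log2(40,000) bits before any meaning is even attached to it.
vocab_size = 40_000
print(f"~{math.log2(vocab_size):.1f} bits to specify one word")  # ~15.3 bits
```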

The team found that most of the linguistic knowledge humans learn is about the meaning of words, rather than grammar and syntax.

“This really highlights a difference between machine learners and human learners,” Piantadosi said. “Machines know what words go together and where they go in sentences, but know very little about the meaning of words.”

Learn more at Sciencedaily.com or read the study in the Royal Society Open Science journal.

Infants’ advances in speech perception shape their earliest links between language and cognition

A new study from Northwestern University offers fresh insights into how infants learn to link language with cognition.

The researchers played foreign languages to English-speaking infants while presenting them with familiar and unfamiliar objects. They measured how long the infants looked at each object to gauge whether hearing the language helped them form categories of the objects.
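Looking-time data of this kind are often summarized as a preference score comparing attention to the novel versus the familiar object. The sketch below is a generic version of that measure, not necessarily the analysis this study used.

```python
# Generic sketch of a looking-time preference score, a common summary in
# infant categorization studies (the study's own analysis may differ).
def novelty_preference(novel_ms: float, familiar_ms: float) -> float:
    """Proportion of total looking time directed at the novel object.

    0.5 means no preference; values above 0.5 suggest the infant has formed
    a category for the familiar objects and treats the new one as different.
    """
    total = novel_ms + familiar_ms
    return novel_ms / total if total > 0 else 0.5

# Example: 4.2 s on the novel shape vs. 2.8 s on the familiar one -> 0.6
print(novelty_preference(4200, 2800))
```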

“We found that German, which is phonologically ‘near’ to English, facilitated object categorization. But Cantonese, which is phonologically ‘distant,’ did not,” said Danielle Perszyk, lead author of the study.

These findings indicate that the 3- to 4-month-old infants had already begun to tune in to the sounds of their native language, the researchers said, and were setting constraints on which language sounds would be linked to cognition.

“At 3 and 4 months, this link is not exclusive to human language: listening to vocalizations of nonhuman primates also supports infant cognition. By 6 months, infants have tuned this link to human speech alone. This study provides evidence that infants’ increasing precision in speech perception shapes which signals they will link to cognition,” the authors wrote.

Read about the study at Northwestern.edu or access the research at Nature.com.

Toddlers learn language better in predictable situations

Young children learn language more easily in predictable environments, according to a study by researchers at Arizona State University.

The university’s website reports:

“During the experiment, toddlers sat on their parent’s lap in front of a large screen. The screen showed four closed boxes, one in each corner of the screen. Inside the boxes were pictures of novel and unfamiliar shapes. In the first part of the experiment, the boxes opened one at a time and always in the same order. The sequence of box openings was predictable, but the object inside the box was not.

After five repetitions of the boxes opening and closing in the same order, the researchers started giving the objects names. Like the objects, the names were also novel, such as “pisk,” “bosa” or “tulver.” The children heard the name of the object after the box opened and their gaze was fixed on that box. The researchers tracked where the children were looking with a special camera that was mounted beneath the screen.

On half of the trials when the children heard names for the objects, the boxes opened in the expected order. On the other half of the object-naming trials, a box would open out of order.”

“This experimental design separated the content of what the children were learning from the predictability of the situation,” said Viridiana Benitez, assistant professor of psychology and lead author of the study. “We were able to show that just predicting something, like when and where to look, had a cascade effect on learning something new.”
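As a rough illustration of that design, the trial schedule might be laid out as below. This is a hypothetical reconstruction from the description quoted above, not the study’s actual materials; counts and labels beyond those quoted are assumptions.

```python
# Hypothetical reconstruction of the trial schedule described above;
# counts and labels beyond those quoted are assumptions.
corners = ["top-left", "top-right", "bottom-left", "bottom-right"]
fixed_order = corners[:]                      # boxes always open in this order

# Phase 1: five repetitions of the predictable opening sequence, no labels.
familiarization = [fixed_order[:] for _ in range(5)]

# Phase 2: object-naming trials. Half keep the expected order; in the other
# half, one box opens out of turn.
names = ["pisk", "bosa", "tulver", "modi"]    # "modi" is an invented filler
naming_trials = []
for i, name in enumerate(names):
    order = fixed_order[:]
    if i % 2 == 1:                            # every other trial violates the order
        order[0], order[1] = order[1], order[0]
    naming_trials.append({"name": name, "opening_order": order})

for trial in naming_trials:
    print(trial["name"], trial["opening_order"])
```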

The findings are helpful for parents and teachers to know about, said Linda Smith, a professor at Indiana University.

“We have known for some time that infants and children are amazing statistical learners,” Smith said. “The findings from this study have important implications for both education and parenting. Regularities in the everyday lives of children and in the classroom can support learning.”

Read more at Asunow.asu.edu or in Current Biology.

LENA Team

The LENA Team is a dedicated group of professionals who are passionate about increasing awareness of the importance of early interactive talk. We are statisticians, speech-language pathologists, curriculum specialists, engineers, and linguists.
