
Babies are natural learners, constantly observing and processing information about the world around them. If you’ve ever noticed an infant staring at you while you speak, it’s not just curiosity—they are actively absorbing how speech is formed.
A new study published in Developmental Science reveals that this process begins as early as four months of age. Researchers previously believed that infants only started refining their ability to distinguish language sounds between six and twelve months. This finding suggests that babies can recognize how different speech sounds are physically produced much earlier than expected.
This discovery may have significant implications for early intervention in children at risk of speech or language delays.
A head start in language processing
By their first birthday, babies begin fine-tuning their perception of language sounds—a process known as perceptual attunement—to focus primarily on the sounds of their native language.
However, this study shows that infants as young as four months can differentiate between sounds from languages they have never been exposed to. For instance, an infant from an English-speaking home might recognize subtle Hindi or Mandarin sound contrasts that would be challenging for an adult English speaker to detect.
This ability gradually diminishes between six and twelve months as babies become more attuned to the language they hear most often.
Until now, researchers believed this refinement had to happen before babies could pick up more complex linguistic patterns, such as grouping consonants by how they are produced, for example the lip sound "b" versus the tongue-tip sound "d." However, this study indicates that even at four months, babies are already developing an understanding of how speech sounds are formed.
The mini-language experiment
To explore this early cognitive ability, researchers conducted an experiment involving 34 infants between four and six months old. Parents provided consent for their children to participate in the study, which used two invented mini-languages.
One language contained words using lip-based sounds, such as “b” and “v,” while the other used tongue-tip sounds, like “d” and “z.” These words, including examples like bivawo and dizalo, were paired with images—a jellyfish for lip-based words and a crab for tongue-tip words. Babies were presented with a recorded word alongside its corresponding image.
Since infants cannot verbally express what they understand, researchers relied on visual associations to assess learning. The cartoon images helped determine whether the babies could connect specific speech sounds with certain visual cues.
After the initial learning phase, the experiment took an unexpected turn. Instead of hearing the words, babies were shown silent videos of a person speaking new words from the same mini-languages. In some cases, the facial movements matched the previously learned associations; in others, they did not.
By tracking how long the infants looked at each video—a widely used method in developmental psychology—researchers found that babies spent more time watching videos where the face matched what they had previously learned. This indicated they were not just passively listening but were actively learning and linking speech sounds with their corresponding visual movements.
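To make the logic of that looking-time comparison concrete, here is a minimal Python sketch using invented numbers rather than the study's data. It assumes each of the 34 infants contributes an average looking time for matching and for mismatching silent-video trials, and it simply builds in the pattern the researchers reported (longer looking at matches) to show how the paired comparison works; it is an illustration, not the authors' analysis.

```python
# Hypothetical looking-time comparison (illustration only, not the study's analysis).
# Assumption: each infant has one average looking time per condition, in seconds.
from statistics import mean
from random import gauss, seed

seed(0)

N_INFANTS = 34  # sample size reported in the article

# Simulated data: looking times for "matching" trials, then "mismatching" trials
# generated with a small built-in disadvantage, mirroring the reported pattern.
match_times = [max(0.5, gauss(8.0, 1.5)) for _ in range(N_INFANTS)]
mismatch_times = [max(0.5, t - gauss(1.0, 0.8)) for t in match_times]

# Paired difference per infant: positive means longer looking at the matching face.
differences = [m - mm for m, mm in zip(match_times, mismatch_times)]

print(f"Mean looking time, match:    {mean(match_times):.2f} s")
print(f"Mean looking time, mismatch: {mean(mismatch_times):.2f} s")
print(f"Mean paired difference:      {mean(differences):.2f} s")
print(f"Infants looking longer at matches: {sum(d > 0 for d in differences)}/{N_INFANTS}")
```

In a real study, the per-infant differences would then be tested statistically; the sketch only shows why a consistent positive difference is read as evidence that the babies linked the sounds to the matching facial movements.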
Laying the foundation for language development
These findings suggest that, even before they begin fine-tuning their understanding of native-language sounds, infants can connect auditory and visual speech cues. This early pattern recognition forms the foundation for later language development, potentially influencing how they produce and comprehend words as they grow.
The study also raises exciting new questions:
- Can infants at this age differentiate even more subtle speech contrasts, such as voiced versus unvoiced sounds (e.g., “b” versus “p”)?
- How does growing up in a bilingual environment impact this early ability?
- Could this skill help babies recognize patterns in entirely new languages later in life?
By further investigating these questions, researchers hope to uncover deeper insights into how human language learning begins—offering new possibilities for early childhood education and speech development interventions.
For now, one thing is clear: long before they say their first words, babies are already hard at work decoding the sounds of the world around them.