We often think of babies as blank canvases with little ability to learn during the first few weeks of life. But babies actually start processing language and speech incredibly early. Even while in the womb, they learn to discern voices, along with some speech sounds. At birth, they already prefer speech sounds over non-language sounds.
But exactly how the baby brain learns to process complex language sounds is still a bit of a mystery. In our recent study, published in Nature Human Behaviour, we uncovered details of this mindbogglingly speedy learning process – starting in the first few hours after birth.
We collaborated with a neonatal research team in China, who fitted babies’ heads with a small cap covered in sophisticated light-emitting devices designed to measure tiny changes in oxygen levels in the babies’ brains. Detectors in the cap helped us determine which areas of the brain were active over time.
The procedure, which is entirely safe and painless, was carried out within three hours of the babies being born. It only required the baby to wear a small elastic cap while minute infrared lights (essentially heat radiation) were shone through the head. This fits with the common practice in many cultures of wrapping newborns in a close-fitting blanket to pacify them – easing the transition from the comfort of the womb to the wild world of autonomous physical existence.
Within three hours of being born, all babies were exposed to pairs of sounds that most researchers would predict they should be able to distinguish. These included vowels (such as “o”) and the same vowels played backwards. Usually, reversed speech sounds very different from normal (forward) speech, but in the case of isolated vowels the difference is subtle. In fact, in our study, we found that adult listeners could tell the two versions apart only 70% of the time.
What surprised us is that newborns failed to differentiate between forwards and backwards vowels immediately after birth: we found no difference between the brain signals collected for each sound in the first three hours after birth. In hindsight, we should not have been so surprised, considering how subtle the difference was.
However, we were stunned to discover that after listening to these sounds for five hours, newborns started differentiating between the forwards and backwards vowels. First, their response to forwards vowels became faster than to backwards vowels. And after a further two hours, during which they mostly slept, their brains responded to forwards vowels not only faster but also more strongly than those of babies trained with different vowels or babies who remained in silence.
This means that on the first day of life, it takes the baby’s brain only a few hours to learn the subtle difference between natural and slightly unnatural speech sounds.
We were further able to see that brain regions of the superior temporal lobe (a part of the brain associated with auditory processing) and of the frontal cortex (involved in planning complex movements) were involved in processing the vowel sounds, especially in the left hemisphere. That’s similar to the pattern that underpins language comprehension and production in adults.
And even more fascinating, we were able to detect cross-talk (communication between different brain areas) between these regions in both groups of baby participants that were exposed to speech sounds, but not in those who had not experienced any training. In other words, neurons of the trained babies were having a “conversation” across the brain in a way that was not seen in babies who remained in silence during the same period.
Newborns probably benefit directly from being talked to from the very first moments after they leave the womb. Clearly, “nurture” – the changing of the mind by the environment – starts on day one.
Babies aren’t pre-programmed
We can also consider these findings in the context of a trendy concept in neuroscience today, namely embodiment theory. Embodiment is the idea that our thoughts and mental operations are not pre-programmed, nor do they operate mysteriously from some inherited genetic code; rather, they build upon direct experience of the world around us, through the sensory channels that start operating from birth, such as hearing, seeing, tasting, smelling and touching.
Even though our brain’s organisation and function, defined by the genetic code inherited from our parents, give it a predisposition to learn, it is also able to sense the environment from the moment of birth, and this immediately helps shape our internal representations of the world around us.
I would suggest that you not only talk to your baby but also share with them all sorts of sensory experiences of the world as soon as they are in your arms – be it exposing them to music, letting them smell flowers or showing them objects or views they’ve never seen before. By encouraging more varied experiences, you give the baby brain new avenues to grow and develop, and probably greater creative abilities in the future.