Real-life speech production and perception have a shared premotor-cortical substrate

The magenta geometric figures mark the parts of the frontal brain surface that were active during both speech production and perception in all subjects. The other symbols depict further functional regions: the body-part symbols in red and in blue mark, respectively, motor and sensory areas of the left pericentral cortex, and the yellow symbol marks an area essential for speech production.

Communicating through spoken language is fundamental to our everyday social life. How do our brains enable this remarkable capability?

Experimental research in the neuroscience of language has made considerable progress over the last decades. However, little is still known about the neural basis of language in non-experimental, real-life conditions. A new article on this topic appeared this month in Scientific Reports, a journal of the Nature series: "Real-life speech production and perception have a shared premotor-cortical substrate," authored by an interdisciplinary team of researchers from Freiburg: Olga Glanz (Iljina), Johanna Derix, Rajbir Kaur, Andreas Schulze-Bonhage, Peter Auer, Ad Aertsen, and Tonio Ball.

In this article, the researchers address a long-standing controversy in the neurobiology of language: whether or not the parts of the brain surface involved in articulation are also active when we listen to speech. For decades, scholars have been divided into two camps on this matter: some observe such activation and believe that regions involved in articulation are essential for language comprehension. Others do not find such activation and, accordingly, argue that it is exotic, if it exists at all. Curiously, both camps agree that the absence or presence of such activity can be due to the lack of ecological validity of neuroscientific experiments. Observations of neuronal activity as it occurs in real life are therefore needed.

In a unique design that takes neuroscientific research into non-experimental, out-of-the-lab conditions of human communication, Glanz and colleagues were able, for the first time, to address these concerns. They analysed speech and the accompanying neuronal data collected via intracranial EEG recordings during real-life conversations.

The researchers showed that articulatory regions are indeed robustly activated during real-life speech perception. Interestingly, this activity was present during the perception of speech, but not when the subjects were listening to acoustic noise. The "speaking" parts of the brain are thus also doing something when we listen to speech. The figure above illustrates this part of the brain identified by Glanz and colleagues.

The full paper is available at: https://rdcu.be/VDs2