Scientists Reveal How Language Supports Complex Cognitive Processing in the Brain

Valeria Vinogradova, a researcher at HSE University, together with British colleagues, studied how language proficiency affects cognitive processing in deaf adults. The study showed that higher language proficiency—regardless of whether the language is signed or spoken—is associated with higher activity and stronger functional connectivity within the brain network responsible for cognitive task performance. The findings have been published in Cerebral Cortex.

The relationship between language and complex cognitive processes has long been discussed in psychology and neuroscience. Language is not only essential for communication but also supports planning, keeping rules in memory, switching between tasks, and controlling one's own actions. This applies to both spoken and sign languages.

Children of deaf signing parents acquire sign language at the same level and within the same time frame as their hearing peers acquire spoken language. However, many deaf children are born to hearing parents and may not receive sufficient language exposure early in life, during the critical period of language development. Delayed language acquisition can subsequently affect executive functions, leading to a reduced ability to perform tasks such as planning, action control, and maintaining goals in working memory. Studies indicate that these difficulties stem from language background and early language experience rather than from deafness itself.

Valeria Vinogradova, Research Fellow at the HSE Centre for Language and Brain, together with British colleagues, studied how current levels of language proficiency in deaf adults relate to the functioning of large-scale brain networks that support complex cognition. For the first time, the researchers applied a study design making it possible to examine the role of language experience independently of language modality—whether sign or spoken.

The study included data from 24 individuals with congenital or early-onset deafness and 20 hearing adults. The two groups were matched for age, gender, non-verbal reasoning skills, and visuospatial memory span. The deaf participants differed substantially in language experience, providing the necessary variability in language proficiency within the group: some considered English their first language, while others grew up in families that communicate in British Sign Language (BSL). To assess language skills independently of modality, participants completed grammaticality judgment tasks in both English and BSL, and the higher of the two scores served as a modality-independent measure of language proficiency.

Next, participants performed tasks inside an MRI scanner, allowing the researchers to study the brain regions that are involved in task performance. Two experiments were selected for analysis: one assessing working memory and the other planning. Each experiment included two tasks. The first was the main task, which required substantial cognitive effort, while the second served as a control: it was visually similar but did not impose the same cognitive load. In the working memory experiment, participants were required to memorise the locations of objects on the screen and then choose a picture that represented all objects at once. In the planning experiment, they mentally constructed sequences of actions in a computerised version of the classic Tower of London task, which is widely used in psychology to study planning.

The researchers focused on two major brain networks. The first was the task-positive network, which includes frontal and parietal regions that are actively engaged when a person is performing demanding tasks and maintaining attention. The second was the task-negative, or default mode, network, which typically shows reduced activity during task performance and becomes more active when a person is at rest and not engaged in a specific task.

These two systems often operate in opposite phases: when the task-positive network is highly engaged, the task-negative network is typically suppressed. During the experiments, the researchers assessed two parameters: first, the degree to which each network was activated—or, conversely, deactivated—during task performance; and second, the functional connectivity within each network, i.e. how well coordinated its regions were.

Behaviourally, deaf and hearing participants performed the tasks similarly: in the more complex working memory and planning tasks, both groups made more errors than in the easier control tasks. The fMRI data also revealed the expected pattern: during more demanding tasks, the brain engaged the networks responsible for attention and control more strongly, while suppressing the network typically active at rest. Notably, the difference between complex and simple conditions in the default mode, or task-negative network, was more pronounced in hearing participants than in deaf participants.

'What was key for us was the opportunity to work with data from deaf participants. The greater variability in language proficiency within this group allowed us to detect effects that are often less apparent in studies of hearing individuals,' comments Valeria Vinogradova, Research Fellow at the HSE Centre for Language and Brain and one of the study's authors.

The researchers were then able to assess how language proficiency supports performance on non-linguistic tasks. In deaf adults, language skills were associated with brain measures during the working memory task: participants with higher language proficiency showed greater activity and functional connectivity in the regions involved in maintaining information in mind. In the other conditions—including the planning task and the control tasks—no such relationship was observed.

The study results show that language plays a crucial role in shaping and supporting cognitive function. Importantly, it does not matter whether a person is more proficient in sign language or spoken language—language in any modality supports the development of thinking.

The authors also emphasise that cognitive skills may be shaped by other environmental factors, such as socio-economic conditions, early interactions with parents, access to education, and the extent of a child’s engagement in communication. The findings of the paper show that functional MRI studies can detect these influences even when they are not evident behaviourally.

In the future, the researchers plan to continue studying individuals of different ages and with diverse language profiles. This work will help better understand how language, cognition, and sensory experience interact across the lifespan, and the role that early developmental conditions play in shaping these processes.
