Russian Scientists Reconstruct Dynamics of Brain Neuron Model Using Neural Network
Researchers from HSE University in Nizhny Novgorod have shown that a neural network can reconstruct the dynamics of a brain neuron model from just a single set of measurements, such as recordings of its electrical activity. The neural network they developed was trained to reconstruct the system's full dynamics and predict its behaviour under changing conditions. This method makes it possible to investigate complex biological processes even when not all of the necessary measurements are available. The study has been published in Chaos, Solitons & Fractals.
Neurons are cells that enable the brain to process information and transmit signals. They communicate through electrical impulses, which either activate neighbouring neurons or slow them down. Each neuron is enclosed by a membrane; charged particles, known as ions, pass through channels in this membrane, generating electrical impulses.

Mathematical models are used to study how neurons function. These models are often based on the Hodgkin-Huxley approach, which allows for the construction of relatively simple models but requires a large number of parameters and calculations. To predict a neuron's behaviour, several quantities are typically measured, including the membrane voltage, ion currents, and the state of the cell's ion channels. Researchers from HSE University and the Saratov Branch of the Kotelnikov Institute of Radioengineering and Electronics of the Russian Academy of Sciences have demonstrated that it is possible to track changes in a single measured quantity—the neuron's membrane electrical potential—and use a neural network to reconstruct the missing data.
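For context, the core of the classic Hodgkin-Huxley description (a standard textbook form, not reproduced from the study itself) is a current-balance equation for the membrane potential V, coupled to gating variables m, h and n that describe the sodium and potassium channels:

C_m \frac{dV}{dt} = I_{ext} - \bar{g}_{Na}\, m^3 h\,(V - E_{Na}) - \bar{g}_{K}\, n^4 (V - E_{K}) - g_L (V - E_L), \qquad \frac{dx}{dt} = \alpha_x(V)(1 - x) - \beta_x(V)\, x, \quad x \in \{m, h, n\}.

Every conductance, reversal potential, and gating variable here is a separate quantity that must be measured or fitted, which is why the traditional approach demands so much data; the method described below relies only on the recorded voltage V(t).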
The proposed method consisted of two steps. First, changes in a neuron's potential over time were analysed. This data was then fed into a neural network—a variational autoencoder—which identified key patterns, discarded irrelevant information, and produced a compact set of characteristics describing the neuron's state. Second, a different type of neural network—a mapping network—used these characteristics to predict the neuron's future behaviour. The neural network effectively took on the role of a Hodgkin-Huxley model, but instead of relying on complex equations, it was trained directly on the data.
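A minimal, purely illustrative sketch of this two-step idea is shown below in PyTorch. It assumes the recorded voltage trace is cut into fixed-length windows; the window length, number of latent characteristics, layer sizes, and the random stand-in data are assumptions made for illustration and are not taken from the paper.

# Sketch of the two-step idea: a variational autoencoder compresses a window
# of the membrane-voltage trace into a few latent "characteristics", and a
# separate mapping network advances those characteristics one step in time.
# All sizes and the synthetic data below are illustrative assumptions.
import torch
import torch.nn as nn

WINDOW = 64   # samples of V(t) per window (assumption)
LATENT = 3    # number of latent characteristics (assumption)

class VAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(WINDOW, 128), nn.Sigmoid(),
                                 nn.Linear(128, 2 * LATENT))  # mean and log-variance
        self.dec = nn.Sequential(nn.Linear(LATENT, 128), nn.Sigmoid(),
                                 nn.Linear(128, WINDOW))

    def forward(self, v):
        mu, logvar = self.enc(v).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation
        return self.dec(z), mu, logvar

class MappingNet(nn.Module):
    """Predicts the latent characteristics at the next step from the
    current ones plus a scalar control parameter."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT + 1, 64), nn.Sigmoid(),
                                 nn.Linear(64, LATENT))

    def forward(self, z, control):
        return self.net(torch.cat([z, control], dim=-1))

# Toy usage with random data standing in for a recorded voltage trace.
vae, mapper = VAE(), MappingNet()
v_now = torch.randn(8, WINDOW)        # batch of voltage windows
control = torch.full((8, 1), 0.5)     # the "rotating switch" parameter

recon, mu, logvar = vae(v_now)        # step 1: extract characteristics
z_next = mapper(mu, control)          # step 2: predict the next state
v_next = vae.dec(z_next)              # decode back to a voltage window

In a real setting, both networks would be trained on the recorded trace (reconstruction plus prediction losses), after which the control parameter can be varied to explore regimes the model was never shown.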

'With the advancement of mathematical and computational methods, traditional approaches are being revisited, which not only helps improve them but can also lead to new discoveries. Models reconstructed from data are typically based on low-order polynomial equations, such as the 4th or 5th order. These models have limited nonlinearity, meaning they cannot describe highly complex dependencies without increasing the error,' explains Pavel Kuptsov, Leading Research Fellow at the Faculty of Informatics, Mathematics, and Computer Science of HSE University in Nizhny Novgorod. 'The new method uses neural networks in place of polynomials. Their nonlinearity is governed by sigmoids, smooth functions ranging from 0 to 1, which correspond to polynomial equations (Taylor series) of infinite order. This makes the modelling process more flexible and accurate.'
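To illustrate the point about infinite order (a standard expansion, not one quoted from the study), the logistic sigmoid used in such networks has a Taylor series that never terminates:

\sigma(x) = \frac{1}{1 + e^{-x}} = \frac{1}{2} + \frac{x}{4} - \frac{x^3}{48} + \frac{x^5}{480} - \cdots

so a model built from sigmoids is not restricted to a fixed polynomial order, unlike a 4th- or 5th-order fit.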
Typically, a complete set of parameters is required to simulate a complex system, but obtaining such a set in real-world conditions can be challenging. In experiments, especially in biology and medicine, data is often incomplete or noisy. The scientists demonstrated that their neural network-based approach makes it possible to reconstruct missing values and predict the system's behaviour even from a limited amount of data.
'We take just one series of data, a single example of behaviour, train a model on it, and incorporate a control parameter into it. Imagine it as a rotating switch that can be turned to observe different behaviours. After training, if we start adjusting the switch—i.e., changing this parameter—we will see that the model reproduces various types of behaviour that are characteristic of the original system,' explains Pavel Kuptsov.
During the simulation, the neural network not only replicated the system modes it was trained on but also identified new ones. One of these involves the transition from a series of frequent pulses to single bursts. Such transitions occur when the parameters change, yet the neural network detected them independently, without having seen such examples in the data it was trained on. This means that the neural network does not just memorise examples; it actually recognises hidden patterns.
'It is important that the neural network can identify new patterns in the data,' says Natalya Stankevich, Leading Research Fellow at the Faculty of Informatics, Mathematics, and Computer Science of HSE University in Nizhny Novgorod. 'It identifies connections that are not explicitly represented in the training sample and draws conclusions about the system's behaviour under new conditions.'
The neural network is currently operating on computer-generated data. In the future, the researchers plan to apply it to real experimental data. This opens up opportunities for studying complex dynamic processes where it is impossible to anticipate all potential scenarios in advance.
The study was carried out as part of HSE University's Mirror Laboratories project and supported by a grant from the Russian Science Foundation.
See also:
'Our Goal Is Not to Determine Which Version Is Correct but to Explore the Variability'
The International Linguistic Convergence Laboratory at the HSE Faculty of Humanities studies the processes of convergence among languages spoken in regions with mixed, multiethnic populations. Research conducted by linguists at HSE University contributes to understanding the history of language development and explores how languages are perceived and used in multilingual environments. George Moroz, head of the laboratory, shares more details in an interview with the HSE News Service.
Slim vs Fat: Overweight Russians Earn Less
Overweight Russians tend to earn significantly less than their slimmer counterparts, with a 10% increase in body mass index (BMI) associated with a 9% decrease in wages. These are the findings of Anastasiia Deeva, lecturer at the HSE Faculty of Economic Sciences and intern researcher at the Laboratory of Economic Research in Public Sector. The article has been published in Voprosy Statistiki.
Scientists Reveal Cognitive Mechanisms Involved in Bipolar Disorder
An international team of researchers including scientists from HSE University has experimentally demonstrated that individuals with bipolar disorder tend to perceive the world as more volatile than it actually is, which often leads them to make irrational decisions. The scientists suggest that their findings could lead to the development of more accurate methods for diagnosing and treating bipolar disorder in the future. The article has been published in Translational Psychiatry.
Scientists Develop AI Tool for Designing Novel Materials
An international team of scientists, including researchers from HSE University, has developed a new generative model called the Wyckoff Transformer (WyFormer) for creating symmetrical crystal structures. The neural network will make it possible to design materials with specified properties for use in semiconductors, solar panels, medical devices, and other high-tech applications. The scientists will present their work at ICML, a leading international conference on machine learning, on July 15 in Vancouver. A preprint of the paper is available on arxiv.org, with the code and data released under an open-source license.
HSE Linguists Study How Bilinguals Use Phrases with Numerals in Russian
Researchers at HSE University analysed over 4,000 examples of Russian spoken by bilinguals for whom Russian is a second language, collected from seven regions of Russia. They found that most non-standard numeral constructions are influenced not only by the speakers' native languages but also by how frequently these expressions occur in everyday speech. For example, common phrases like 'two hours' or 'five kilometres' almost always match the standard literary form, while less familiar expressions—especially those involving the numerals two to four or collective forms like dvoe and troe (used for referring to people)—often differ from the norm. The study has been published in Journal of Bilingualism.
Overcoming Baby Duck Syndrome: How Repeated Use Improves Acceptance of Interface Updates
Users often prefer older versions of interfaces due to a cognitive bias known as the baby duck syndrome, where their first experience with an interface becomes the benchmark against which all future updates are judged. However, an experiment conducted by researchers from HSE University produced an encouraging result: simply re-exposing users to the updated interface reduced the bias and improved their overall perception of the new version. The study has been published in Cognitive Processing.
Mathematicians from HSE Campus in Nizhny Novgorod Prove Existence of Robust Chaos in Complex Systems
Researchers from the International Laboratory of Dynamical Systems and Applications at the HSE Campus in Nizhny Novgorod have developed a theory that enables a mathematical proof of robust chaotic dynamics in networks of interacting elements. This research opens up new possibilities for exploring complex dynamical processes in neuroscience, biology, medicine, chemistry, optics, and other fields. The findings have been accepted for publication in Physical Review Letters, a leading international journal, and are available on arXiv.org.
Mathematicians from HSE University–Nizhny Novgorod Solve 57-Year-Old Problem
In 1968, American mathematician Paul Chernoff proposed a theorem that allows for the approximate calculation of operator semigroups, complex but useful mathematical constructions that describe how the states of multiparticle systems change over time. The method is based on a sequence of approximations—steps which make the result increasingly accurate. But until now it was unclear how quickly these steps lead to the result and what exactly influences this speed. This problem has been fully solved for the first time by mathematicians Oleg Galkin and Ivan Remizov from the Nizhny Novgorod campus of HSE University. Their work paves the way for more reliable calculations in various fields of science. The results were published in the Israel Journal of Mathematics (Q1).
Large Language Models No Longer Require Powerful Servers
Scientists from Yandex, HSE University, MIT, KAUST, and ISTA have made a breakthrough in optimising LLMs. Yandex Research, in collaboration with leading science and technology universities, has developed a method for rapidly compressing large language models (LLMs) without compromising quality. Now, a smartphone or laptop is enough to work with LLMs—there's no need for expensive servers or high-powered GPUs.
AI to Enable Accurate Modelling of Data Storage System Performance
Researchers at the HSE Faculty of Computer Science have developed a new approach to modelling data storage systems based on generative machine learning models. This approach makes it possible to accurately predict the key performance characteristics of such systems under various conditions. Results have been published in the IEEE Access journal.