Rulkov map

[Figure: Time series of the Rulkov map showing three different dynamical regimes]

The Rulkov map is a two-dimensional iterated map used to model a biological neuron. It was proposed by Nikolai F. Rulkov in 2001. [1] Using this map to study neural networks has computational advantages: the map is easier to iterate than a continuous dynamical system, which saves memory and simplifies the computation of large neural networks.


The model

The Rulkov map, with n as discrete time, can be represented by the following dynamical equations:

x_{n+1} = α / (1 + x_n²) + y_n
y_{n+1} = y_n − μ (x_n − σ)

where x_n represents the membrane potential of the neuron. The variable y_n in the model is a slow variable due to the very small value of the parameter μ (0 < μ ≪ 1). Unlike the variable x_n, the variable y_n does not have an explicit biological meaning, though some analogy to gating variables can be drawn. [2] The parameter σ can be thought of as an external dc current given to the neuron, and α is a nonlinearity parameter of the map. Different combinations of the parameters σ and α give rise to different dynamical states of the neuron, such as resting, tonic spiking, and chaotic bursting. Chaotic bursting sets in for values of α above approximately 4.
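The iteration of the map (in its standard form x_{n+1} = α/(1 + x_n²) + y_n, y_{n+1} = y_n − μ(x_n − σ)) can be sketched in a few lines of Python; the parameter values below are illustrative choices, not prescriptions from the text:

```python
import numpy as np

def rulkov(alpha, sigma, mu=0.001, x0=-1.0, y0=-3.0, n_steps=10000):
    """Iterate the Rulkov map; return the fast (x) and slow (y) time series."""
    x = np.empty(n_steps)
    y = np.empty(n_steps)
    x[0], y[0] = x0, y0
    for n in range(n_steps - 1):
        x[n + 1] = alpha / (1.0 + x[n] ** 2) + y[n]  # fast subsystem
        y[n + 1] = y[n] - mu * (x[n] - sigma)        # slow subsystem
    return x, y

# Illustrative run; which regime appears depends on (alpha, sigma).
x_trace, y_trace = rulkov(alpha=4.1, sigma=-1.0)
```

Because y_n changes by at most μ per step, long runs are needed to see the slow bursting envelope develop on top of the fast spiking in x_n.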

Analysis

The dynamics of the Rulkov map can be analyzed by studying its one-dimensional fast submap. Since the variable y_n evolves very slowly, over moderate stretches of time it can be treated as a parameter with a constant value y in the evolution equation for x_n (this is called the one-dimensional fast submap, because x_n is a fast variable compared with y_n). Depending on the value of y, this submap can have either one or three fixed points. Of these fixed points, one is stable, another is unstable, and the third may change its stability. [3] As y increases, two of these fixed points (the stable one and an unstable one) merge and disappear via a saddle-node bifurcation.
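The fixed-point structure of the fast submap x ↦ α/(1 + x²) + y can be checked numerically. A fixed point satisfies x = α/(1 + x²) + y, i.e. the cubic x³ − yx² + x − (y + α) = 0, and it is stable when |f′(x)| < 1 with f′(x) = −2αx/(1 + x²)². A minimal sketch:

```python
import numpy as np

def fast_submap_fixed_points(alpha, y):
    """Real fixed points of the fast submap x -> alpha/(1+x^2) + y.

    Roots of x^3 - y*x^2 + x - (y + alpha) = 0, classified as stable
    when the multiplier |f'(x)| = |2*alpha*x / (1+x^2)^2| is below 1.
    """
    roots = np.roots([1.0, -y, 1.0, -(y + alpha)])
    real = roots[np.abs(roots.imag) < 1e-9].real
    fps = []
    for x in sorted(real):
        slope = -2.0 * alpha * x / (1.0 + x * x) ** 2
        fps.append((x, "stable" if abs(slope) < 1.0 else "unstable"))
    return fps
```

For example, at α = 4 the submap has three fixed points for y = −3 (the leftmost one stable) but only one for y = 0, consistent with the saddle-node scenario described above.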

Coupling

Coupling of two neurons has been investigated by Irina Bashkirtseva and Alexander Pisarchik, who explored transitions between stationary, periodic, quasiperiodic, and chaotic regimes. [4] They also addressed the consequences of random disturbances in this system, which lead to noise-induced transitions between periodic and chaotic stochastic oscillations. [5]
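One common way to couple two Rulkov neurons is diffusively through the fast variables; the sketch below uses this scheme with a coupling strength eps, which is an illustrative assumption and not necessarily the coupling form used in the cited studies:

```python
import numpy as np

def coupled_rulkov(alpha, sigma, eps, mu=0.001, n_steps=5000):
    """Two Rulkov neurons with diffusive coupling in the fast equations.

    eps is the coupling strength (illustrative scheme; the specific
    coupling studied by Bashkirtseva and Pisarchik may differ).
    """
    x = np.zeros((n_steps, 2))
    y = np.zeros((n_steps, 2))
    x[0] = [-1.0, -1.2]   # slightly detuned initial states
    y[0] = [-3.0, -3.0]
    for n in range(n_steps - 1):
        for i in (0, 1):
            j = 1 - i
            x[n + 1, i] = (alpha / (1.0 + x[n, i] ** 2) + y[n, i]
                           + eps * (x[n, j] - x[n, i]))  # coupling term
            y[n + 1, i] = y[n, i] - mu * (x[n, i] - sigma)
    return x, y
```

Sweeping eps (and adding noise to the fast equations) is the kind of experiment used to probe the regime transitions mentioned above.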

Other applications

Adaptations of the Rulkov map have found applications in labor and industrial economics, particularly in corporate dynamics. [6] The proposed framework leverages synchronization and chaos regularization to account for dynamic transitions among multiple equilibria, incorporate skewness and idiosyncratic elements, and reveal the influence of effort on corporate profitability. The results are substantiated through empirical validation with real-world data. [6] Orlando and Bufalo introduced a deterministic model based on the Rulkov map [7] that effectively captures volatility fluctuations in corporate yields and spreads, even during distressed periods such as COVID-19. When compared with the ARIMA-EGARCH model, which is designed to handle various aspects of volatility, the two models yield comparable results; nevertheless, the deterministic nature of the Rulkov map model may provide greater explanatory power. [8]

Other applications of the Rulkov map include memristors, [9] [10] financial markets, [11] [12] biological systems, [13] etc.

See also

Butterfly effect
Chaos theory
Complex system
Nonlinear system
Artificial neuron
Takens's theorem
Hopfield network
Granger causality
Chua's circuit
Bursting
Period-doubling bifurcation
Memristor
Biological neuron model
Hindmarsh–Rose model
Coupled map lattice
Dynamical systems in neuroscience
Rectifier (neural networks)
Exponential integrate-and-fire
Phase reduction
Chialvo map

References

  1. Rulkov, Nikolai F. (2002). "Modeling of spiking-bursting neural behavior using two-dimensional map". Physical Review E. 65 (4): 041922. doi:10.1103/PhysRevE.65.041922.
  2. Franović, Igor; Miljković, Vladimir (2011). "The effects of synaptic time delay on motifs of chemically coupled Rulkov model neurons". Communications in Nonlinear Science and Numerical Simulation. 16 (2): 623–633. Bibcode:2011CNSNS..16..623F. doi:10.1016/j.cnsns.2010.05.007.
  3. Rulkov, N.F. (2001). "Regularization of Synchronized Chaotic Bursts". Physical Review Letters. 86 (1): 183–186. arXiv:nlin/0011028. Bibcode:2001PhRvL..86..183R. doi:10.1103/physrevlett.86.183. PMID 11136124. S2CID 7016788.
  4. "Variability and effect of noise on the corporate…". AIP Conference Proceedings. https://pubs.aip.org/aip/acp/article/2172/1/070004/598377/Variability-and-effect-of-noise-on-the-corporate. Retrieved 2023-10-01.
  5. "Variability and effect of noise on the corporate…". AIP Conference Proceedings. https://pubs.aip.org/aip/acp/article/2172/1/070004/598377/Variability-and-effect-of-noise-on-the-corporate. Retrieved 2023-10-01.
  6. Orlando, Giuseppe (2022-06-01). "Simulating heterogeneous corporate dynamics via the Rulkov map". Structural Change and Economic Dynamics. 61: 32–42. doi:10.1016/j.strueco.2022.02.003. ISSN 0954-349X.
  7. Orlando, Giuseppe; Bufalo, Michele (2022-06-01). "Modelling bursts and chaos regularization in credit risk with a deterministic nonlinear model". Finance Research Letters. 47: 102599. doi:10.1016/j.frl.2021.102599. ISSN 1544-6123.
  8. Orlando, Giuseppe; Bufalo, Michele (2022-06-01). "Modelling bursts and chaos regularization in credit risk with a deterministic nonlinear model". Finance Research Letters. 47: 102599. doi:10.1016/j.frl.2021.102599. ISSN 1544-6123.
  9. Lu, Yanmei; Wang, Chunhua; Deng, Quanli (2022-10-02). "Rulkov neural network coupled with discrete memristors". Network: Computation in Neural Systems. 33 (3–4): 214–232. doi:10.1080/0954898X.2022.2131921. ISSN 0954-898X. PMID 36200906. S2CID 252736680.
  10. Li, Kexin; Bao, Bocheng; Ma, Jun; Chen, Mo; Bao, Han (2022-12-01). "Synchronization transitions in a discrete memristor-coupled bi-neuron model". Chaos, Solitons & Fractals. 165: 112861. Bibcode:2022CSF...16512861L. doi:10.1016/j.chaos.2022.112861. ISSN 0960-0779. S2CID 253704669.
  11. Orlando, Giuseppe; Bufalo, Michele; Stoop, Ruedi (2022-02-01). "Financial markets' deterministic aspects modeled by a low-dimensional equation". Scientific Reports. 12 (1): 1693. Bibcode:2022NatSR..12.1693O. doi:10.1038/s41598-022-05765-z. ISSN 2045-2322. PMC 8807815. PMID 35105929.
  12. Stoop, Ruedi; Orlando, Giuseppe; Bufalo, Michele; Della Rossa, Fabio (2022-11-18). "Exploiting deterministic features in apparently stochastic data". Scientific Reports. 12 (1): 19843. Bibcode:2022NatSR..1219843S. doi:10.1038/s41598-022-23212-x. hdl:11311/1233353. ISSN 2045-2322. PMC 9674651. PMID 36400910.
  13. Bashkirtseva, Irina; Ryashko, Lev (2023-07-01). "Structural and stochastic transformations in a system of coupled populations". The European Physical Journal Special Topics. 232 (8): 1247–1252. Bibcode:2023EPJST.232.1247I. doi:10.1140/epjs/s11734-022-00762-9. ISSN 1951-6401. S2CID 256042853.