Spike-timing-dependent plasticity

Spike-timing-dependent plasticity (STDP) is a biological process that adjusts the strength of connections between neurons in the brain. The process adjusts the connection strengths based on the relative timing of a particular neuron's output and input action potentials (or spikes). The STDP process partially explains the activity-dependent development of nervous systems, especially with regard to long-term potentiation and long-term depression.

Process

Under the STDP process, if an input spike to a neuron tends, on average, to occur immediately before that neuron's output spike, then that particular input is made somewhat stronger. If an input spike tends, on average, to occur immediately after an output spike, then that particular input is made somewhat weaker, hence the name "spike-timing-dependent plasticity". Thus, inputs that might be the cause of the post-synaptic neuron's excitation are made even more likely to contribute in the future, whereas inputs that are not the cause of the post-synaptic spike are made less likely to contribute in the future. The process continues until a subset of the initial set of connections remains, while the influence of all others is reduced to zero. Since a neuron produces an output spike when many of its inputs occur within a brief period, the subset of inputs that remain are those that tended to be correlated in time. In addition, since the inputs that occur before the output are strengthened, the inputs that provide the earliest indication of correlation will eventually become the final input to the neuron.

History

In 1973, M. M. Taylor [1] suggested that if synapses for which a presynaptic spike occurred just before a postsynaptic spike more often than the reverse were strengthened (Hebbian learning), while synapses with the opposite timing, or with no closely timed presynaptic spike, were weakened (anti-Hebbian learning), the result would be an informationally efficient recoding of input patterns. This proposal apparently passed unnoticed in the neuroscientific community, and subsequent experimentation was conceived independently of these early suggestions.

Early experiments on associative plasticity were carried out by W. B. Levy and O. Steward in 1983 [2] and examined the effect of the relative timing of pre- and postsynaptic action potentials, at the millisecond level, on plasticity. Bruce McNaughton also contributed much to this area. Studies on neuromuscular synapses carried out by Y. Dan and Mu-ming Poo in 1992, [3] and on the hippocampus by D. Debanne, B. Gähwiler, and S. Thompson in 1994, [4] showed that asynchronous pairing of postsynaptic and synaptic activity induced long-term synaptic depression. However, STDP was more definitively demonstrated by Henry Markram during his postdoctoral work up to 1993 in Bert Sakmann's lab (SfN and Physiological Society abstracts in 1994–1995), although the results were not published until 1997. [5] C. Bell and co-workers also found a form of STDP in the cerebellum. Markram used dual patch-clamp techniques to repetitively activate presynaptic neurons 10 milliseconds before activating the postsynaptic target neurons, and found that the strength of the synapse increased. When the activation order was reversed, so that the presynaptic neuron was activated 10 milliseconds after its postsynaptic target neuron, the strength of the pre-to-post synaptic connection decreased.

Further work by Guoqiang Bi, Li Zhang, and Huizhong Tao in Mu-ming Poo's lab in 1998 [6] mapped the entire time course relating pre- and postsynaptic activity to synaptic change, showing that in their preparation synapses activated within 5–20 ms before a postsynaptic spike are strengthened, while those activated within a similar time window after the spike are weakened. This phenomenon has been observed in various other preparations, with some variation in the time window relevant for plasticity. Several functions for timing-dependent plasticity have been suggested. For example, STDP might provide a substrate for Hebbian learning during development, [7] [8] or, as suggested by Taylor [1] in 1973, the associated Hebbian and anti-Hebbian learning rules might create informationally efficient coding in bundles of related neurons. Work from Y. Dan's lab has extended the study of STDP to in vivo systems. [9]

Biological mechanisms

Postsynaptic NMDA receptors (NMDARs) are highly sensitive to the membrane potential (see coincidence detection in neurobiology). Due to their high permeability for calcium, they generate a local chemical signal that is largest when the back-propagating action potential in the dendrite arrives shortly after the synapse was active (pre-post spiking). Large postsynaptic calcium transients are known to trigger synaptic potentiation (long-term potentiation). The mechanism for spike-timing-dependent depression is less well understood, but often involves either postsynaptic voltage-dependent calcium entry/mGluR activation, or retrograde endocannabinoids and presynaptic NMDARs. [10]

From Hebbian rule to STDP

According to the Hebbian rule, synapses increase their efficiency if the synapse persistently takes part in firing the postsynaptic target neuron. Similarly, the efficiency of synapses decreases when the firing of their presynaptic neurons is persistently independent of the firing of their postsynaptic targets. These principles are often simplified in the mnemonics: those who fire together, wire together; and those who fire out of sync, lose their link. However, if two neurons fire exactly at the same time, then one cannot have caused, or taken part in firing, the other. Instead, to take part in firing the postsynaptic neuron, the presynaptic neuron needs to fire just before the postsynaptic neuron. Experiments that stimulated two connected neurons with varying interstimulus asynchrony confirmed the importance of the temporal relation implicit in Hebb's principle: for the synapse to be potentiated or depressed, the presynaptic neuron has to fire just before or just after the postsynaptic neuron, respectively. [11] In addition, it has become evident that the presynaptic firing needs to consistently predict the postsynaptic firing for synaptic plasticity to occur robustly, [12] mirroring at a synaptic level what is known about the importance of contingency in classical conditioning, where zero-contingency procedures prevent the association between two stimuli.

Role in hippocampal learning

For the most efficient STDP, the presynaptic and postsynaptic signals have to be separated by approximately a dozen milliseconds. However, events happening within a couple of minutes can typically be linked together by the hippocampus as episodic memories. To resolve this apparent contradiction, a mechanism relying on theta waves and phase precession has been proposed: representations of different memory entities (such as a place, a face, or a person) are repeated on each theta cycle at a given theta phase during the episode to be remembered. Expected, ongoing, and completed entities have early, intermediate, and late theta phases, respectively. In the CA3 region of the hippocampus, the recurrent network turns entities with neighboring theta phases into coincident ones, thereby allowing STDP to link them together. Experimentally detectable memory sequences are created this way by reinforcing the connections between subsequent (neighboring) representations. [13]

Computational models and applications

Training spiking neural networks

The principles of STDP can be utilized in the training of artificial spiking neural networks. Using this approach, the weight of a connection between two neurons is increased if a presynaptic spike at time t_pre occurs shortly before a postsynaptic spike at time t_post, i.e. t_pre < t_post and Δt = t_post − t_pre is small. The size of the weight increase depends on Δt and decreases exponentially as Δt grows, as given by the equation:

Δw = A₊ · exp(−Δt / τ₊)

where A₊ is the maximum possible change and τ₊ is the time constant.

If the opposite scenario occurs, i.e. a postsynaptic spike occurs before a presynaptic spike (Δt < 0), then the weight is instead reduced according to the equation:

Δw = −A₋ · exp(Δt / τ₋)

where A₋ and τ₋ serve the same functions as before, defining the maximum possible change and the time constant, respectively.

The parameters that define the decay profile (A₊, A₋, τ₊, τ₋, etc.) do not necessarily have to be fixed across the entire network; different synapses may have different window shapes associated with them.
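As a concrete illustration, the pairwise update above can be written as a short function. The following is a minimal sketch assuming the two exponential windows just described; the function name stdp_delta_w and the parameter values a_plus, a_minus, tau_plus and tau_minus are illustrative placeholders rather than values from any particular study.

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.01, a_minus=0.012,
                 tau_plus=20.0, tau_minus=20.0):
    """Return the weight change for a single pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:                       # pre leads post: potentiation
        return a_plus * math.exp(-dt / tau_plus)
    if dt < 0:                       # post leads pre: depression
        return -a_minus * math.exp(dt / tau_minus)
    return 0.0                       # exactly simultaneous spikes: no change

# Example: a presynaptic spike 10 ms before the postsynaptic spike
# strengthens the synapse; the reverse ordering weakens it.
print(stdp_delta_w(t_pre=0.0, t_post=10.0))   # positive (LTP)
print(stdp_delta_w(t_pre=10.0, t_post=0.0))   # negative (LTD)
```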

Biological evidence suggests that this pairwise STDP rule cannot give a complete description of plasticity in a biological neuron; more advanced approaches that consider symmetric triplets of spikes (pre-post-pre, post-pre-post) have been developed and are believed to be more biologically plausible. [14] One common way to capture such triplet interactions, sketched below, is a trace-based rule in which each neuron keeps a fast and a slow exponentially decaying record of its own spikes, and the slow postsynaptic trace boosts potentiation.
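The sketch that follows illustrates this trace-based idea under assumed placeholder parameters; it is one possible formulation, not the specific models surveyed in [14].

```python
import math

def run_triplet_stdp(pre_spikes, post_spikes, w=0.5,
                     tau_plus=17.0, tau_x=100.0,    # presynaptic trace time constants (ms)
                     tau_minus=34.0, tau_y=125.0,   # postsynaptic trace time constants (ms)
                     a2_plus=0.005, a3_plus=0.006,
                     a2_minus=0.007, a3_minus=0.0):
    """Event-driven sketch of a trace-based triplet STDP rule (placeholder parameters)."""
    # Merge both spike trains into a single time-ordered event list.
    events = sorted([(t, 'pre') for t in pre_spikes] +
                    [(t, 'post') for t in post_spikes])
    r1 = r2 = o1 = o2 = 0.0   # fast/slow presynaptic traces, fast/slow postsynaptic traces
    t_last = 0.0
    for t, kind in events:
        # Decay all traces analytically over the interval since the last event.
        d = t - t_last
        r1 *= math.exp(-d / tau_plus);  r2 *= math.exp(-d / tau_x)
        o1 *= math.exp(-d / tau_minus); o2 *= math.exp(-d / tau_y)
        t_last = t
        if kind == 'pre':
            # Presynaptic spike: depression scaled by the fast postsynaptic trace,
            # optionally boosted by the slow presynaptic trace (pre-post-pre term).
            w -= o1 * (a2_minus + a3_minus * r2)
            r1 += 1.0; r2 += 1.0
        else:
            # Postsynaptic spike: potentiation scaled by the fast presynaptic trace,
            # boosted by the slow postsynaptic trace (post-pre-post term).
            w += r1 * (a2_plus + a3_plus * o2)
            o1 += 1.0; o2 += 1.0
    return w

# In a post-pre-post triplet (5, 10, 20 ms), the potentiation produced by the
# second postsynaptic spike is larger than the pair term alone would give,
# because the slow postsynaptic trace o2 is still non-zero when it arrives.
print(run_triplet_stdp(pre_spikes=[10.0], post_spikes=[5.0, 20.0]))
```

In this formulation, setting a3_plus and a3_minus to zero recovers the pairwise rule sketched earlier.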


References

  1. Taylor MM (1973). "The Problem of Stimulus Structure in the Behavioural Theory of Perception". South African Journal of Psychology. 3: 23–45.
  2. Levy WB, Steward O (April 1983). "Temporal contiguity requirements for long-term associative potentiation/depression in the hippocampus". Neuroscience. 8 (4): 791–7. doi:10.1016/0306-4522(83)90010-6. PMID 6306504.
  3. Dan Y, Poo MM (1992). "Hebbian depression of isolated neuromuscular synapses in vitro". Science. 256 (5063): 1570–73. doi:10.1126/science.1317971. PMID 1317971.
  4. Debanne D, Gähwiler B, Thompson S (1994). "Asynchronous pre- and postsynaptic activity induces associative long-term depression in area CA1 of the rat hippocampus in vitro". Proceedings of the National Academy of Sciences of the United States of America. 91 (3): 1148–52. doi:10.1073/pnas.91.3.1148. PMC 521471. PMID 7905631.
  5. Markram H, Lübke J, Frotscher M, Sakmann B (January 1997). "Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs". Science. 275 (5297): 213–5. doi:10.1126/science.275.5297.213. PMID 8985014.
  6. Bi GQ, Poo MM (15 December 1998). "Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type". Journal of Neuroscience. 18 (24): 10464–72. doi:10.1523/JNEUROSCI.18-24-10464.1998. PMC 6793365. PMID 9852584.
  7. Gerstner W, Kempter R, van Hemmen JL, Wagner H (September 1996). "A neuronal learning rule for sub-millisecond temporal coding". Nature. 383 (6595): 76–78. doi:10.1038/383076a0. PMID 8779718.
  8. Song S, Miller KD, Abbott LF (September 2000). "Competitive Hebbian learning through spike-timing-dependent synaptic plasticity". Nature Neuroscience. 3 (9): 919–26. doi:10.1038/78829. PMID 10966623.
  9. Meliza CD, Dan Y (2006). "Receptive-field modification in rat visual cortex induced by paired visual stimulation and single-cell spiking". Neuron. 49 (2): 183–189. doi:10.1016/j.neuron.2005.12.009. PMID 16423693.
  10. Sjöström PJ, Turrigiano GG, Nelson SB (2003). "Neocortical LTD via Coincident Activation of Presynaptic NMDA and Cannabinoid Receptors". Neuron. 39 (4): 641–654. doi:10.1016/S0896-6273(03)00476-8. PMID 12925278.
  11. Caporale N, Dan Y (2008). "Spike timing-dependent plasticity: a Hebbian learning rule". Annual Review of Neuroscience. 31: 25–46. doi:10.1146/annurev.neuro.31.060407.125639. PMID 18275283.
  12. Bauer EP, LeDoux JE, Nader K (2001). "Fear conditioning and LTP in the lateral amygdala are sensitive to the same stimulus contingencies". Nature Neuroscience. 4 (7): 687–688. doi:10.1038/89465. PMID 11426221.
  13. Kovács KA (September 2020). "Episodic Memories: How do the Hippocampus and the Entorhinal Ring Attractors Cooperate to Create Them?". Frontiers in Systems Neuroscience. 14: 68. doi:10.3389/fnsys.2020.559186. PMC 7511719. PMID 33013334.
  14. Taherkhani A, Belatreche A, Li Y, Cosma G, Maguire LP, McGinnity TM (2020). "A review of learning in biologically plausible spiking neural networks". Neural Networks. 122: 253–272. doi:10.1016/j.neunet.2019.09.036.

Further reading