Synaptic weight

In neuroscience and computer science, synaptic weight refers to the strength or amplitude of a connection between two nodes, corresponding in biology to the amount of influence the firing of one neuron has on another. The term is typically used in artificial and biological neural network research. [1]

Computation

In a computational neural network, a vector or set of inputs x and outputs y, or pre- and post-synaptic neurons respectively, are interconnected with synaptic weights represented by the matrix w, where for a linear neuron

    y_j = Σ_i w_ji x_i,

where the rows of the synaptic matrix w represent the vectors of synaptic weights for the output indexed by j.
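As a concrete sketch of this computation, the linear neuron reduces to a matrix-vector product (the weight matrix and input values below are illustrative, not from the source):

```python
import numpy as np

# Synaptic weights w[j][i]: influence of pre-synaptic (input) neuron i
# on post-synaptic (output) neuron j. Each row is one output neuron's
# weight vector.
w = np.array([[0.5, -0.2, 0.1],
              [0.3,  0.8, -0.6]])

x = np.array([1.0, 2.0, 3.0])  # pre-synaptic activities

# Linear neuron: y_j = sum_i w_ji * x_i, i.e. a matrix-vector product.
y = w @ x
```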

The synaptic weight is changed by using a learning rule, the most basic of which is Hebb's rule, which is usually stated in biological terms as

Neurons that fire together, wire together.

Computationally, this means that if a large signal from one of the input neurons results in a large signal from one of the output neurons, then the synaptic weight between those two neurons will increase. The rule is unstable, however, and is typically modified using such variations as Oja's rule, radial basis functions or the backpropagation algorithm.
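The instability of plain Hebbian learning can be demonstrated in a few lines: with no normalization, repeated co-activation makes the weights grow without bound (the learning rate and starting values here are illustrative):

```python
import numpy as np

def hebb_update(w, x, lr=0.5):
    # One step of plain Hebbian learning for a linear neuron:
    # dw_i = lr * y * x_i with y = w . x, so correlated pre/post
    # activity keeps strengthening the same connections.
    y = w @ x
    return w + lr * y * x

w = np.array([0.1, 0.1])
x = np.array([1.0, 1.0])
norms = []
for _ in range(10):
    w = hebb_update(w, x)
    norms.append(np.linalg.norm(w))
# The weight norm grows monotonically: the rule has no built-in bound,
# which is why variants such as Oja's rule add normalization.
```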

Biology

For biological networks, the effect of synaptic weights is not as simple as for linear neurons or Hebbian learning. However, biophysical models such as BCM theory have seen some success in mathematically describing these networks.

In the mammalian central nervous system, signal transmission is carried out by interconnected networks of nerve cells, or neurons. For the basic pyramidal neuron, the input signal arrives via the axon of the pre-synaptic neuron, which releases neurotransmitter chemicals into the synapse; these are picked up by the dendrites of the next neuron, which can then generate an action potential, analogous to the output signal in the computational case.

The synaptic weight in this process is determined by several variable factors, including the amount of neurotransmitter released into the synapse and the number and sensitivity of the receptors that bind it.

The changes in synaptic weight that occur are known as synaptic plasticity, and the process behind long-term changes (long-term potentiation and depression) is still poorly understood. Hebb's learning rule was originally applied to biological systems, but has had to undergo many modifications as a number of theoretical and experimental problems came to light.

Related Research Articles

Dendrite: small projection on a neuron that receives signals

Dendrites, also dendrons, are branched protoplasmic extensions of a nerve cell that propagate the electrochemical stimulation received from other neural cells to the cell body, or soma, of the neuron from which the dendrites project. Electrical stimulation is transmitted onto dendrites by upstream neurons via synapses which are located at various points throughout the dendritic tree. Dendrites play a critical role in integrating these synaptic inputs and in determining the extent to which action potentials are produced by the neuron. Dendritic arborization, also known as dendritic branching, is a multi-step biological process by which neurons form new dendritic trees and branches to create new synapses. The morphology of dendrites such as branch density and grouping patterns are highly correlated to the function of the neuron. Malformation of dendrites is also tightly correlated to impaired nervous system function. Some disorders that are associated with the malformation of dendrites are autism, depression, schizophrenia, Down syndrome and anxiety.

Chemical synapse

Chemical synapses are biological junctions through which neurons' signals can be sent to each other and to non-neuronal cells such as those in muscles or glands. Chemical synapses allow neurons to form circuits within the central nervous system. They are crucial to the biological computations that underlie perception and thought. They allow the nervous system to connect to and control other systems of the body.

An artificial neuron is a mathematical function conceived as a model of biological neurons in a neural network. Artificial neurons are elementary units in an artificial neural network. The artificial neuron receives one or more inputs and sums them to produce an output. Usually each input is separately weighted, and the sum is passed through a non-linear function known as an activation function or transfer function. The transfer functions usually have a sigmoid shape, but they may also take the form of other non-linear functions, piecewise linear functions, or step functions. They are also often monotonically increasing, continuous, differentiable and bounded. The thresholding function has inspired logic gates referred to as threshold logic, which can be used to build logic circuits resembling brain processing. For example, new devices such as memristors have been extensively used to develop such logic in recent times.
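A minimal artificial neuron with a logistic sigmoid activation can be written as follows (the weights and inputs are illustrative):

```python
import math

def artificial_neuron(inputs, weights, bias=0.0):
    # Weighted sum of inputs, passed through a logistic sigmoid
    # activation: monotonically increasing, differentiable, and
    # bounded in (0, 1).
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))
```

With zero net input the sigmoid returns 0.5; larger weighted sums push the output toward 1, smaller ones toward 0.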

Hebbian theory is a neuroscientific theory claiming that an increase in synaptic efficacy arises from a presynaptic cell's repeated and persistent stimulation of a postsynaptic cell. It is an attempt to explain synaptic plasticity, the adaptation of brain neurons during the learning process. It was introduced by Donald Hebb in his 1949 book The Organization of Behavior. The theory is also called Hebb's rule, Hebb's postulate, and cell assembly theory. Hebb states it as follows:

Let us assume that the persistence or repetition of a reverberatory activity tends to induce lasting cellular changes that add to its stability. ... When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased.

In neuroscience, synaptic plasticity is the ability of synapses to strengthen or weaken over time, in response to increases or decreases in their activity. Since memories are postulated to be represented by vastly interconnected neural circuits in the brain, synaptic plasticity is one of the important neurochemical foundations of learning and memory.

Spike-timing-dependent plasticity (STDP) is a biological process that adjusts the strength of connections between neurons in the brain. The process adjusts the connection strengths based on the relative timing of a particular neuron's output and input action potentials. The STDP process partially explains the activity-dependent development of nervous systems, especially with regard to long-term potentiation and long-term depression.
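A common pair-based model of the STDP window is a pair of exponentials, sketched below (the amplitudes and time constant are illustrative, not a definitive parameterization):

```python
import math

def stdp_weight_change(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    # Pair-based STDP kernel. dt = t_post - t_pre in milliseconds.
    # Pre-before-post (dt > 0) potentiates (LTP), post-before-pre
    # (dt < 0) depresses (LTD); both effects decay exponentially
    # with the time difference.
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # LTP
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)   # LTD
    return 0.0
```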

Neural circuit Network or circuit of neurons

A neural circuit is a population of neurons interconnected by synapses to carry out a specific function when activated. Neural circuits interconnect to one another to form large scale brain networks. Biological neural networks have inspired the design of artificial neural networks, but artificial neural networks are usually not strict copies of their biological counterparts.

Computational neurogenetic modeling (CNGM) is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biology, as well as engineering.

Oja's learning rule, or simply Oja's rule, named after Finnish computer scientist Erkki Oja, is a model of how neurons in the brain or in artificial neural networks change connection strength, or learn, over time. It is a modification of the standard Hebb's Rule that, through multiplicative normalization, solves all stability problems and generates an algorithm for principal components analysis. This is a computational form of an effect which is believed to happen in biological neurons.
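A minimal sketch of Oja's rule for a single linear neuron, on synthetic two-dimensional data (the learning rate, variances, and sample count are illustrative). The multiplicative decay term keeps the weight norm bounded, and the weight vector tends toward the first principal component:

```python
import numpy as np

def oja_update(w, x, lr=0.005):
    # Oja's rule: dw = lr * y * (x - y * w), i.e. the Hebbian term
    # y * x minus a multiplicative decay y^2 * w that normalizes the
    # weight vector instead of letting it grow without bound.
    y = w @ x
    return w + lr * y * (x - y * w)

rng = np.random.default_rng(0)
# Samples whose first component has much larger variance than the second.
data = rng.normal(size=(2000, 2)) * np.array([3.0, 0.5])

w = np.array([1.0, 1.0])
for x in data:
    w = oja_update(w, x)
# w now points (up to sign) along the high-variance axis, with a norm
# close to 1.
```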

BCM theory, BCM synaptic modification, or the BCM rule, named for Elie Bienenstock, Leon Cooper, and Paul Munro, is a physical theory of learning in the visual cortex developed in 1981. The BCM model proposes a sliding threshold for long-term potentiation (LTP) or long-term depression (LTD) induction, and states that synaptic plasticity is stabilized by a dynamic adaptation of the time-averaged postsynaptic activity. According to the BCM model, when a pre-synaptic neuron fires, the post-synaptic neuron will tend to undergo LTP if it is in a high-activity state, or LTD if it is in a lower-activity state. This theory is often used to explain how cortical neurons can undergo both LTP and LTD depending on the conditioning stimulus protocol applied to pre-synaptic neurons.
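The sliding-threshold idea can be sketched with a simplified BCM update (the learning rate and threshold time constant are illustrative): postsynaptic activity above the threshold potentiates, activity below it depresses, and the threshold itself tracks the averaged squared activity.

```python
import numpy as np

def bcm_step(w, x, theta, lr=0.01, tau_theta=0.1):
    # Simplified BCM rule: dw = lr * y * (y - theta) * x.
    # LTP when post-synaptic activity y exceeds the sliding threshold
    # theta, LTD when y falls below it. The threshold itself relaxes
    # toward the time-averaged squared activity.
    y = w @ x
    w = w + lr * y * (y - theta) * x
    theta = theta + tau_theta * (y ** 2 - theta)
    return w, theta
```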

Neural backpropagation is the phenomenon in which, after the action potential of a neuron creates a voltage spike down the axon, another impulse is generated from the soma and propagates toward the apical portions of the dendritic arbor or dendrites, from which much of the original input current originated. In addition to active backpropagation of the action potential, there is also passive electrotonic spread. While there is ample evidence to prove the existence of backpropagating action potentials, the function of such action potentials and the extent to which they invade the most distal dendrites remain highly controversial.

The generalized Hebbian algorithm (GHA), also known in the literature as Sanger's rule, is a linear feedforward neural network model for unsupervised learning with applications primarily in principal components analysis. First defined in 1989, it is similar to Oja's rule in its formulation and stability, except it can be applied to networks with multiple outputs. The name originates because of the similarity between the algorithm and a hypothesis made by Donald Hebb about the way in which synaptic strengths in the brain are modified in response to experience, i.e., that changes are proportional to the correlation between the firing of pre- and post-synaptic neurons.
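A one-step sketch of the generalized Hebbian update for a multi-output network (matrix shapes and the learning rate are illustrative); for a single output row it reduces to Oja's rule:

```python
import numpy as np

def gha_update(W, x, lr=0.01):
    # Generalized Hebbian algorithm (Sanger's rule):
    # dW = lr * (y x^T - LT[y y^T] W), where LT keeps the lower
    # triangle (including the diagonal). Each output learns the
    # Hebbian term minus the components already captured by the
    # outputs above it, so rows converge toward successive
    # principal components of the input.
    y = W @ x
    return W + lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
```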

In neuroethology and the study of learning, anti-Hebbian learning describes a particular class of learning rule by which synaptic plasticity can be controlled. These rules are based on a reversal of Hebb's postulate, and therefore can be simplistically understood as dictating reduction of the strength of synaptic connectivity between neurons following a scenario in which a neuron directly contributes to production of an action potential in another neuron.

Dendritic spike

In neurophysiology, a dendritic spike refers to an action potential generated in the dendrite of a neuron. Dendrites are branched extensions of a neuron. They receive electrical signals emitted from projecting neurons and transfer these signals to the cell body, or soma. Dendritic signaling has traditionally been viewed as a passive mode of electrical signaling. Unlike its axon counterpart which can generate signals through action potentials, dendrites were believed to only have the ability to propagate electrical signals by physical means: changes in conductance, length, cross sectional area, etc. However, the existence of dendritic spikes was proposed and demonstrated by W. Alden Spencer, Eric Kandel, Rodolfo Llinás and coworkers in the 1960s and a large body of evidence now makes it clear that dendrites are active neuronal structures. Dendrites contain voltage-gated ion channels giving them the ability to generate action potentials. Dendritic spikes have been recorded in numerous types of neurons in the brain and are thought to have great implications in neuronal communication, memory, and learning. They are one of the major factors in long-term potentiation.

Nonsynaptic plasticity

Nonsynaptic plasticity is a form of neuroplasticity that involves modification of ion channel function in the axon, dendrites, and cell body that results in specific changes in the integration of excitatory postsynaptic potentials (EPSPs) and inhibitory postsynaptic potentials (IPSPs). Nonsynaptic plasticity is a modification of the intrinsic excitability of the neuron. It interacts with synaptic plasticity, but it is considered a separate entity from synaptic plasticity. Intrinsic modification of the electrical properties of neurons plays a role in many aspects of plasticity from homeostatic plasticity to learning and memory itself. Nonsynaptic plasticity affects synaptic integration, subthreshold propagation, spike generation, and other fundamental mechanisms of neurons at the cellular level. These individual neuronal alterations can result in changes in higher brain function, especially learning and memory. However, as an emerging field in neuroscience, much of the knowledge about nonsynaptic plasticity is uncertain and still requires further investigation to better define its role in brain function and behavior.

A Bayesian Confidence Propagation Neural Network (BCPNN) is an artificial neural network inspired by Bayes' theorem, which regards neural computation and processing as probabilistic inference. Neural unit activations represent probability ("confidence") in the presence of input features or categories, synaptic weights are based on estimated correlations and the spread of activation corresponds to calculating posterior probabilities. It was originally proposed by Anders Lansner and Örjan Ekeberg at KTH Royal Institute of Technology. This probabilistic neural network model can also be run in generative mode to produce spontaneous activations and temporal sequences.
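The correlation-based weight in this style of model can be sketched as a log-ratio of co-activation probability against what independence would predict (a simplified illustration of the idea, not the full BCPNN learning rule; the probabilities below are made-up values):

```python
import math

def bcpnn_weight(p_i, p_j, p_ij):
    # Log of the ratio between the observed co-activation probability
    # and the product of the marginals: positive when the units
    # co-occur more often than chance, negative when less often,
    # zero when they are independent.
    return math.log(p_ij / (p_i * p_j))
```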

The network of the human nervous system comprises nodes that are connected by links. The connectivity may be viewed anatomically, functionally, or electrophysiologically. These are presented in several Wikipedia articles that include Connectionism, Biological neural network, Artificial neural network, Computational neuroscience, as well as in several books by Ascoli, G. A. (2002), Sterratt, D., Graham, B., Gillies, A., & Willshaw, D. (2011), Gerstner, W., & Kistler, W. (2002), and Rumelhart, J. L., McClelland, J. L., and PDP Research Group (1986) among others. The focus of this article is a comprehensive view of modeling a neural network. Once an approach based on the perspective and connectivity is chosen, the models are developed at microscopic, mesoscopic, or macroscopic (system) levels. Computational modeling refers to models that are developed using computing tools.

In neuroscience, synaptic scaling is a form of homeostatic plasticity, in which the brain responds to chronically elevated activity in a neural circuit with negative feedback, allowing individual neurons to reduce their overall action potential firing rate. Where Hebbian plasticity mechanisms modify neural synaptic connections selectively, synaptic scaling normalizes all neural synaptic connections by decreasing the strength of each synapse by the same factor, so that the relative synaptic weighting of each synapse is preserved.
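The key property of multiplicative scaling, that relative synaptic weighting is preserved, follows directly from multiplying every weight by the same factor (the gain, rates, and weights below are illustrative):

```python
import numpy as np

def scale_synapses(w, current_rate, target_rate, gain=0.1):
    # Multiplicative synaptic scaling: when activity is chronically
    # above the target rate, every weight is scaled down by the same
    # factor, so the ratios between synapses are preserved.
    factor = 1.0 - gain * (current_rate - target_rate) / target_rate
    return w * factor

w = np.array([0.2, 0.4, 0.8])
scaled = scale_synapses(w, current_rate=15.0, target_rate=10.0)
# scaled / w is the same constant for every synapse.
```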

Homosynaptic plasticity

Homosynaptic plasticity is one type of synaptic plasticity. Homosynaptic plasticity is input-specific, meaning changes in synapse strength occur only at post-synaptic targets specifically stimulated by a pre-synaptic target. Therefore, the spread of the signal from the pre-synaptic cell is localized.

Heterosynaptic plasticity: a chemical synapse's ability to undergo changes in strength

Synaptic plasticity refers to a chemical synapse's ability to undergo changes in strength. Synaptic plasticity is typically input-specific, meaning that the activity in a particular neuron alters the efficacy of a synaptic connection between that neuron and its target. However, in the case of heterosynaptic plasticity, the activity of a particular neuron leads to input unspecific changes in the strength of synaptic connections from other unactivated neurons. A number of distinct forms of heterosynaptic plasticity have been found in a variety of brain regions and organisms. These different forms of heterosynaptic plasticity contribute to a variety of neural processes including associative learning, the development of neural circuits, and homeostasis of synaptic input.

References

  1. Iyer, R; Menon, V; Buice, M; Koch, C; Mihalas, S (2013). "The influence of synaptic weight distribution on neuronal population dynamics". PLOS Computational Biology. 9 (10): e1003248. Bibcode:2013PLSCB...9E3248I. doi:10.1371/journal.pcbi.1003248. PMC 3808453. PMID 24204219.

