| Claudia Clopath | |
|---|---|
| Alma mater | EPFL (MS, PhD) |
| Scientific career | |
| Institutions | Columbia University; Paris Descartes University; Imperial College London |
| Thesis | Modeling synaptic plasticity across different time scales: the influence of voltage, spike timing, and protein synthesis (2009) |
| Doctoral advisor | Wulfram Gerstner |
Claudia Clopath is a Professor of Computational Neuroscience at Imperial College London and research leader at the Sainsbury Wellcome Centre for Neural Circuits and Behaviour. She develops mathematical models to predict synaptic plasticity for both medical applications and the design of human-like machines.
Clopath studied physics at the École Polytechnique Fédérale de Lausanne. She remained there for her graduate studies, working with Wulfram Gerstner on models of spike-timing-dependent plasticity (STDP) that included both the presynaptic and postsynaptic membrane potentials. [1] After earning her PhD she was a postdoctoral fellow with Nicolas Brunel at Paris Descartes University. [2] She subsequently joined the Center for Theoretical Neuroscience at Columbia University. [3]
Clopath uses mathematical models to predict synaptic plasticity and to study its implications in artificial neural networks. [4] These models can explain the origin of oscillations in neural networks and capture the activities of excitatory and inhibitory neurons. She used such a model to show that inhibitory neurons are important in determining the oscillation frequency of a network. [5] She hopes that her models of the brain can be used in medical applications as well as in designing machines capable of human-like learning.
She has studied the connections between nerve cells in the visual cortex. [6] The model developed by Clopath with Sadra Sadeh and Stefan Rotter at the Bernstein Center Freiburg was the first to combine features of biological neural networks within a computational neural network. [6] It shows how nerve cells in the visual system become able to detect different features, and how the synapses between cells are coordinated. It can be used to understand how nerve cells develop as they receive information from each eye. [6]
Clopath has worked with DeepMind to create artificial intelligence systems that can be applied to multiple tasks, making them able to remember information or master a series of steps. Together, Clopath and DeepMind used synaptic consolidation, a mechanism that allows neural networks to remember. [7] The algorithm, Elastic Weight Consolidation, can compute how important the different connections in a neural network are and assign each a weighting factor reflecting that importance. [7] This weighting determines the rate at which the values of a node within the neural network are altered. [7] They demonstrated that software using Elastic Weight Consolidation could learn and achieve human-level performance in ten games. [7] Developing machine-learning systems for continual-learning tasks has become a focus of Clopath's research, using computational models of recurrent neural networks to establish how inhibition gates synaptic plasticity. [8]
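The quadratic penalty at the heart of Elastic Weight Consolidation can be sketched in a few lines. The form below (a Fisher-weighted pull toward the previous task's weights) follows the published description of the method, but the function names, parameter values, and NumPy framing here are illustrative choices, not DeepMind's implementation:

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam=1.0):
    """Quadratic EWC penalty: weights with large Fisher (importance)
    values are pulled back toward the values learned on the old task."""
    return 0.5 * lam * np.sum(fisher * (params - old_params) ** 2)

def ewc_gradient(params, old_params, fisher, lam=1.0):
    """Gradient of the penalty, added to the new task's loss gradient."""
    return lam * fisher * (params - old_params)

# An important weight (fisher = 10.0) resists change far more than an
# unimportant one (fisher = 0.1), even for the same displacement.
old = np.array([1.0, 1.0])
fisher = np.array([10.0, 0.1])      # per-weight importance estimates
new = np.array([2.0, 2.0])
print(ewc_penalty(new, old, fisher))  # 5.05
```

During training on a new task, this penalty is simply added to the new task's loss, so gradient descent trades off new learning against disturbing connections the old task depended on.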
In 2015 she was awarded a Google Faculty Research Award. [9]
A dendrite or dendron is a branched protoplasmic extension of a nerve cell that propagates the electrochemical stimulation received from other neural cells to the cell body, or soma, of the neuron from which the dendrites project. Electrical stimulation is transmitted onto dendrites by upstream neurons via synapses which are located at various points throughout the dendritic tree.
Chemical synapses are biological junctions through which neurons' signals can be sent to each other and to non-neuronal cells such as those in muscles or glands. Chemical synapses allow neurons to form circuits within the central nervous system. They are crucial to the biological computations that underlie perception and thought. They allow the nervous system to connect to and control other systems of the body.
Computational neuroscience is a branch of neuroscience which employs mathematics, computer science, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.
In neuroscience, synaptic plasticity is the ability of synapses to strengthen or weaken over time, in response to increases or decreases in their activity. Since memories are postulated to be represented by vastly interconnected neural circuits in the brain, synaptic plasticity is one of the important neurochemical foundations of learning and memory.
In neurophysiology, long-term depression (LTD) is an activity-dependent reduction in the efficacy of neuronal synapses lasting hours or longer following a long patterned stimulus. LTD occurs in many areas of the CNS with varying mechanisms depending upon brain region and developmental progress.
Spike-timing-dependent plasticity (STDP) is a biological process that adjusts the strength of connections between neurons in the brain. The process adjusts the connection strengths based on the relative timing of a particular neuron's output and input action potentials. The STDP process partially explains the activity-dependent development of nervous systems, especially with regard to long-term potentiation and long-term depression.
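The timing dependence described above is often summarised by an exponential learning window. The pair-based form below is standard in the modelling literature, though the amplitudes and time constants are illustrative choices:

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window: weight change as a function of
    dt = t_post - t_pre (in ms). Pre-before-post (dt > 0) potentiates
    the synapse (LTP); post-before-pre (dt < 0) depresses it (LTD)."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)    # LTP branch
    elif dt < 0:
        return -a_minus * math.exp(dt / tau_minus)  # LTD branch
    return 0.0

print(stdp_dw(10.0) > 0)   # pre leads post -> potentiation
print(stdp_dw(-10.0) < 0)  # post leads pre -> depression
```

The change decays exponentially with the spike-time difference, so only near-coincident spike pairs appreciably modify the connection.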
A neural circuit is a population of neurons interconnected by synapses to carry out a specific function when activated. Multiple neural circuits interconnect with one another to form large-scale brain networks.
Neural coding is a neuroscience field concerned with characterising the hypothetical relationship between the stimulus and the individual or ensemble neuronal responses and the relationship among the electrical activity of the neurons in the ensemble. Based on the theory that sensory and other information is represented in the brain by networks of neurons, it is thought that neurons can encode both digital and analog information.
BCM theory, BCM synaptic modification, or the BCM rule, named for Elie Bienenstock, Leon Cooper, and Paul Munro, is a physical theory of learning in the visual cortex developed in 1981. The BCM model proposes a sliding threshold for long-term potentiation (LTP) or long-term depression (LTD) induction, and states that synaptic plasticity is stabilized by a dynamic adaptation of the time-averaged postsynaptic activity. According to the BCM model, when a pre-synaptic neuron fires, the post-synaptic neuron will tend to undergo LTP if it is in a high-activity state, or LTD if it is in a lower-activity state. This theory is often used to explain how cortical neurons can undergo either LTP or LTD depending on the conditioning stimulus protocol applied to pre-synaptic neurons.
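The sliding-threshold dynamics can be written as a pair of coupled update equations. The version below is one common textbook form of the BCM rule (weight change proportional to x·y·(y − θ), with θ tracking the time-averaged squared activity); the constants and the simple Euler discretisation are illustrative:

```python
def bcm_step(w, x, y, theta, eta=0.01, tau_theta=100.0, dt=1.0):
    """One Euler step of a common BCM formulation:
        dw/dt     = eta * x * y * (y - theta)
        dtheta/dt = (y**2 - theta) / tau_theta   (sliding threshold)
    LTP when postsynaptic activity y exceeds theta, LTD when below."""
    w_new = w + dt * eta * x * y * (y - theta)
    theta_new = theta + dt * (y ** 2 - theta) / tau_theta
    return w_new, theta_new

# High activity (y > theta) strengthens; moderate activity weakens.
w_up, _ = bcm_step(w=1.0, x=1.0, y=2.0, theta=1.0)    # w increases
w_down, _ = bcm_step(w=1.0, x=1.0, y=0.5, theta=1.0)  # w decreases
```

Because θ rises after periods of high activity, sustained potentiation raises the bar for future LTP, which is what stabilises the rule.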
Nonsynaptic plasticity is a form of neuroplasticity that involves modification of ion channel function in the axon, dendrites, and cell body that results in specific changes in the integration of excitatory postsynaptic potentials and inhibitory postsynaptic potentials. Nonsynaptic plasticity is a modification of the intrinsic excitability of the neuron. It interacts with synaptic plasticity, but it is considered a separate entity from synaptic plasticity. Intrinsic modification of the electrical properties of neurons plays a role in many aspects of plasticity from homeostatic plasticity to learning and memory itself. Nonsynaptic plasticity affects synaptic integration, subthreshold propagation, spike generation, and other fundamental mechanisms of neurons at the cellular level. These individual neuronal alterations can result in changes in higher brain function, especially learning and memory. However, as an emerging field in neuroscience, much of the knowledge about nonsynaptic plasticity is uncertain and still requires further investigation to better define its role in brain function and behavior.
A Bayesian Confidence Propagation Neural Network (BCPNN) is an artificial neural network inspired by Bayes' theorem, which regards neural computation and processing as probabilistic inference. Neural unit activations represent probability ("confidence") in the presence of input features or categories, synaptic weights are based on estimated correlations and the spread of activation corresponds to calculating posterior probabilities. It was originally proposed by Anders Lansner and Örjan Ekeberg at KTH Royal Institute of Technology. This probabilistic neural network model can also be run in generative mode to produce spontaneous activations and temporal sequences.
The network of the human nervous system comprises nodes that are connected by links. The connectivity may be viewed anatomically, functionally, or electrophysiologically. These perspectives are presented in several Wikipedia articles, including Connectionism, Biological neural network, Artificial neural network, and Computational neuroscience, as well as in books by Ascoli, G. A. (2002); Sterratt, D., Graham, B., Gillies, A., & Willshaw, D. (2011); Gerstner, W., & Kistler, W. (2002); and Rumelhart, D. E., McClelland, J. L., and the PDP Research Group (1986), among others. The focus of this article is a comprehensive view of modeling a neural network. Once an approach based on the perspective and connectivity is chosen, the models are developed at the microscopic, mesoscopic, or macroscopic (system) level. Computational modeling refers to models that are developed using computing tools.
The cerebellar glomerulus is a small, intertwined mass of nerve fiber terminals in the granular layer of the cerebellar cortex. It consists of post-synaptic granule cell dendrites and pre-synaptic terminals of mossy fibers.
Tim P. Vogels is a professor of theoretical neuroscience and research leader at the Institute of Science and Technology Austria. He is primarily known for his scholarly contributions to the study of neuronal plasticity related to learning and memory in the brain.
Ila Fiete is an Indian–American physicist and computational neuroscientist as well as a Professor in the Department of Brain and Cognitive Sciences within the McGovern Institute for Brain Research at the Massachusetts Institute of Technology. Fiete builds theoretical models and analyses neural data to uncover how neural circuits perform computations and how the brain represents and manipulates information involved in memory and reasoning.
An axo-axonic synapse is a type of synapse, formed by one neuron projecting its axon terminals onto another neuron's axon.
Wulfram Gerstner is a German and Swiss computational neuroscientist. His research focuses on neural spiking patterns in neural networks, and their connection to learning, spatial representation and navigation. Since 2006 Gerstner has been a full professor of Computer Science and Life Sciences at École Polytechnique Fédérale de Lausanne (EPFL), where he also serves as a Director of the Laboratory of Computational Neuroscience.
The spike response model (SRM) is a spiking neuron model in which spikes are generated by either a deterministic or a stochastic threshold process. In the SRM, the membrane voltage V is described as a linear sum of the postsynaptic potentials (PSPs) caused by spike arrivals, to which the effects of refractoriness and adaptation are added. The threshold is either fixed or dynamic; in the latter case it increases after each spike. The SRM is flexible enough to account for a variety of neuronal firing patterns in response to step current input. The SRM has also been used in the theory of computation to quantify the capacity of spiking neural networks, and in the neurosciences to predict the subthreshold voltage and the firing times of cortical neurons during stimulation with a time-dependent current. The name Spike Response Model points to the property that the two important filters of the model can be interpreted as the response of the membrane potential to an incoming spike (response kernel ε, the PSP) and to an outgoing spike (response kernel η, also called the refractory kernel). The SRM has been formulated in continuous time and in discrete time. The SRM can be viewed as a generalized linear model (GLM) or as an integrated version of a generalized integrate-and-fire model with adaptation.
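The voltage equation described above (refractory kernel η after the neuron's own last spike, plus a PSP kernel ε per input spike) can be sketched directly. The kernel shapes and constants below are illustrative choices, not any particular published parameterisation:

```python
import math

def srm_voltage(t, input_spikes, last_output_spike,
                tau_m=10.0, tau_s=5.0, eta0=-5.0, tau_ref=8.0):
    """SRM membrane voltage at time t (ms):
        V(t) = eta(t - t_hat) + sum over input spikes of eps(t - t_f)
    with eta a brief hyperpolarisation after the neuron's own last
    spike t_hat, and eps a PSP kernel per presynaptic spike time t_f."""
    def eps(s):   # PSP kernel: difference of exponentials, causal
        return (math.exp(-s / tau_m) - math.exp(-s / tau_s)) if s > 0 else 0.0
    def eta(s):   # refractory kernel: decaying hyperpolarisation
        return eta0 * math.exp(-s / tau_ref) if s > 0 else 0.0
    return eta(t - last_output_spike) + sum(eps(t - tf) for tf in input_spikes)

# An input spike at 15 ms raises the voltage at 20 ms relative to baseline.
v_base = srm_voltage(20.0, [], last_output_spike=0.0)
v_input = srm_voltage(20.0, [15.0], last_output_spike=0.0)
```

In the full model, the neuron emits a spike when V(t) crosses a (fixed or dynamic) threshold, which resets the refractory term via a new last-spike time.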
Brain cells make up the functional tissue of the brain. The rest of the brain tissue is structural or connective tissue, called the stroma, which includes blood vessels. The two main types of cells in the brain are neurons, also known as nerve cells, and glial cells, also known as neuroglia.
Sonja Hofer is a German neuroscientist studying the neural basis of sensory perception and sensory-guided decision-making at the Sainsbury Wellcome Centre for Neural Circuits and Behaviour. Her research focuses on how the brain processes visual information, how neural networks are shaped by experience and learning, and how they integrate visual signals with other information in order to interpret the outside world and guide behaviour. She received her undergraduate degree from the Technical University of Munich, earned her PhD at the Max Planck Institute of Neurobiology in Martinsried, Germany, and completed a postdoctoral fellowship at University College London. After holding an Assistant Professorship at the Biozentrum, University of Basel, in Switzerland for five years, she has been a group leader and Professor at the Sainsbury Wellcome Centre for Neural Circuits and Behaviour since 2018.