Claudia Clopath

Alma mater: EPFL (MS, PhD)
Scientific career
Institutions: Columbia University; Paris Descartes University; Imperial College London
Thesis: Modeling synaptic plasticity across different time scales: the influence of voltage, spike timing, and protein synthesis (2009)
Doctoral advisor: Wulfram Gerstner

Claudia Clopath is a Professor of Computational Neuroscience at Imperial College London and research leader at the Sainsbury Wellcome Centre for Neural Circuits and Behaviour. She develops mathematical models to predict synaptic plasticity for both medical applications and the design of human-like machines.

Early life and education

Clopath studied physics at the École Polytechnique Fédérale de Lausanne (EPFL). She remained there for her graduate studies, working with Wulfram Gerstner on models of spike-timing-dependent plasticity (STDP) that depend on the postsynaptic membrane potential as well as on spike timing. [1] After earning her PhD she worked as a postdoctoral fellow with Nicolas Brunel at Paris Descartes University. [2] She subsequently joined the Center for Theoretical Neuroscience at Columbia University. [3]
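
The following is a minimal sketch of a plasticity rule in that spirit, written in Python; it is not the exact published model, and the thresholds, amplitudes and filter variables used here are illustrative placeholders:

```python
def voltage_based_plasticity_step(w, pre_spike, u, u_bar_minus, u_bar_plus, x_bar,
                                  A_LTD=1e-4, A_LTP=1e-4,
                                  theta_minus=-70.0, theta_plus=-45.0):
    """One Euler step of a simplified voltage-based plasticity rule
    (illustrative only; all parameter values are placeholders).

    w           : synaptic weight
    pre_spike   : 1.0 if the presynaptic neuron spiked this step, else 0.0
    u           : instantaneous postsynaptic membrane potential (mV)
    u_bar_minus : slow low-pass filter of u, gating depression (mV)
    u_bar_plus  : faster low-pass filter of u, gating potentiation (mV)
    x_bar       : low-pass-filtered trace of presynaptic spikes
    """
    relu = lambda v: max(v, 0.0)
    # Depression: a presynaptic spike arriving while the filtered
    # postsynaptic voltage is above theta_minus weakens the synapse.
    ltd = -A_LTD * pre_spike * relu(u_bar_minus - theta_minus)
    # Potentiation: the presynaptic trace combined with a depolarized
    # instantaneous and filtered postsynaptic voltage strengthens it.
    ltp = A_LTP * x_bar * relu(u - theta_plus) * relu(u_bar_plus - theta_minus)
    return w + ltd + ltp
```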

Research and career

Clopath uses mathematical models to predict synaptic plasticity and to study its implications in artificial neural networks. [4] These models can explain the origin of oscillations in neural networks and capture the activity of excitatory and inhibitory neurons; she used such a model to show that inhibitory neurons play an important part in setting a network's oscillation frequency. [5] She hopes that her models of the brain can be used in medical applications as well as in the design of machines capable of human-like learning.
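
As a generic illustration of this point (not Clopath's specific published model; all parameters are made up for the example), a Wilson-Cowan-style rate model of coupled excitatory and inhibitory populations produces rhythmic activity whose pace is shaped by the inhibitory dynamics:

```python
import numpy as np

def simulate_ei_network(tau_e=10.0, tau_i=5.0, w_ee=12.0, w_ei=10.0,
                        w_ie=10.0, w_ii=2.0, drive=2.0, dt=0.1, steps=5000):
    """Simulate a two-population excitatory-inhibitory rate model and
    return the excitatory rate over time (arbitrary units)."""
    f = lambda x: 1.0 / (1.0 + np.exp(-x))       # sigmoidal rate function
    r_e, r_i, trace = 0.1, 0.1, []
    for _ in range(steps):
        # Excitation is driven by recurrent excitation minus inhibition.
        dr_e = (-r_e + f(w_ee * r_e - w_ei * r_i + drive)) / tau_e
        # Inhibition is driven by the excitatory population.
        dr_i = (-r_i + f(w_ie * r_e - w_ii * r_i)) / tau_i
        r_e += dt * dr_e
        r_i += dt * dr_i
        trace.append(r_e)
    return np.array(trace)

# Increasing tau_i (slower inhibition) slows the network rhythm,
# illustrating how inhibitory neurons shape the oscillation frequency.
fast_inhibition = simulate_ei_network(tau_i=5.0)
slow_inhibition = simulate_ei_network(tau_i=15.0)
```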

She has studied how the connections between nerve cells in the visual cortex develop. [6] The model she developed with Sadra Sadeh and Stefan Rotter at the Bernstein Center Freiburg was the first to combine features of biological neural networks in a computational neural network. [6] In the model, nerve cells of the visual system become able to detect different features, and the synapses between cells become coordinated; it can be used to understand how nerve cells develop as they receive information from each eye. [6]
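
As a toy illustration of activity-dependent feature selectivity (a generic Hebbian sketch, not the Sadeh-Clopath-Rotter model itself), a single model neuron whose input weights follow a normalized Hebbian rule gradually aligns with a recurring input pattern:

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_presentations = 50, 200
feature = rng.random(n_inputs)          # a recurring input pattern ("feature")
w = rng.random(n_inputs)
w /= np.linalg.norm(w)
eta = 0.05                              # learning rate (illustrative)

for _ in range(n_presentations):
    x = feature + 0.3 * rng.random(n_inputs)   # noisy presentation of the feature
    y = w @ x                                  # postsynaptic activity
    w += eta * y * x                           # Hebbian update
    w /= np.linalg.norm(w)                     # normalization keeps weights bounded

# The weights end up aligned with the feature, i.e. the neuron is selective for it.
alignment = np.dot(w, feature / np.linalg.norm(feature))
print(f"alignment with the recurring feature: {alignment:.2f}")
```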

Clopath has worked with DeepMind to create artificial intelligence systems that can be applied to multiple tasks, enabling them to remember information or master a series of steps. Together they used synaptic consolidation, a mechanism that allows neural networks to retain what they have learned. [7] Their algorithm, Elastic Weight Consolidation, computes how important the different connections in a neural network are and assigns each a weighting factor reflecting that importance. [7] This weighting determines the rate at which the corresponding values in the neural network are altered. [7] They demonstrated that software using Elastic Weight Consolidation could learn and achieve human-level performance in ten games. [7] Developing machine learning systems for continual learning tasks has since become a focus of Clopath's research, which uses computational models of recurrent neural networks to establish how inhibition gates synaptic plasticity. [8]
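
A minimal sketch of the elastic-weight-consolidation idea, assuming a diagonal Fisher-information importance estimate; the function name, the penalty strength and the NumPy formulation are illustrative rather than DeepMind's implementation:

```python
import numpy as np

def ewc_loss(task_b_loss, params, params_task_a, fisher, lam=0.4):
    """Total loss when training on a new task B while protecting task A.

    task_b_loss   : scalar loss of the current parameters on task B
    params        : current parameter vector
    params_task_a : parameters learned for the old task A
    fisher        : per-parameter importance (diagonal Fisher information)
    lam           : consolidation strength (illustrative value)
    """
    # Important parameters (large fisher entries) are anchored to their old
    # values; unimportant ones remain free to change for the new task.
    penalty = 0.5 * lam * np.sum(fisher * (params - params_task_a) ** 2)
    return task_b_loss + penalty
```

The `fisher` array here plays the role of the weighting factor described above: it sets how strongly each connection resists change when the network moves on to a new task.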

In 2015 she was awarded a Google Faculty Research Award. [9]

Selected publications

Related Research Articles

<span class="mw-page-title-main">Dendrite</span> Small projection on a neuron that receives signals

A dendrite or dendron is a branched protoplasmic extension of a nerve cell that propagates the electrochemical stimulation received from other neural cells to the cell body, or soma, of the neuron from which the dendrites project. Electrical stimulation is transmitted onto dendrites by upstream neurons via synapses which are located at various points throughout the dendritic tree.

<span class="mw-page-title-main">Chemical synapse</span> Biological junctions through which neurons signals can be sent

Chemical synapses are biological junctions through which neurons' signals can be sent to each other and to non-neuronal cells such as those in muscles or glands. Chemical synapses allow neurons to form circuits within the central nervous system. They are crucial to the biological computations that underlie perception and thought. They allow the nervous system to connect to and control other systems of the body.

Computational neuroscience is a branch of neuroscience which employs mathematics, computer science, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.

In neuroscience, synaptic plasticity is the ability of synapses to strengthen or weaken over time, in response to increases or decreases in their activity. Since memories are postulated to be represented by vastly interconnected neural circuits in the brain, synaptic plasticity is one of the important neurochemical foundations of learning and memory.

In neurophysiology, long-term depression (LTD) is an activity-dependent reduction in the efficacy of neuronal synapses lasting hours or longer following a long patterned stimulus. LTD occurs in many areas of the CNS with varying mechanisms depending upon brain region and developmental progress.

Spike-timing-dependent plasticity (STDP) is a biological process that adjusts the strength of connections between neurons in the brain. The process adjusts the connection strengths based on the relative timing of a particular neuron's output and input action potentials. The STDP process partially explains the activity-dependent development of nervous systems, especially with regard to long-term potentiation and long-term depression.
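
In its simplest pair-based form (a standard textbook formulation, with illustrative amplitudes A_± and time constants τ_±), the weight change for a spike-time difference Δt = t_post − t_pre reads:

\[
\Delta w =
\begin{cases}
 A_{+}\, e^{-\Delta t / \tau_{+}}, & \Delta t > 0 \quad \text{(pre before post: potentiation)}\\
 -A_{-}\, e^{\Delta t / \tau_{-}}, & \Delta t < 0 \quad \text{(post before pre: depression)}
\end{cases}
\]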

<span class="mw-page-title-main">Neural circuit</span> Network or circuit of neurons

A neural circuit is a population of neurons interconnected by synapses to carry out a specific function when activated. Multiple neural circuits interconnect with one another to form large scale brain networks.

Neural coding is a neuroscience field concerned with characterising the hypothetical relationship between the stimulus and the individual or ensemble neuronal responses and the relationship among the electrical activity of the neurons in the ensemble. Based on the theory that sensory and other information is represented in the brain by networks of neurons, it is thought that neurons can encode both digital and analog information.

BCM theory, BCM synaptic modification, or the BCM rule, named for Elie Bienenstock, Leon Cooper, and Paul Munro, is a physical theory of learning in the visual cortex developed in 1981. The BCM model proposes a sliding threshold for long-term potentiation (LTP) or long-term depression (LTD) induction, and states that synaptic plasticity is stabilized by a dynamic adaptation of the time-averaged postsynaptic activity. According to the BCM model, when a pre-synaptic neuron fires, the post-synaptic neuron will tend to undergo LTP if it is in a high-activity state, or LTD if it is in a lower-activity state. This theory is often used to explain how cortical neurons can undergo both LTP and LTD depending on the different conditioning stimulus protocols applied to pre-synaptic neurons.
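
One standard way of writing the BCM rule (a common formulation, not the only one) is:

\[
\frac{dw_i}{dt} = x_i\, y\, (y - \theta_M), \qquad \theta_M \propto \langle y^2 \rangle ,
\]

where \(x_i\) is the presynaptic activity, \(y\) the postsynaptic activity, and the sliding threshold \(\theta_M\) tracks the recent average of \(y^2\), which stabilizes the rule.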

<span class="mw-page-title-main">Nonsynaptic plasticity</span> Form of neuroplasticity

Nonsynaptic plasticity is a form of neuroplasticity that involves modification of ion channel function in the axon, dendrites, and cell body that results in specific changes in the integration of excitatory postsynaptic potentials and inhibitory postsynaptic potentials. Nonsynaptic plasticity is a modification of the intrinsic excitability of the neuron. It interacts with synaptic plasticity, but it is considered a separate entity from synaptic plasticity. Intrinsic modification of the electrical properties of neurons plays a role in many aspects of plasticity from homeostatic plasticity to learning and memory itself. Nonsynaptic plasticity affects synaptic integration, subthreshold propagation, spike generation, and other fundamental mechanisms of neurons at the cellular level. These individual neuronal alterations can result in changes in higher brain function, especially learning and memory. However, as an emerging field in neuroscience, much of the knowledge about nonsynaptic plasticity is uncertain and still requires further investigation to better define its role in brain function and behavior.

A Bayesian Confidence Propagation Neural Network (BCPNN) is an artificial neural network inspired by Bayes' theorem, which regards neural computation and processing as probabilistic inference. Neural unit activations represent probability ("confidence") in the presence of input features or categories, synaptic weights are based on estimated correlations and the spread of activation corresponds to calculating posterior probabilities. It was originally proposed by Anders Lansner and Örjan Ekeberg at KTH Royal Institute of Technology. This probabilistic neural network model can also be run in generative mode to produce spontaneous activations and temporal sequences.

The network of the human nervous system comprises nodes that are connected by links. The connectivity may be viewed anatomically, functionally, or electrophysiologically. These perspectives are presented in several Wikipedia articles, including Connectionism, Biological neural network, Artificial neural network and Computational neuroscience, as well as in several books by Ascoli, G. A. (2002), Sterratt, D., Graham, B., Gillies, A., & Willshaw, D. (2011), Gerstner, W., & Kistler, W. (2002), and Rumelhart, D. E., McClelland, J. L., and the PDP Research Group (1986), among others. The focus of this article is a comprehensive view of modeling a neural network. Once an approach based on the perspective and connectivity is chosen, the models are developed at the microscopic, mesoscopic, or macroscopic (system) level. Computational modeling refers to models that are developed using computing tools.

<span class="mw-page-title-main">Glomerulus (cerebellum)</span>

The cerebellar glomerulus is a small, intertwined mass of nerve fiber terminals in the granular layer of the cerebellar cortex. It consists of post-synaptic granule cell dendrites and pre-synaptic terminals of mossy fibers.

Tim P. Vogels is a professor of theoretical neuroscience and research leader at the Institute of Science and Technology Austria. He is primarily known for his scholarly contributions to the study of neuronal plasticity related to learning and memory in the brain.

<span class="mw-page-title-main">Ila Fiete</span> American physicist

Ila Fiete is an Indian-American physicist and computational neuroscientist, as well as a Professor in the Department of Brain and Cognitive Sciences within the McGovern Institute for Brain Research at the Massachusetts Institute of Technology. Fiete builds theoretical models and analyses neural data to uncover how neural circuits perform computations and how the brain represents and manipulates information involved in memory and reasoning.

An axo-axonic synapse is a type of synapse, formed by one neuron projecting its axon terminals onto another neuron's axon.

<span class="mw-page-title-main">Wulfram Gerstner</span> German neuroscientist

Wulfram Gerstner is a German and Swiss computational neuroscientist. His research focuses on neural spiking patterns in neural networks, and their connection to learning, spatial representation and navigation. Since 2006 Gerstner has been a full professor of Computer Science and Life Sciences at École Polytechnique Fédérale de Lausanne (EPFL), where he also serves as a Director of the Laboratory of Computational Neuroscience.

The spike response model (SRM) is a spiking neuron model in which spikes are generated by either a deterministic or a stochastic threshold process. In the SRM, the membrane voltage V is described as a linear sum of the postsynaptic potentials (PSPs) caused by spike arrivals, to which the effects of refractoriness and adaptation are added. The threshold is either fixed or dynamic; in the latter case it increases after each spike. The SRM is flexible enough to account for a variety of neuronal firing patterns in response to step current input. The SRM has also been used in the theory of computation to quantify the capacity of spiking neural networks, and in the neurosciences to predict the subthreshold voltage and the firing times of cortical neurons during stimulation with a time-dependent current. The name Spike Response Model points to the property that the two important filters of the model, ε and η, can be interpreted as the response of the membrane potential to an incoming spike (response kernel ε, the PSP) and to an outgoing spike (response kernel η, also called the refractory kernel). The SRM has been formulated in continuous time and in discrete time. The SRM can be viewed as a generalized linear model (GLM) or as an integrated version of a generalized integrate-and-fire model with adaptation.
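
In standard notation (a common textbook form; the external-current term is sometimes omitted), the SRM voltage reads:

\[
V(t) = \eta\,(t - \hat{t}) + \sum_{f} \varepsilon\,(t - t_f) + \int_{0}^{\infty} \kappa(s)\, I_{\mathrm{ext}}(t - s)\, ds ,
\]

where \(\hat{t}\) is the time of the neuron's most recent output spike, \(t_f\) are the presynaptic spike arrival times, \(\eta\) is the refractory kernel, \(\varepsilon\) the PSP kernel, and \(\kappa\) the linear response to external current.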

<span class="mw-page-title-main">Brain cell</span> Functional tissue of the brain

Brain cells make up the functional tissue of the brain. The rest of the brain tissue is structural or connective tissue, called the stroma, which includes blood vessels. The two main types of cells in the brain are neurons, also known as nerve cells, and glial cells, also known as neuroglia.

Sonja Hofer is a German neuroscientist studying the neural basis of sensory perception and sensory-guided decision-making at the Sainsbury Wellcome Centre for Neural Circuits and Behaviour. Her research focuses on how the brain processes visual information, how neural networks are shaped by experience and learning, and how they integrate visual signals with other information in order to interpret the outside world and guide behaviour. She received her undergraduate degree from the Technical University of Munich, earned her PhD at the Max Planck Institute of Neurobiology in Martinsried, Germany, and completed a postdoctorate at University College London. After holding an assistant professorship at the Biozentrum of the University of Basel in Switzerland for five years, she has been a group leader and Professor at the Sainsbury Wellcome Centre for Neural Circuits and Behaviour since 2018.

References

  1. Clopath, Claudia; Büsing, Lars; Vasilaki, Eleni; Gerstner, Wulfram (2010-01-24). "Connectivity reflects coding: a model of voltage-based STDP with homeostasis". Nature Neuroscience. 13 (3): 344–352. doi:10.1038/nn.2479. ISSN 1097-6256. PMID 20098420. S2CID 8046538.
  2. Clopath, Claudia; Brunel, Nicolas (2013-02-21). "Optimal Properties of Analog Perceptrons with Excitatory Weights". PLOS Computational Biology. 9 (2): e1002919. Bibcode:2013PLSCB...9E2919C. doi:10.1371/journal.pcbi.1002919. ISSN 1553-7358. PMC 3578758. PMID 23436991.
  3. "Center for Theoretical Neuroscience | People". www.columbia.edu. Retrieved 2019-10-15.
  4. "Claudia Clopath". www.sainsburywellcome.org. Retrieved 2019-10-15.
  5. "Taktgeber für Hirnwellen". www.mpg.de (in German). Retrieved 2019-10-15.
  6. 1 2 3 "Computer model shows how nerve cell connections form in visual cortex". ScienceDaily. Retrieved 2019-10-15.
  7. 1 2 3 4 Kahn, Jeremy (2017-03-15). "Google's DeepMind finds way to overcome AI's forgetfulness problem". live mint. Archived from the original on 2017-03-15. Retrieved 2019-10-15.
  8. "Brain--inspired disinhihbitory learning rule for continual learning tasks in artificial neural networks". UKRI.
  9. "Google Faculty Research Awards February 2015" (PDF). Google. Archived (PDF) from the original on 2015-09-24. Retrieved 2019-10-15.