Computation and Neural Systems

The Computation and Neural Systems (CNS) program was established at the California Institute of Technology in 1986 with the goal of training PhD students interested in exploring the relationship between the structure of neuron-like circuits/networks and the computations performed in such systems, whether natural or synthetic. The program was designed to foster the exchange of ideas and collaboration among engineers, neuroscientists, and theoreticians.

History

In the early 1980s, having laid out the foundations of VLSI, [1] Carver Mead became interested in exploring the similarities between computation in the brain and the kinds of computation that could be carried out in analog silicon electronic circuits. To pursue this, Mead joined with John Hopfield, who was studying the theoretical foundations of neural computation. [2] Their first joint course in this area was entitled “Physics of Computation”, with Hopfield teaching about his work on neural networks and Mead about his work on replicating neuronal structures in highly integrated electronic circuits. [3] Given the interest among both students and faculty, they decided to expand upon these themes the following year. Richard Feynman joined them, and three separate courses resulted: Hopfield's on neural networks, Mead's on neuromorphic analog circuits, [4] and Feynman's on the physics of computation. [3] [5] By this point, Mead and Hopfield realized that a new field was emerging, with neuroscientists and the people building the computer models and circuits all talking to each other.

In the fall of 1986, John Hopfield championed the formation of an interdisciplinary Ph.D. program, called Computation and Neural Systems (CNS), to build a scholarly community studying questions arising at the interface of neurobiology, electrical engineering, computer science, and physics. The unifying theme of the program was the relationship between the physical structure of a computational system (physical or biological hardware), the dynamics of its operation, and the computational problems it can efficiently solve. The creation of this multidisciplinary program stemmed largely from progress on several previously unrelated fronts: the analysis of complex neural systems at both the single-cell and the network level [6] using a variety of techniques (in particular, patch-clamp recordings, intracellular and extracellular single- and multi-unit electrophysiology in the awake animal, and functional brain imaging such as functional magnetic resonance imaging (fMRI)); the theoretical analysis of nervous structures (computational neuroscience); and the modeling of artificial neural networks for engineering purposes. [2] The program started out with a small number of existing faculty drawn from the various divisions. Among the early founding faculty were Carver Mead, John Hopfield, David Van Essen, Geoffrey Fox, James Bower, Mark Konishi, John Allman, Ed Posner, and Demetri Psaltis. That same year, the program's first external hire, Christof Koch, joined the faculty.

Since 1990, about 110 graduate students have been awarded a PhD in CNS and 14 an MS in CNS. About two-thirds of CNS graduates have pursued academic careers, with the remainder founding or joining start-up companies. Over this period, the average duration of the PhD has been 5.6 years.

During this time, the executive officers of the CNS Program were John Hopfield, Demetri Psaltis, Christof Koch, and Pietro Perona. The current executive officer is Thanos Siapas. [7]

CNS faculty have founded or co-founded a number of conferences and workshops.

Notable alumni

Related Research Articles

Computational neuroscience is a branch of neuroscience which employs mathematics, computer science, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.

Neuromorphic computing is an approach to computing that is inspired by the structure and function of the human brain. A neuromorphic computer or chip is any device that uses physical artificial neurons to perform computations. In recent times, the term neuromorphic has been used to describe analog, digital, mixed-mode analog/digital VLSI, and software systems that implement models of neural systems. Recent work has even explored mimicking the nervous system using chemical systems in liquid solution.
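
As a concrete illustration of the kind of neural model such systems implement, the sketch below simulates a simple leaky integrate-and-fire neuron in Python. It is not tied to any particular neuromorphic platform, and all parameter values (membrane time constant, threshold, input drive) are arbitrary choices for the example.

```python
import numpy as np

def lif_neuron(drive, dt=1e-3, tau=20e-3, v_rest=-70e-3,
               v_thresh=-50e-3, v_reset=-70e-3):
    """Simulate a leaky integrate-and-fire neuron.

    `drive` is the input per time step, already scaled by the membrane
    resistance so it has units of volts. Returns the spike times (s).
    """
    v = v_rest
    spikes = []
    for t, i_in in enumerate(drive):
        # Membrane potential leaks back toward rest and integrates the input.
        dv = (-(v - v_rest) + i_in) / tau
        v += dv * dt
        if v >= v_thresh:
            spikes.append(t * dt)
            v = v_reset          # fire a spike and reset
    return spikes

# One second of constant drive (1 ms resolution), strong enough to spike regularly.
drive = np.full(1000, 30e-3)
print(len(lif_neuron(drive)), "spikes")
```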

John Joseph Hopfield is an American physicist and emeritus professor of Princeton University, most widely known for his 1982 study of associative neural networks and for the development of the Hopfield network.
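
To illustrate the idea behind the Hopfield network, the following minimal Python sketch stores binary (+1/−1) patterns in a symmetric weight matrix using a Hebbian rule and recalls a stored pattern from a corrupted cue by asynchronous updates. The patterns, network size, and update count are arbitrary choices for the example, not anything specific to Hopfield's original paper.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian storage: sum of outer products of the +/-1 patterns, zero diagonal."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)
    return W / len(patterns)

def recall(W, cue, steps=200, seed=0):
    """Asynchronous recall: repeatedly update one randomly chosen unit."""
    rng = np.random.default_rng(seed)
    s = cue.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Store two toy patterns and recover the first one from a corrupted cue.
patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1, -1, -1,  1,  1]])
W = train_hopfield(patterns)
cue = patterns[0].copy()
cue[0] = -cue[0]          # flip one bit to corrupt the cue
print(recall(W, cue))     # typically converges back to patterns[0]
```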

Carver Andress Mead is an American scientist and engineer. He currently holds the position of Gordon and Betty Moore Professor Emeritus of Engineering and Applied Science at the California Institute of Technology (Caltech), having taught there for over 40 years.

Terrence Joseph Sejnowski is the Francis Crick Professor at the Salk Institute for Biological Studies where he directs the Computational Neurobiology Laboratory and is the director of the Crick-Jacobs center for theoretical and computational biology. He has performed pioneering research in neural networks and computational neuroscience.

The Harold Pender Award, initiated in 1972 and named after founding Dean Harold Pender, is given by the Faculty of the School of Engineering and Applied Science of the University of Pennsylvania to an outstanding member of the engineering profession who has achieved distinction by significant contributions to society. The Pender Award is the School of Engineering's highest honor.

Stephen Grossberg is a cognitive scientist, theoretical and computational psychologist, neuroscientist, mathematician, biomedical engineer, and neuromorphic technologist. He is the Wang Professor of Cognitive and Neural Systems and a Professor Emeritus of Mathematics & Statistics, Psychological & Brain Sciences, and Biomedical Engineering at Boston University.

Rahul Sarpeshkar is the Thomas E. Kurtz Professor and a professor of engineering, professor of physics, professor of microbiology & immunology, and professor of molecular and systems biology at Dartmouth. Sarpeshkar, whose interdisciplinary work is in bioengineering, electrical engineering, quantum physics, and biophysics, is the inaugural chair of the William H. Neukom cluster of computational science, which focuses on analog, quantum, and biological computation. The clusters, designed by faculty from across the institution to address major global challenges, are part of President Philip Hanlon's vision for strengthening academic excellence at Dartmouth. Prior to Dartmouth, Sarpeshkar was a tenured professor at the Massachusetts Institute of Technology and led the Analog Circuits and Biological Systems Group. He is now also a visiting scientist at MIT's Research Laboratory of Electronics.

Neuroinformatics is an emerging field that combines informatics and neuroscience. It is concerned with neuroscience data and with information processing by artificial neural networks, and is applied along three main directions.

Michelle Anne "Misha" Mahowald was an American computational neuroscientist in the emerging field of neuromorphic engineering. In 1996 she was inducted into the Women in Technology International Hall of Fame for her development of the Silicon Eye and other computational systems. She died by suicide at age 33.

The Interdisciplinary Center for Neural Computation is a research center of the Hebrew University. It was established in 1992 to provide an interface for interactive research in neurobiology, physics and applied physics, computer science, and psychophysics, with the objective of increasing the understanding of how the brain works, with a specific focus on computational aspects of the nervous system. The center has facilities for studying and modeling the nervous system at its different levels, from single-neuron computation to signal processing in small and large cortical networks, to the system and behavioral levels. It is backed by 26 faculty members.

A physical neural network is a type of artificial neural network in which an electrically adjustable material is used to emulate the function of a neural synapse or a higher-order (dendritic) neuron model. "Physical" neural network is used to emphasize the reliance on physical hardware used to emulate neurons as opposed to software-based approaches. More generally the term is applicable to other artificial neural networks in which a memristor or other electrically adjustable resistance material is used to emulate a neural synapse.
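
To make the synapse analogy concrete, consider a resistive crossbar in which each cell's conductance plays the role of a synaptic weight: applying input voltages to the rows causes each column to sum conductance-weighted currents (by Ohm's and Kirchhoff's laws), which amounts to an analog matrix-vector product. The sketch below simulates that computation in Python; the conductance values and array dimensions are made up for the example.

```python
import numpy as np

# Conductances of the crossbar cells (siemens); each plays the role of a synaptic weight.
G = np.array([[1.0e-6, 0.2e-6, 0.8e-6],
              [0.5e-6, 1.5e-6, 0.1e-6]])

# Input voltages applied to the rows (volts), standing in for neuron activations.
v_in = np.array([0.3, 0.7])

# Each column current is the conductance-weighted sum of the row voltages:
# i_out = G^T @ v_in, i.e. one analog "weighted sum" per column in a single step.
i_out = G.T @ v_in
print(i_out)
```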

A vision chip is an integrated circuit having both image sensing circuitry and image processing circuitry on the same die. The image sensing circuitry may be implemented using charge-coupled devices, active pixel sensor circuits, or any other light sensing mechanism. The image processing circuitry may be implemented using analog, digital, or mixed signal circuitry. One area of research is the use of neuromorphic engineering techniques to implement processing circuits inspired by biological neural systems. The output of a vision chip is generally a partially processed image or a high-level information signal revealing something about the observed scene. Although there is no standard definition of a vision chip, the processing performed may comprise anything from processing individual pixel values to performing complex image processing functions and outputting a single value or yes/no signal based on the scene.

Kwabena Adu Boahen is a Ghanaian-born Professor of Bioengineering and Electrical Engineering at Stanford University. He previously taught at the University of Pennsylvania.

The Bernstein Network is a research network in the field of computational neuroscience; this field brings together experimental approaches in neurobiology with theoretical models and computer simulations. It unites different scientific disciplines, such as physics, biology, mathematics, medical science, psychology, computer science, engineering and philosophy in the endeavor to understand how the brain functions. The close combination of neurobiological experiments with theoretical models and computer simulations allows scientists of the Bernstein Network to pursue innovative approaches with regard to one of the most complex structures nature has created in the course of evolution: the natural brain.

Tobias "Tobi" Delbrück is an American neuromorphic engineer at the University of Zurich and ETH Zurich, Switzerland. He was named Fellow of the Institute of Electrical and Electronics Engineers (IEEE) in 2014 "for contributions to neuromorphic visual sensors and processing".

André van Schaik is a professor of electrical engineering at the Western Sydney University, and director of the International Centre for Neuromorphic Systems, in Penrith, New South Wales, Australia. He was named a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) in 2014 "for contributions to neuromorphic circuits and systems".

Shih-Chii Liu is a professor at the University of Zürich. Her research interests include developing brain-inspired sensors, algorithms, and networks, and their neural electronic equivalents.

Ila Fiete is an Indian–American physicist and computational neuroscientist, and a Professor in the Department of Brain and Cognitive Sciences within the McGovern Institute for Brain Research at the Massachusetts Institute of Technology. Fiete builds theoretical models and analyzes neural data to uncover how neural circuits perform computations and how the brain represents and manipulates information involved in memory and reasoning.

Ralph Etienne-Cummings is an academic in the field of electrical engineering. He is a professor of Electrical & Computer Engineering at Johns Hopkins University.

References

  1. C. Mead and L. Conway, Introduction to VLSI Systems. Addison-Wesley, Reading, Mass. (1980)
  2. J.J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities." Proc. Natl. Acad. Sci. USA, vol. 79, pp. 2554-2558, April 1982
  3. Shirley K. Cohen, Interview with Carver Mead. Archives of the California Institute of Technology.
  4. C. Mead, Analog VLSI and Neural Systems. Addison-Wesley (1989)
  5. R.P. Feynman, Feynman Lectures on Computation. Tony Hey and Robin W. Allen, eds. Perseus Books Group (2000). ISBN 0738202967
  6. D.J. Felleman and D.C. Van Essen, "Distributed hierarchical processing in the primate cerebral cortex." Cerebral Cortex, 1(1) (1991)
  7. "Contacts - Biology and Biological Engineering". Caltech. Archived from the original on 7 July 2024.

Further reading