| Developer | Steve Furber |
| --- | --- |
| Product family | Manchester computers |
| Type | Neuromorphic |
| Release date | 2019 |
| CPU | ARM968E-S @ 200 MHz |
| Memory | 7 TB |
| Successor | SpiNNaker 2 [1] |
| Website | apt |
SpiNNaker (spiking neural network architecture) is a massively parallel, manycore supercomputer architecture designed by the Advanced Processor Technologies Research Group (APT) at the Department of Computer Science, University of Manchester. [2] It is composed of 57,600 processing nodes, each with 18 ARM9 processors (specifically ARM968) and 128 MB of mobile DDR SDRAM, totalling 1,036,800 cores and over 7 TB of RAM. [3] The computing platform is based on spiking neural networks, which are useful for simulating the human brain (see Human Brain Project). [4] [5] [6] [7] [8] [9] [10] [11] [12]
The completed design is housed in ten 19-inch racks, with each rack holding over 100,000 cores. [13] The cards holding the chips are mounted in five blade enclosures, and each core emulates 1,000 neurons. [13] In total, the goal is to simulate the behaviour of aggregates of up to a billion neurons in real time. [14] The machine requires about 100 kW from a 240 V supply and an air-conditioned environment. [15]
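These headline figures follow directly from the per-node specification. A quick arithmetic check in Python, using only the numbers quoted above:

```python
# Sanity-check SpiNNaker's headline figures from the per-node numbers
# quoted above: 57,600 nodes, 18 cores per node, 128 MB per node,
# and roughly 1,000 emulated neurons per core.

nodes = 57_600
cores_per_node = 18
ram_per_node_mb = 128
neurons_per_core = 1_000

total_cores = nodes * cores_per_node
total_ram_tb = nodes * ram_per_node_mb / 1_000_000   # decimal terabytes
total_neurons = total_cores * neurons_per_core

print(f"cores:   {total_cores:,}")         # 1,036,800
print(f"RAM:     {total_ram_tb:.2f} TB")   # ~7.37 TB, i.e. "over 7 TB"
print(f"neurons: {total_neurons:,}")       # ~1.04 billion
```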
SpiNNaker is being used as one component of the neuromorphic computing platform for the Human Brain Project. [16] [17]
On 14 October 2018, the HBP announced that the million-core milestone had been achieved. [18] [19]
On 24 September 2019, the HBP announced that an €8 million grant to fund construction of the second-generation machine, called SpiNNcloud, had been awarded to TU Dresden. [20]
Computational neuroscience is a branch of neuroscience which employs mathematics, computer science, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.
Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.
Neuromorphic computing is an approach to computing that is inspired by the structure and function of the human brain. A neuromorphic computer/chip is any device that uses physical artificial neurons to do computations. In recent times, the term neuromorphic has been used to describe analog, digital, mixed-mode analog/digital VLSI, and software systems that implement models of neural systems. Neuromorphic computing can be realized at the hardware level with oxide-based memristors, spintronic memories, threshold switches, and transistors, among other devices. Software-based neuromorphic systems of spiking neural networks can be trained using error backpropagation, e.g., with Python-based frameworks such as snnTorch, or using canonical learning rules from the biological learning literature, e.g., with BindsNET.
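As a minimal sketch of the software route mentioned above: snnTorch exposes spiking layers whose firing is differentiable through a surrogate gradient, so a network containing them can be trained with ordinary error backpropagation. The layer sizes and the decay constant beta below are illustrative assumptions, not values from the text:

```python
# Minimal snnTorch sketch: a leaky integrate-and-fire (LIF) layer whose
# spikes are differentiable via a surrogate gradient, making the model
# trainable by error backpropagation. Sizes and beta are assumptions.
import torch
import torch.nn as nn
import snntorch as snn
from snntorch import surrogate

fc = nn.Linear(784, 10)                               # inputs -> 10 units
lif = snn.Leaky(beta=0.9, spike_grad=surrogate.fast_sigmoid())

mem = lif.init_leaky()             # initial membrane potential
x = torch.rand(32, 784)            # one time step of input, batch of 32
spk, mem = lif(fc(x), mem)         # output spikes and updated membrane
```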
Stephen Byram Furber is a British computer scientist, mathematician and hardware engineer, and Emeritus ICL Professor of Computer Engineering in the Department of Computer Science at the University of Manchester, UK. After completing his education at the University of Cambridge, he spent the 1980s at Acorn Computers, where he was a principal designer of the BBC Micro and the ARM 32-bit RISC microprocessor. As of 2023, over 250 billion Arm chips have been manufactured, powering much of the world's mobile computing and embedded systems, everything from sensors to smartphones to servers.
An optical neural network is a physical implementation of an artificial neural network with optical components. Early optical neural networks used a photorefractive volume hologram to interconnect arrays of input neurons to arrays of outputs, with synaptic weights in proportion to the multiplexed hologram's strength. Volume holograms were further multiplexed using spectral hole burning, adding a wavelength dimension to the spatial ones to achieve four-dimensional interconnects between two-dimensional arrays of neural inputs and outputs. This work led to extensive research on alternative methods that use the strength of the optical interconnect to implement neuronal communications.
Spiking neural networks (SNNs) are artificial neural networks that more closely mimic natural neural networks. In addition to neuronal and synaptic state, SNNs incorporate the concept of time into their operating model. The idea is that neurons in the SNN do not transmit information at each propagation cycle, but rather transmit information only when a membrane potential—an intrinsic quality of the neuron related to its membrane electrical charge—reaches a specific value, called the threshold. When the membrane potential reaches the threshold, the neuron fires, and generates a signal that travels to other neurons which, in turn, increase or decrease their potentials in response to this signal. A neuron model that fires at the moment of threshold crossing is also called a spiking neuron model.
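The threshold-and-fire behaviour described above is easy to make concrete. The toy leaky integrate-and-fire neuron below is a minimal sketch; the decay factor, threshold, and input train are arbitrary illustrative choices:

```python
# Toy leaky integrate-and-fire (LIF) neuron: the membrane potential
# integrates incoming values and decays each step; a spike is emitted
# (and the potential reset) only on a threshold crossing, matching the
# spiking-neuron behaviour described above.

def lif_neuron(inputs, beta=0.8, threshold=1.0):
    mem = 0.0                         # membrane potential
    spikes = []
    for x in inputs:
        mem = beta * mem + x          # leaky integration of the input
        if mem >= threshold:          # threshold crossing: fire
            spikes.append(1)
            mem = 0.0                 # reset after the spike
        else:
            spikes.append(0)
    return spikes

# Sub-threshold inputs produce no output; the stronger inputs at the
# end push the potential over the threshold exactly once.
print(lif_neuron([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))
# [0, 0, 0, 0, 0, 1, 0]
```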
Neurorobotics is the combined study of neuroscience, robotics, and artificial intelligence. It is the science and technology of embodied autonomous neural systems. Neural systems include brain-inspired algorithms, computational models of biological neural networks, and actual biological systems. Such neural systems can be embodied in machines with mechanical or other forms of physical actuation. This includes robots, prosthetic and wearable systems, but also, at smaller scales, micro-machines and, at larger scales, furniture and infrastructure.
The Human Brain Project (HBP) was a large ten-year scientific research project, based on exascale supercomputers, that aimed to build a collaborative ICT-based scientific research infrastructure to allow researchers across Europe to advance knowledge in the fields of neuroscience, computing, and brain-related medicine.
The Manchester computers were an innovative series of stored-program electronic computers developed during the 30-year period between 1947 and 1977 by a small team at the University of Manchester, under the leadership of Tom Kilburn. They included the world's first stored-program computer, the world's first transistorised computer, and what was the world's fastest computer at the time of its inauguration in 1962.
Brain simulation is the concept of creating a functioning computer model of a brain or part of a brain. Brain simulation projects intend to contribute to a complete understanding of the brain, and eventually also assist the process of treating and diagnosing brain diseases.
A physical neural network is a type of artificial neural network in which an electrically adjustable material is used to emulate the function of a neural synapse or a higher-order (dendritic) neuron model. "Physical" neural network is used to emphasize the reliance on physical hardware used to emulate neurons as opposed to software-based approaches. More generally the term is applicable to other artificial neural networks in which a memristor or other electrically adjustable resistance material is used to emulate a neural synapse.
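In practice such adjustable-resistance devices are typically arranged in a crossbar array, where Ohm's law and Kirchhoff's current law perform a matrix-vector multiplication directly in the analog domain. The NumPy sketch below models an idealized readout; the conductance and voltage values are illustrative assumptions:

```python
# Idealized memristor-crossbar readout: input voltages drive the rows,
# each device's conductance G[i, j] acts as a synaptic weight, and the
# current collected on each column is the weighted sum I = G^T V.
# Conductances and voltages below are illustrative assumptions.
import numpy as np

G = np.array([[1.0, 0.2],      # conductances (arbitrary units);
              [0.5, 0.9],      # rows = inputs, columns = outputs
              [0.1, 0.4]])
V = np.array([0.3, 1.0, 0.0])  # input voltages on the three rows

I = G.T @ V                    # column currents = analog dot products
print(I)                       # [0.8, 0.96]
```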
Dharmendra S. Modha is an Indian American manager and lead researcher of the Cognitive Computing group at IBM Almaden Research Center. He is known for his pioneering work in artificial intelligence and mind simulation. In November 2009, Modha announced at a supercomputing conference that his team had written a program that simulated a cat brain. He is the recipient of multiple honors, including the Gordon Bell Prize, given each year to recognize outstanding achievement in high-performance computing applications. In November 2012, Modha announced on his blog that, using 96 Blue Gene/Q racks of the Lawrence Livermore National Laboratory Sequoia supercomputer, a combined IBM and LBNL team achieved an unprecedented scale of 2.084 billion neurosynaptic cores containing 530 billion neurons and 137 trillion synapses, running only 1,542× slower than real time. In August 2014, a paper describing the TrueNorth architecture, "the first-ever production-scale 'neuromorphic' computer chip designed to work more like a mammalian brain than" a processor, was published in the journal Science. The TrueNorth project culminated in a 64-million-neuron system for running deep neural network applications.
SyNAPSE is a DARPA program that aims to develop electronic neuromorphic machine technology, an attempt to build a new kind of cognitive computer with form, function, and architecture similar to the mammalian brain. Such artificial brains would be used in robots whose intelligence would scale with the size of the neural system in terms of the total number of neurons and synapses and their connectivity.
A Bayesian Confidence Propagation Neural Network (BCPNN) is an artificial neural network inspired by Bayes' theorem, which regards neural computation and processing as probabilistic inference. Neural unit activations represent probability ("confidence") in the presence of input features or categories, synaptic weights are based on estimated correlations and the spread of activation corresponds to calculating posterior probabilities. It was originally proposed by Anders Lansner and Örjan Ekeberg at KTH Royal Institute of Technology. This probabilistic neural network model can also be run in generative mode to produce spontaneous activations and temporal sequences.
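In the standard BCPNN formulation, the correlation-based weights take a log-odds form: a weight is the log of the ratio between the observed co-activation probability of two units and the product of their marginal activation probabilities. The sketch below estimates such weights from binary activity patterns, which are themselves illustrative assumptions:

```python
# Toy BCPNN weight estimation: activations are treated as probabilities
# and each weight is log(P(x_i, x_j) / (P(x_i) * P(x_j))), positive for
# correlated units and negative for anti-correlated ones. The binary
# activity patterns below are illustrative assumptions.
import numpy as np

patterns = np.array([[1, 1, 0],   # rows = samples, columns = units
                     [1, 1, 0],
                     [0, 0, 1],
                     [1, 0, 1]], dtype=float)

eps = 1e-6                                     # avoids log(0)
p = patterns.mean(axis=0)                      # marginals P(x_i)
pij = (patterns.T @ patterns) / len(patterns)  # joints P(x_i, x_j)

W = np.log((pij + eps) / (np.outer(p, p) + eps))  # weight matrix
bias = np.log(p + eps)                            # unit bias terms
print(np.round(W, 2))
```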
Kwabena Adu Boahen is a Professor of Bioengineering and Electrical Engineering at Stanford University. He previously taught at the University of Pennsylvania.
A cognitive computer is a computer that hardwires artificial intelligence and machine learning algorithms into an integrated circuit that closely reproduces the behavior of the human brain. It generally adopts a neuromorphic engineering approach. Synonyms include neuromorphic chip and cognitive chip.
Zeroth is a platform for brain-inspired computing from Qualcomm. It is based around a neural processing unit (NPU) AI accelerator chip and a software API to interact with the platform. It makes a form of machine learning known as deep learning available to mobile devices. It is used for image and sound processing, including speech recognition. The software operates locally rather than as a cloud application.
An AI accelerator or neural processing unit is a class of specialized hardware accelerator or computer system designed to accelerate artificial intelligence and machine learning applications, including artificial neural networks and machine vision. Typical applications include algorithms for robotics, Internet of Things, and other data-intensive or sensor-driven tasks. They are often manycore designs and generally focus on low-precision arithmetic, novel dataflow architectures or in-memory computing capability. As of 2018, a typical AI integrated circuit chip contains billions of MOSFET transistors. A number of vendor-specific terms exist for devices in this category, and it is an emerging technology without a dominant design.
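The emphasis on low-precision arithmetic can be illustrated with a simple symmetric int8 quantization scheme; the scheme and the random data below are illustrative assumptions, not any vendor's design:

```python
# Sketch of the low-precision arithmetic many AI accelerators rely on:
# float32 values are mapped to int8 with a per-tensor scale, multiplied
# in cheap integer hardware, then rescaled back to floating point.
import numpy as np

def quantize(x, bits=8):
    scale = np.abs(x).max() / (2 ** (bits - 1) - 1)   # map max |x| to 127
    return np.round(x / scale).astype(np.int8), scale

w = np.random.randn(4, 4).astype(np.float32)   # toy weight matrix
a = np.random.randn(4).astype(np.float32)      # toy activation vector

qw, sw = quantize(w)
qa, sa = quantize(a)

# Integer matrix-vector product, accumulated in int32, rescaled after.
y = (qw.astype(np.int32) @ qa.astype(np.int32)) * (sw * sa)
print(np.max(np.abs(y - w @ a)))               # small quantization error
```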
André van Schaik is a professor of electrical engineering at the Western Sydney University, and director of the International Centre for Neuromorphic Systems, in Penrith, New South Wales, Australia. He was named a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) in 2014 "for contributions to neuromorphic circuits and systems".
Electrochemical Random-Access Memory (ECRAM) is a type of non-volatile memory (NVM) with multiple levels per cell (MLC) designed for deep learning analog acceleration. An ECRAM cell is a three-terminal device composed of a conductive channel, an insulating electrolyte, an ionic reservoir, and metal contacts. The resistance of the channel is modulated by ionic exchange at the interface between the channel and the electrolyte upon application of an electric field. The charge-transfer process allows both for state retention in the absence of applied power, and for programming of multiple distinct levels, both differentiating ECRAM operation from that of a field-effect transistor (FET). The write operation is deterministic and can result in symmetrical potentiation and depression, making ECRAM arrays attractive for acting as artificial synaptic weights in physical implementations of artificial neural networks (ANN). The technological challenges include open circuit potential (OCP) and semiconductor foundry compatibility associated with energy materials. Universities, government laboratories, and corporate research teams have contributed to the development of ECRAM for analog computing. Notably, Sandia National Laboratories designed a lithium-based cell inspired by solid-state battery materials, Stanford University built an organic proton-based cell, and International Business Machines (IBM) demonstrated in-memory selector-free parallel programming for a logistic regression task in an array of metal-oxide ECRAM designed for insertion in the back end of line (BEOL). In 2022, researchers at the Massachusetts Institute of Technology built an inorganic, CMOS-compatible protonic technology that achieved near-ideal modulation characteristics using fast, nanosecond-scale pulses.
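The deterministic, symmetric programming described above can be caricatured in software as a conductance that moves in equal discrete steps between fixed bounds. The level count and conductance range in this sketch are illustrative assumptions, not device parameters from the text:

```python
# Caricature of an ECRAM-like analog synapse: channel conductance is
# programmed in equal, deterministic steps, symmetric between
# potentiation (+1) and depression (-1), and clipped to a fixed range
# of discrete levels. Level count and range are assumptions.

class ToySynapse:
    def __init__(self, levels=64, g_min=0.0, g_max=1.0):
        self.step = (g_max - g_min) / (levels - 1)   # equal level spacing
        self.g_min, self.g_max = g_min, g_max
        self.g = g_min                               # start fully depressed

    def pulse(self, sign):
        """Apply one potentiation (+1) or depression (-1) pulse."""
        self.g = min(self.g_max, max(self.g_min, self.g + sign * self.step))
        return self.g

syn = ToySynapse()
ups = [syn.pulse(+1) for _ in range(3)]     # three potentiation pulses
downs = [syn.pulse(-1) for _ in range(3)]   # three symmetric depressions
print(ups, downs)                           # retraces the same levels
```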