A physical neural network is a type of artificial neural network in which an electrically adjustable material is used to emulate the function of a neural synapse or a higher-order (dendritic) neuron model. [1] The qualifier "physical" emphasizes the reliance on physical hardware to emulate neurons, as opposed to software-based simulation. More generally, the term applies to other artificial neural networks in which a memristor or other electrically adjustable resistance material is used to emulate a neural synapse. [2] [3]
In the 1960s Bernard Widrow and Ted Hoff developed ADALINE (Adaptive Linear Neuron), which used electrochemical cells called memistors (memory resistors) to emulate the synapses of an artificial neuron. [4] The memistors were implemented as three-terminal devices based on the reversible electroplating of copper, such that the resistance between two of the terminals is controlled by the integral of the current applied via the third terminal. The ADALINE circuitry was briefly commercialized by the Memistor Corporation in the 1960s, enabling some applications in pattern recognition. However, since the memistors were not fabricated with integrated-circuit techniques, the technology was not scalable and was eventually abandoned as solid-state electronics matured. [5]
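The least-mean-squares (Widrow-Hoff) rule that ADALINE's memistor weights implemented in hardware is simple to state in software. Below is a minimal Python sketch; the learning rate, initialization, and toy data are illustrative assumptions, not details of the original machine.

```python
import numpy as np

def train_adaline(X, y, lr=0.1, epochs=100):
    """Widrow-Hoff (LMS) rule: each memistor conductance played the role
    of one weight w[i], nudged in proportion to the output error."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x_i, target in zip(X, y):
            error = target - (w @ x_i + b)   # error of the linear combiner,
            w += lr * error * x_i            # taken before thresholding
            b += lr * error
    return w, b

# Toy usage: learn the AND function with bipolar (+/-1) targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1.0, -1.0, -1.0, 1.0])
w, b = train_adaline(X, y)
print(np.sign(X @ w + b))   # converges to [-1. -1. -1.  1.]
```

Unlike the perceptron, the error is computed on the linear output rather than the thresholded one, which is what makes the rule a gradient descent on squared error.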
In 1989 Carver Mead published his book Analog VLSI and Neural Systems, [6] which spun off perhaps the most common variant of analog neural networks. The physical realization is implemented in analog VLSI, often as field-effect transistors operated in weak inversion (subthreshold). Such devices can be modelled as translinear circuits, a technique described by Barrie Gilbert in several papers around the mid-1970s, and in particular in his Translinear Circuits of 1981. [7] [8] With this method, circuits can be analyzed as a set of well-defined functions in steady state, and such circuits can be assembled into complex networks.
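The translinear principle behind this analysis can be stated compactly: in a closed loop of base-emitter (or weak-inversion gate-source) junctions containing equal numbers of junctions oriented clockwise and counterclockwise, the products of the currents balance. A minimal statement, assuming matched devices of equal area:

```latex
\prod_{n \in \text{CW}} I_n \;=\; \prod_{n \in \text{CCW}} I_n
```

Because currents multiply and divide around the loop, products, quotients, and power laws can be computed directly in the analog domain, which is what makes the approach attractive for dense synaptic arithmetic.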
Alex Nugent describes a physical neural network as one or more nonlinear neuron-like nodes used to sum signals, together with nanoconnections formed from nanoparticles, nanowires, or nanotubes that determine the signal strength at the nodes' inputs. [9] Alignment or self-assembly of the nanoconnections is determined by the history of the applied electric field, performing a function analogous to that of neural synapses. Numerous applications [10] for such physical neural networks are possible. For example, a temporal summation device [11] can be composed of one or more nanoconnections with an input and an output, where an input signal causes one or more of the nanoconnections to increase in connection strength over time. Another example of a physical neural network is taught by U.S. Patent No. 7,039,619, [12] entitled "Utilized nanotechnology apparatus using a neural network, a solution and a connection gap," which was issued to Alex Nugent by the U.S. Patent & Trademark Office on May 2, 2006. [13]
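The temporal summation behaviour described above can be illustrated with a toy model (not the patented device itself): each input pulse strengthens a connection that otherwise decays, so only temporally clustered inputs accumulate past a firing threshold. All constants below are illustrative assumptions.

```python
def temporal_summation(pulses, gain=0.3, decay=0.9, threshold=1.0):
    """Toy model of a temporally summing nanoconnection: each input
    pulse (1) increments the connection strength, which decays between
    pulses, so only closely spaced pulses accumulate enough to fire."""
    strength, fired = 0.0, []
    for p in pulses:
        strength = decay * strength + gain * p
        fired.append(strength >= threshold)
    return fired

# Sparse pulses never reach threshold; a rapid burst does.
print(temporal_summation([1, 0, 0, 0, 1, 0, 0, 0]))   # all False
print(temporal_summation([1, 1, 1, 1, 1, 0, 0, 0]))   # True once the burst accumulates
```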
A further application of physical neural networks is shown in U.S. Patent No. 7,412,428, entitled "Application of hebbian and anti-hebbian learning to nanotechnology-based physical neural networks," which issued on August 12, 2008. [14]
Nugent and Molter have shown that universal computing and general-purpose machine learning are possible from operations available through simple memristive circuits operating according to the AHaH (anti-Hebbian and Hebbian) plasticity rule. [15] More recently, it has been argued that complex networks of purely memristive circuits can also serve as neural networks. [16] [17]
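As a rough software analogue (generic Hebbian and anti-Hebbian plasticity, not the specific AHaH circuit rule), the two updates can be written as sign-flipped versions of the same correlational step. The learning rate and decay term below are illustrative assumptions.

```python
import numpy as np

def hebbian_step(w, x, lr=0.01, anti=False, decay=1e-3):
    """Correlational plasticity: Hebbian updates strengthen weights whose
    inputs correlate with the node's output; anti-Hebbian flips the sign.
    A small decay keeps the weights bounded (a crude stand-in for the
    dissipative physics of a real memristive synapse)."""
    y = np.tanh(w @ x)                     # node activation
    sign = -1.0 if anti else 1.0
    return w + sign * lr * y * x - decay * w

# Usage: repeated Hebbian steps align w with the input direction (up to sign).
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=3)
x = np.array([1.0, 0.5, -0.5])
for _ in range(200):
    w = hebbian_step(w, x)
print(w)
```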
In 2002, Stanford Ovshinsky described an analog neural computing medium in which phase-change material has the ability to cumulatively respond to multiple input signals. [18] An electrical alteration of the resistance of the phase-change material is used to control the weighting of the input signals.
Greg Snider of HP Labs describes a system of cortical computing with memristive nanodevices. [19] The memristors (memory resistors) are implemented by thin-film materials in which the resistance is electrically tuned via the transport of ions or oxygen vacancies within the film. DARPA's SyNAPSE project has funded IBM Research and HP Labs, in collaboration with the Boston University Department of Cognitive and Neural Systems (CNS), to develop neuromorphic architectures which may be based on memristive systems. [20]
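The computational appeal of such memristive arrays is that a crossbar performs a vector-matrix multiplication directly in the analog domain: by Ohm's law each device contributes a current V_i * G_ij, and Kirchhoff's current law sums the column currents. A minimal numerical sketch with idealized devices (no wire resistance or sneak paths; conductance values are illustrative):

```python
import numpy as np

# Idealized memristor crossbar: G[i, j] is the programmed conductance
# (the stored weight) between input row i and output column j.
G = np.array([[1.0e-6, 5.0e-6],
              [2.0e-6, 1.0e-6],
              [4.0e-6, 3.0e-6]])    # siemens (illustrative values)

V = np.array([0.2, 0.1, 0.3])       # row voltages encode the input vector

# Ohm's law per device + Kirchhoff's current law per column:
# I_j = sum_i V[i] * G[i, j] -- a full vector-matrix product in one step.
I = V @ G
print(I)  # column currents, proportional to the weighted sums
```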
In 2022, researchers reported the development of nanoscale brain-inspired artificial synapses using protons (H+) for "analog deep learning". [21] [22]
In machine learning, a neural network is a model inspired by the structure and function of biological neural networks in animal brains.
An artificial neuron is a mathematical function conceived as a model of a biological neuron in a neural network. The artificial neuron is the elementary unit of an artificial neural network.
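In its standard mathematical form, an artificial neuron computes a weighted sum of its inputs passed through an activation function:

```latex
y \;=\; \varphi\!\left(\sum_{i=1}^{n} w_i x_i + b\right)
```

Here the weights w_i are the quantities that the physical devices discussed above implement as adjustable conductances, b is a bias, and φ is the nonlinearity.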
Neuromorphic computing is an approach to computing that is inspired by the structure and function of the human brain. A neuromorphic computer/chip is any device that uses physical artificial neurons to do computations. In recent times, the term neuromorphic has been used to describe analog, digital, mixed-mode analog/digital VLSI, and software systems that implement models of neural systems. Recent work has even demonstrated ways to mimic the human nervous system using liquid solutions of chemical systems.
Recurrent neural networks (RNNs) are a class of artificial neural networks commonly used for sequential data processing. Unlike feedforward neural networks, which process data in a single pass, RNNs process data across multiple time steps, making them well suited to modelling and processing text, speech, and time series.
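A minimal sketch of the recurrence, using a vanilla RNN cell (the dimensions and tanh nonlinearity are the textbook choices, not tied to any particular source here):

```python
import numpy as np

def rnn_step(h, x, W_h, W_x, b):
    """One time step of a vanilla RNN: the hidden state h carries
    information across steps, which is what lets the network model
    sequences rather than single, independent inputs."""
    return np.tanh(W_h @ h + W_x @ x + b)

# Usage: run a short sequence through the cell.
rng = np.random.default_rng(0)
hidden, inputs = 4, 3
W_h = rng.normal(scale=0.5, size=(hidden, hidden))
W_x = rng.normal(scale=0.5, size=(hidden, inputs))
b = np.zeros(hidden)
h = np.zeros(hidden)
for x in rng.normal(size=(5, inputs)):   # a sequence of 5 input vectors
    h = rnn_step(h, x, W_h, W_x, b)
print(h)  # final state summarizes the whole sequence
```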
A neural circuit is a population of neurons interconnected by synapses to carry out a specific function when activated. Multiple neural circuits interconnect with one another to form large-scale brain networks.
Unconventional computing is computing by any of a wide range of new or unusual methods.
The floating-gate MOSFET (FGMOS), also known as a floating-gate MOS transistor or floating-gate transistor, is a type of metal–oxide–semiconductor field-effect transistor (MOSFET) where the gate is electrically isolated, creating a floating node in direct current, and a number of secondary gates or inputs are deposited above the floating gate (FG) and are electrically isolated from it. These inputs are only capacitively connected to the FG. Since the FG is surrounded by highly resistive material, the charge contained in it remains unchanged for long periods of time, typically longer than 10 years in modern devices. Usually Fowler-Nordheim tunneling or hot-carrier injection mechanisms are used to modify the amount of charge stored in the FG.
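Because the control inputs couple to the floating gate only capacitively, the floating-gate potential is, to first order, a capacitive divider over the inputs plus the stored charge, which is why the trapped charge acts as a nonvolatile analog weight:

```latex
V_{FG} \;=\; \frac{\sum_{i} C_i V_i + Q_{FG}}{C_{T}},
\qquad C_{T} = \sum_{i} C_i + C_{\text{parasitic}}
```

Here C_i are the input coupling capacitances, V_i the input voltages, Q_FG the charge stored on the floating gate, and C_T the total capacitance seen at the floating node.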
ADALINE is an early single-layer artificial neural network and the name of the physical device that implemented it. It was developed by Professor Bernard Widrow and his doctoral student Marcian Hoff at Stanford University in 1960. It is based on the perceptron and consists of weights, a bias, and a summation function. The weights and biases were implemented by rheostats, and later, memistors.
Spiking neural networks (SNNs) are artificial neural networks (ANN) that more closely mimic natural neural networks. These models leverage the timing of discrete spikes as the main information carrier.
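A common minimal spiking-neuron model is the leaky integrate-and-fire unit, sketched below with illustrative constants; real SNN models vary widely.

```python
def lif_neuron(input_current, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire: the membrane potential v leaks toward
    rest, integrates input current, and emits a discrete spike (then
    resets) when it crosses threshold -- the spike timing carries the
    information."""
    v, spikes = 0.0, []
    for I in input_current:
        v += dt * (-v / tau + I)
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes

# Usage: a constant drive produces a regular spike train.
print(lif_neuron([0.15] * 30))
```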
Michelle Anne Mahowald was an American computational neuroscientist in the emerging field of neuromorphic engineering. In 1996 she was inducted into the Women in Technology International Hall of Fame for her development of the Silicon Eye and other computational systems. She died by suicide at age 33.
A memristor is a non-linear two-terminal electrical component relating electric charge and magnetic flux linkage. It was described and named in 1971 by Leon Chua, completing a theoretical quartet of fundamental electrical components which also comprises the resistor, capacitor and inductor.
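Chua's definition makes the "memory" explicit: the device is characterized by a relation between charge q and flux linkage φ, so its effective resistance depends on the history of the current through it:

```latex
\mathrm{d}\varphi = M(q)\,\mathrm{d}q
\quad\Longrightarrow\quad
v(t) = M\!\big(q(t)\big)\, i(t),
\qquad q(t) = \int_{-\infty}^{t} i(\tau)\,\mathrm{d}\tau
```

The memristance M(q) plays the role of a resistance whose value is set by the accumulated charge, which is precisely the behaviour exploited for synaptic weights.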
NOMFET is a nanoparticle organic memory field-effect transistor. The transistor is designed to mimic the feature of the human synapse known as plasticity, or the variation of the speed and strength of the signal going from neuron to neuron. The device uses gold nanoparticles of about 5–20 nm set with pentacene to emulate the change in voltages and speed within the signal. It uses charge trapping/detrapping in an array of gold nanoparticles (NPs) at the SiO2/pentacene interface to realize a SYNAPSTOR (synapse transistor) mimicking the dynamic plasticity of a biological synapse. This memristor-like device mimics short-term plasticity (STP) and spike-timing-dependent plasticity (STDP), two "functions" at the basis of learning processes. A compact model was developed, and these organic synapstors were used to demonstrate an associative memory that can be trained to exhibit a Pavlovian response. A recent report showed that these organic synapse transistors (synapstors) operate at 1 volt with typical plasticity response times in the range of 100–200 ms. The device also works in contact with an electrolyte (EGOS: electrolyte-gated organic synapstor) and can be interfaced with biological neurons.
A memistor is a nanoelectric circuitry element used in parallel computing memory technology. Essentially, a resistor with memory able to perform logic operations and store information, it is a three-terminal implementation of the memristor.
Massimiliano Di Ventra is an American-Italian theoretical physicist. Specializing in condensed-matter physics, he is the co-founder of MemComputing, Inc.
A voltage-controlled resistor (VCR) is a three-terminal active device with one input port and two output ports. The input-port voltage controls the value of the resistor between the output ports. VCRs are most often built with field-effect transistors (FETs). Two types of FETs are often used: the JFET and the MOSFET. There are both floating voltage-controlled resistors and grounded voltage-controlled resistors. Floating VCRs can be placed between two passive or active components. Grounded VCRs, the more common and less complicated design, require that one port of the voltage-controlled resistor be grounded.
The memtransistor is an experimental multi-terminal passive electronic component that might be used in the construction of artificial neural networks. It is a combination of memristor and transistor technology. This technology differs from the 1T-1R approach in that the devices are merged into one single entity. Multiple memristors can be embedded with a single transistor, enabling it to more accurately model a neuron with its multiple synaptic connections. A neural network built from memtransistors would provide a solid foundation for hardware-based artificial intelligence.
Electrochemical Random-Access Memory (ECRAM) is a type of non-volatile memory (NVM) with multiple levels per cell (MLC) designed for deep-learning analog acceleration. An ECRAM cell is a three-terminal device composed of a conductive channel, an insulating electrolyte, an ionic reservoir, and metal contacts. The resistance of the channel is modulated by ionic exchange at the interface between the channel and the electrolyte upon application of an electric field. The charge-transfer process allows both for state retention in the absence of applied power and for programming of multiple distinct levels, both of which differentiate ECRAM operation from that of a field-effect transistor (FET). The write operation is deterministic and can result in symmetrical potentiation and depression, making ECRAM arrays attractive for acting as artificial synaptic weights in physical implementations of artificial neural networks (ANN). The technological challenges include open-circuit potential (OCP) and the semiconductor-foundry compatibility of the energy materials involved. Universities, government laboratories, and corporate research teams have contributed to the development of ECRAM for analog computing. Notably, Sandia National Laboratories designed a lithium-based cell inspired by solid-state battery materials, Stanford University built an organic proton-based cell, and International Business Machines (IBM) demonstrated in-memory, selector-free parallel programming for a logistic regression task in an array of metal-oxide ECRAM designed for insertion in the back end of line (BEOL). In 2022, researchers at the Massachusetts Institute of Technology built an inorganic, CMOS-compatible protonic technology that achieved near-ideal modulation characteristics using nanosecond pulses.
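The attraction of deterministic, symmetric programming can be seen in a toy weight-update model: if each write pulse shifts the conductance by a fixed ±ΔG, as idealized ECRAM aims to, then gradient updates map cleanly onto pulse counts. The constants below are illustrative, not device data.

```python
def program_weight(g, num_pulses, dg=1e-7, g_min=0.0, g_max=1e-5):
    """Idealized ECRAM-style programming: each gate pulse shifts the
    channel conductance by a fixed +/- dg (symmetric potentiation and
    depression), clipped to the device's conductance window."""
    g = g + num_pulses * dg           # num_pulses < 0 means depression pulses
    return min(max(g, g_min), g_max)

# Map a desired weight change onto a pulse count, as an analog
# accelerator would when applying a gradient update.
g = 5e-6
delta_w = -3.2e-7                     # desired (signed) conductance change
pulses = round(delta_w / 1e-7)        # -> -3 depression pulses
g = program_weight(g, pulses)
print(g)                              # ~4.7e-06
```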
The Caravelli-Traversa-Di Ventra (CTDV) equation is a closed-form equation describing the evolution of networks of memristors. It was derived by Francesco Caravelli, Fabio L. Traversa and Massimiliano Di Ventra to study the exact evolution of complex circuits made of resistors with memory (memristors).
Themistoklis "Themis" Prodromakis, FRSC, FBCS, FInstP, FIET, CEng is an electronic engineer. In 2022 he joined the School of Engineering at the University of Edinburgh as the Regius Professor of Engineering. Prodromakis is also Director of the Centre for Electronics Frontiers and holds an RAEng Chair in Emerging Technologies in AI hardware technologies. He is the founder and director of ArC Instruments that commercialises high-performance testing instruments for semiconductor technologies. He has been involved in developing emerging metal-oxide resistive random-access memory technologies with applications in embedded electronics. and biomedicine. His tool developments went hand in hand with the optimisation of solid-state device technologies, allowing him to demonstrate world-record performance in analogue memory, where single devices can store over 100 discernible memory states. He showcased the use of memristor technologies in a variety of applications: on-node bio-signal processors that compared to CMOS state-of-art achieve a 200 better compression efficiency and over 2-orders of magnitude power savings per channel; in-silico implementations of unsupervised learning, empowering the handling of big-unlabelled-data efficiently and robustly; and a novel microelectronics design paradigm that fuses the analogue and digital worlds and empowers energy efficient implementations of analogue reconfigurable gates. He was also the first to demonstrate a bio-hybrid network comprising real and artificial neurons that was linked via memristor synapse emulators over the internet.
A neural network is a group of interconnected units called neurons that send signals to one another. Neurons can be either biological cells or mathematical models. While individual neurons are simple, many of them together in a network can perform complex tasks. There are two main types of neural network: biological neural networks and artificial neural networks.