Physical neural network

A physical neural network is a type of artificial neural network in which an electrically adjustable material is used to emulate the function of a neural synapse or of a higher-order (dendritic) neuron model. [1] The word "physical" emphasizes the reliance on physical hardware to emulate neurons, as opposed to software-based approaches. More generally, the term applies to other artificial neural networks in which a memristor or another electrically adjustable resistance material is used to emulate a neural synapse. [2] [3]

Types of physical neural networks

ADALINE

In the 1960s, Bernard Widrow and Ted Hoff developed ADALINE (Adaptive Linear Neuron), which used electrochemical cells called memistors (memory resistors) to emulate the synapses of an artificial neuron. [4] The memistors were implemented as three-terminal devices based on the reversible electroplating of copper, such that the resistance between two of the terminals is controlled by the time integral of the current applied via the third terminal. The ADALINE circuitry was briefly commercialized by the Memistor Corporation in the 1960s, enabling some applications in pattern recognition. However, since the memistors were not fabricated with integrated-circuit techniques, the technology was not scalable, and it was eventually abandoned as solid-state electronics matured. [5]
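
The weight update Widrow and Hoff used with ADALINE is the LMS (least mean squares) rule. The following minimal sketch, in software rather than hardware, treats each weight as a memistor conductance set by accumulated programming current; the conductance limits and learning rate are illustrative assumptions, not values from the original circuit.

```python
import numpy as np

# Illustrative sketch (not Widrow's circuit): an ADALINE whose weights are
# modeled as memistor conductances. A memistor's conductance is set by the
# time integral of the plating current on its control terminal, so each
# LMS update below is treated as a small programming-current pulse.

rng = np.random.default_rng(0)

n_inputs = 4
g = np.zeros(n_inputs + 1)      # conductances (weights); last entry is the bias
g_min, g_max = -1.0, 1.0        # plating limits of the hypothetical cell (assumed)
lr = 0.05                       # learning rate, scales the programming pulses

def adaline_output(x):
    """Weighted sum through the memistor 'conductances' (no activation)."""
    return g[:-1] @ x + g[-1]

# Toy task: learn a fixed linear target from random samples.
w_true = np.array([0.5, -0.3, 0.8, 0.1])
for _ in range(2000):
    x = rng.uniform(-1, 1, n_inputs)
    err = w_true @ x - adaline_output(x)
    # LMS (Widrow-Hoff) rule: each weight integrates a pulse proportional
    # to error * input, mimicking charge deposited by a programming current.
    g[:-1] = np.clip(g[:-1] + lr * err * x, g_min, g_max)
    g[-1] = np.clip(g[-1] + lr * err, g_min, g_max)

print("learned:", np.round(g[:-1], 3), "target:", w_true)
```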

Analog VLSI

In 1989 Carver Mead published his book Analog VLSI and Neural Systems, [6] which spun off perhaps the most common variant of analog neural networks. The physical realization is in analog VLSI, often as field-effect transistors operated in weak inversion. Such devices can be modelled as translinear circuits, a technique described by Barrie Gilbert in several papers from the mid-1970s onward, in particular his Translinear Circuits of 1981. [7] [8] With this method, circuits can be analyzed as a set of well-defined steady-state functions and assembled into complex networks.
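
Gilbert's translinear principle states that, in a closed loop of exponential-law junctions (bipolar transistors, or MOSFETs in weak inversion) with equal numbers facing each direction, the product of the clockwise currents equals the product of the counterclockwise currents. The sketch below checks this numerically for a hypothetical four-junction loop, which computes I_out = I1·I2/I3; the device parameters are illustrative assumptions.

```python
import numpy as np

# Numerical sketch of the translinear principle behind Gilbert's circuits.
# For exponential devices, I = I_s * exp(V / V_T). In a closed loop with
# equal numbers of junctions in each direction, the junction voltages cancel,
# forcing product(clockwise currents) == product(counterclockwise currents).
# A four-junction loop therefore computes I_out = I1 * I2 / I3.

V_T = 0.025   # thermal voltage, about 25 mV at room temperature
I_s = 1e-15   # saturation current; identical devices assumed

def junction_voltage(i):
    return V_T * np.log(i / I_s)

I1, I2, I3 = 2e-6, 3e-6, 1.5e-6
# Loop constraint V1 + V2 = V3 + V_out, solved for the output current:
V_out = junction_voltage(I1) + junction_voltage(I2) - junction_voltage(I3)
I_out = I_s * np.exp(V_out / V_T)

print(f"I_out = {I_out:.3e} A, expected I1*I2/I3 = {I1 * I2 / I3:.3e} A")
```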

Physical neural network

Alex Nugent describes a physical neural network as one or more nonlinear neuron-like nodes used to sum signals, together with nanoconnections formed from nanoparticles, nanowires, or nanotubes that determine the strength of the signals input to the nodes. [9] Alignment or self-assembly of the nanoconnections is determined by the history of the applied electric field, performing a function analogous to that of neural synapses. Numerous applications [10] for such physical neural networks are possible. For example, a temporal summation device [11] can be composed of one or more nanoconnections, each having an input and an output, wherein an input signal causes one or more of the nanoconnections to increase in connection strength over time; a toy model of this behaviour is sketched below. Another example of a physical neural network is taught by U.S. Patent No. 7,039,619, [12] entitled "Utilized nanotechnology apparatus using a neural network, a solution and a connection gap," which was issued to Alex Nugent by the U.S. Patent & Trademark Office on May 2, 2006. [13]
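
The following toy model illustrates the temporal-summation idea in the loose sense described above, not the patented circuit itself: a connection strength that integrates incoming pulses and slowly relaxes, so that closely spaced pulses accumulate more strength than widely spaced ones. All constants are assumptions for illustration.

```python
# Toy model of a temporally summing nanoconnection: strength grows with
# accumulated input activity and slowly decays, so rapid pulse trains
# build more connection strength than the same pulses widely spaced.

def simulate(pulses, steps=100, gain=0.2, decay=0.02):
    """pulses: set of time steps at which a unit input pulse arrives."""
    strength, trace = 0.0, []
    for t in range(steps):
        x = 1.0 if t in pulses else 0.0
        strength += gain * x - decay * strength  # integrate inputs, leak
        trace.append(strength)
    return trace

sparse = simulate(pulses={10, 40, 70})   # widely spaced pulses
dense = simulate(pulses={10, 12, 14})    # rapid burst of the same size
print(f"peak after sparse pulses: {max(sparse):.3f}")
print(f"peak after dense burst:   {max(dense):.3f}  (temporal summation)")
```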

A further application of physical neural networks is described in U.S. Patent No. 7,412,428, entitled "Application of Hebbian and anti-Hebbian learning to nanotechnology-based physical neural networks," which was issued on August 12, 2008. [14]

Nugent and Molter have shown that universal computing and general-purpose machine learning are possible using the operations available from simple memristive circuits implementing the AHaH (Anti-Hebbian and Hebbian) plasticity rule. [15] More recently, it has been argued that complex networks of purely memristive circuits can also serve as neural networks. [16] [17]
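
The paper derives the AHaH rule from differential pairs of memristors; the sketch below is only a schematic caricature of its Hebbian-plus-anti-Hebbian spirit, in which a node's weights settle into an attractor state that bisects two input clusters. The functional form and constants here are our assumptions, not the circuit-derived rule.

```python
import numpy as np

# Caricature of an AHaH node (assumed functional form, for illustration).
# Evaluation reads y = w.x; feedback then reinforces the node's own decision
# (Hebbian term, via sign(y)) while an anti-Hebbian term -y keeps the weights
# from saturating, driving w toward an attractor that separates the data.

rng = np.random.default_rng(1)
n = 2
w = rng.normal(0, 0.1, n)
alpha, beta = 0.02, 0.02

# Two input clusters; the node should settle on a boundary between them.
data = np.vstack([rng.normal(+1.0, 0.3, (200, n)),
                  rng.normal(-1.0, 0.3, (200, n))])

for _ in range(3):                                # a few passes over the data
    for x in data[rng.permutation(len(data))]:
        y = w @ x
        w += alpha * x * np.sign(y) - beta * x * y  # Hebbian + anti-Hebbian

print("attractor weights:", np.round(w, 3))
```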

Phase change neural network

In 2002, Stanford Ovshinsky described an analog neural computing medium in which phase-change material cumulatively responds to multiple input signals. [18] Electrical alteration of the resistance of the phase-change material is used to control the weighting of the input signals.
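
A minimal sketch of this cumulative behaviour, under the illustrative assumption that each programming pulse crystallizes a fixed fraction of the remaining amorphous volume and that conductance interpolates linearly between the amorphous and crystalline extremes:

```python
# Toy model of a cumulative phase-change weight. Each programming pulse
# crystallizes part of the remaining amorphous volume, so conductance
# rises cumulatively toward a ceiling (all values are illustrative).

g_amorphous, g_crystalline = 0.01, 1.0   # conductance extremes, arbitrary units
fraction = 0.0                           # crystallized fraction of the cell

def apply_pulse(frac, rate=0.15):
    return frac + rate * (1.0 - frac)    # partial-crystallization step

for pulse in range(1, 6):
    fraction = apply_pulse(fraction)
    g = g_amorphous + fraction * (g_crystalline - g_amorphous)
    print(f"pulse {pulse}: conductance = {g:.3f}")

# The stored conductance then scales an input signal: weighted = g * v_in.
```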

Memristive neural network

Greg Snider of HP Labs describes a system of cortical computing with memristive nanodevices. [19] The memristors (memory resistors) are implemented by thin film materials in which the resistance is electrically tuned via the transport of ions or oxygen vacancies within the film. DARPA's SyNAPSE project has funded IBM Research and HP Labs, in collaboration with the Boston University Department of Cognitive and Neural Systems (CNS), to develop neuromorphic architectures which may be based on memristive systems. [20]
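
A common mathematical stand-in for such thin-film devices is the linear ion-drift model published by the HP Labs group (Strukov et al., 2008), in which the boundary of a doped region drifts with the time integral of current; it is sketched below with textbook parameter values, and is not necessarily the model Snider used.

```python
import numpy as np

# Linear ion-drift memristor model (Strukov et al., HP Labs, 2008).
# The doped-region width w drifts with current, tuning the resistance
# between R_on (fully doped) and R_off (undoped).

R_on, R_off = 100.0, 16000.0   # resistance limits, ohms
D = 10e-9                      # film thickness, m
mu = 1e-14                     # ion mobility, m^2 s^-1 V^-1
dt = 1e-3                      # time step, s

w = 0.1 * D                    # initial doped-region width
for step in range(2000):
    x = w / D
    R = R_on * x + R_off * (1.0 - x)              # resistance at current state
    v = 1.0 if (step // 1000) % 2 == 0 else -1.0  # alternating drive voltage
    i = v / R
    w = np.clip(w + mu * (R_on / D) * i * dt, 0.0, D)  # linear dopant drift

# The alternating drive sweeps the device between its resistance limits.
print(f"final resistance: {R_on * (w / D) + R_off * (1 - w / D):.1f} ohms")
```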

Protonic artificial synapses

In 2022, researchers reported the development of nanoscale brain-inspired artificial synapses that use protons (H+) as the active ion, for 'analog deep learning'. [21] [22]
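
One reason programmable resistors are attractive for analog deep learning is that, arranged in a crossbar array, Ohm's law performs the multiplications and Kirchhoff's current law the summations of a matrix-vector product in a single step. The sketch below shows that reading of the idea; the conductance range and voltages are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of an analog crossbar multiply-accumulate: input voltages
# drive the rows, programmed conductances sit at the crosspoints, and each
# output column collects the current I_j = sum_i G[i, j] * v_in[i].

rng = np.random.default_rng(2)

G = rng.uniform(1e-6, 1e-4, size=(3, 4))  # programmed conductances, siemens
v_in = np.array([0.2, -0.1, 0.3])         # input voltages on the rows

i_out = v_in @ G                          # Ohm + Kirchhoff in one step
print("column currents (the analog MAC result):", i_out)
```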

See also

Neural network (machine learning)
Artificial neuron
Neuromorphic computing
Recurrent neural network
Neural circuit
Unconventional computing
Floating-gate MOSFET
ADALINE
Spiking neural network
Misha Mahowald
Memristor
NOMFET
Memistor
Massimiliano Di Ventra
Voltage-controlled resistor
Memtransistor
Electrochemical Random-Access Memory (ECRAM)
Caravelli-Traversa-Di Ventra equation
Themistoklis Prodromakis
Neural network

References

  1. Lawrence, Celestine P. (2022), "Compact Modeling of Nanocluster Functionality as a Higher-Order Neuron", IEEE Transactions on Electron Devices, 69 (9): 5373–5376, Bibcode:2022ITED...69.5373L, doi:10.1109/TED.2022.3191956, S2CID 251340897
  2. "Cornell & NTT's Physical Neural Networks: A "Radical Alternative for Implementing Deep Neural Networks" That Enables Arbitrary Physical Systems Training | Synced". 27 May 2021.
  3. "Nano-spaghetti to solve neural network power consumption".
  4. Widrow, B.; Pierce, W. H.; Angell, J.B. (1961), "Birth, Life, and Death in Microelectronic Systems" (PDF), Technical Report No. 1552-2/1851-1
  5. Anderson, James; Rosenfeld, Edward (1998), Talking Nets: An Oral History of Neural Networks, MIT Press, ISBN 978-0-262-01167-9
  6. Mead, Carver (1989). Analog VLSI and Neural Systems. Reading, Mass.: Addison-Wesley. ISBN 0-201-05992-4. OCLC 17954003.
  7. Gilbert, Barrie (1981), Translinear Circuits (Handout, pp. 81)
  8. Gilbert, Barrie (1999-12-27), "Translinear Circuits", Wiley Encyclopedia of Electrical and Electronics Engineering, John Wiley & Sons, Inc., doi:10.1002/047134608x.w2302, ISBN 0-471-34608-X
  9. U.S. patent 6,889,216
  10. U.S. Known Patents
  11. U.S. Patent No. 7,028,017
  12. "Utilized nanotechnology apparatus using a neutral network, a solution and a connection gap".
  13. "United States Patent: 8918353 - Methods and systems for feature extraction".
  14. "United States Patent: 9104975 - Memristor apparatus".
  15. Nugent, Michael Alexander; Molter, Timothy Wesley (2014). "AHaH Computing–From Metastable Switches to Attractors to Machine Learning". PLOS ONE. 9 (2): e85175. Bibcode:2014PLoSO...985175N. doi:10.1371/journal.pone.0085175. PMC 3919716. PMID 24520315.
  16. Caravelli, F.; Traversa, F. L.; Di Ventra, M. (2017). "The complex dynamics of memristive circuits: analytical results and universal slow relaxation". Physical Review E. 95 (2): 022140. arXiv:1608.08651. Bibcode:2017PhRvE..95b2140C. doi:10.1103/PhysRevE.95.022140. PMID 28297937. S2CID 6758362.
  17. Caravelli, F. (2019). "Asymptotic behavior of memristive circuits". Entropy. 21 (8): 789. arXiv:1712.07046. Bibcode:2019Entrp..21..789C. doi:10.3390/e21080789. PMC 7515318. PMID 33267502.
  18. U.S. patent 6,999,953
  19. Snider, Greg (2008), "Cortical computing with memristive nanodevices", Sci-DAC Review, 10: 58–65, archived from the original on 2016-05-16, retrieved 2009-10-26
  20. Caravelli, Francesco; Carbajal, Juan Pablo (2018), "Memristors for the curious outsiders", Technologies, 6 (4): 118, arXiv:1812.03389, Bibcode:2018arXiv181203389C, doi:10.3390/technologies6040118, S2CID 54464654
  21. "'Artificial synapse' could make neural networks work more like brains". New Scientist. Retrieved 21 August 2022.
  22. Onen, Murat; Emond, Nicolas; Wang, Baoming; Zhang, Difei; Ross, Frances M.; Li, Ju; Yildiz, Bilge; del Alamo, Jesús A. (29 July 2022). "Nanosecond protonic programmable resistors for analog deep learning" (PDF). Science. 377 (6605): 539–543. Bibcode:2022Sci...377..539O. doi:10.1126/science.abp8064. ISSN 0036-8075. PMID 35901152. S2CID 251159631.