Neuromorphic engineering

Neuromorphic computing is an approach to computing that is inspired by the structure and function of the human brain. [1] [2] A neuromorphic computer/chip is any device that uses physical artificial neurons to do computations. [3] [4] More recently, the term neuromorphic has been used to describe analog, digital, mixed-mode analog/digital VLSI, and software systems that implement models of neural systems (for perception, motor control, or multisensory integration). The implementation of neuromorphic computing on the hardware level can be realized by oxide-based memristors, [5] spintronic memories, threshold switches, and transistors, [6] [4] among others. Training software-based neuromorphic systems of spiking neural networks can be achieved using error backpropagation, e.g., using Python-based frameworks such as snnTorch, [7] or using canonical learning rules from the biological learning literature, e.g., using BindsNET. [8]
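
Spiking networks of this kind can be trained with standard deep-learning tooling by replacing the non-differentiable spike with a surrogate gradient. The sketch below is a minimal, illustrative example of how such a network might be defined with snnTorch on top of PyTorch; the layer sizes, number of time steps, decay constant, and the random placeholder data are assumptions for illustration, not values taken from the cited works.

```python
import torch
import torch.nn as nn
import snntorch as snn
from snntorch import surrogate

class SpikingMLP(nn.Module):
    """Two layers of leaky integrate-and-fire (LIF) neurons driven by linear synapses."""

    def __init__(self, num_inputs=784, num_hidden=128, num_outputs=10, beta=0.9):
        super().__init__()
        spike_grad = surrogate.fast_sigmoid()          # surrogate gradient for backprop
        self.fc1 = nn.Linear(num_inputs, num_hidden)
        self.lif1 = snn.Leaky(beta=beta, spike_grad=spike_grad)
        self.fc2 = nn.Linear(num_hidden, num_outputs)
        self.lif2 = snn.Leaky(beta=beta, spike_grad=spike_grad)

    def forward(self, x, num_steps=25):
        mem1, mem2 = self.lif1.init_leaky(), self.lif2.init_leaky()
        out_spikes = []
        for _ in range(num_steps):                     # unroll the network over time
            spk1, mem1 = self.lif1(self.fc1(x), mem1)
            spk2, mem2 = self.lif2(self.fc2(spk1), mem2)
            out_spikes.append(spk2)
        return torch.stack(out_spikes)                 # shape: [time, batch, classes]

net = SpikingMLP()
spikes = net(torch.rand(32, 784))                      # random stand-in for real inputs
targets = torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(spikes.sum(dim=0), targets)   # rate-coded readout
loss.backward()                                        # error backpropagation through time
```

BindsNET, by contrast, builds networks whose weights are updated by local, biologically inspired rules such as spike-timing-dependent plasticity rather than by gradient descent.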

A key aspect of neuromorphic engineering is understanding how the morphology of individual neurons, circuits, applications, and overall architectures creates desirable computations, affects how information is represented, influences robustness to damage, incorporates learning and development, adapts to local change (plasticity), and facilitates evolutionary change.

Neuromorphic engineering is an interdisciplinary subject that takes inspiration from biology, physics, mathematics, computer science, and electronic engineering [4] to design artificial neural systems, such as vision systems, head-eye systems, auditory processors, and autonomous robots, whose physical architecture and design principles are based on those of biological nervous systems. [9] One of the first applications for neuromorphic engineering was proposed by Carver Mead [10] in the late 1980s.

Neurological inspiration

Neuromorphic engineering is distinguished by the inspiration it takes from what is known about the structure and operation of the brain, translating that knowledge into computer systems. Work has mostly focused on replicating the analog nature of biological computation and the role of neurons in cognition.

The biological processes of neurons and their synapses are dauntingly complex, and thus very difficult to simulate artificially. A key feature of biological brains is that all of their processing relies on analog chemical signals, which makes brains hard to replicate in computers because the current generation of computers is entirely digital. However, the characteristics of these parts can be abstracted into mathematical functions that closely capture the essence of the neuron's operations.
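
For example, one widely used abstraction is the leaky integrate-and-fire neuron, which replaces the detailed electrochemistry of the cell membrane with a single differential equation for the membrane potential $V(t)$ driven by an input current $I(t)$ (the symbols here are the standard textbook ones, not specific to any particular platform):

$$\tau_m \frac{dV}{dt} = -\left(V - V_{\text{rest}}\right) + R\,I(t),$$

where $\tau_m$ is the membrane time constant, $R$ the membrane resistance and $V_{\text{rest}}$ the resting potential; when $V$ reaches a threshold $V_{\text{th}}$ the neuron emits a spike and $V$ is reset to $V_{\text{reset}}$.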

The goal of neuromorphic computing is not to perfectly mimic the brain and all of its functions, but instead to extract what is known of its structure and operations for use in a practical computing system. No neuromorphic system will claim nor attempt to reproduce every element of neurons and synapses, but all adhere to the idea that computation is highly distributed throughout a series of small computing elements analogous to a neuron. While this sentiment is standard, researchers pursue the goal with different methods. [11]

Examples

As early as 2006, researchers at Georgia Tech published a field programmable neural array. [12] This chip was the first in a line of increasingly complex arrays of floating gate transistors that allowed programmability of charge on the gates of MOSFETs to model the channel-ion characteristics of neurons in the brain and was one of the first cases of a silicon programmable array of neurons.

In November 2011, a group of MIT researchers created a computer chip that mimics the analog, ion-based communication in a synapse between two neurons using 400 transistors and standard CMOS manufacturing techniques. [13] [14]

In June 2012, spintronic researchers at Purdue University presented a paper on the design of a neuromorphic chip using lateral spin valves and memristors. They argue that the architecture works similarly to neurons and can therefore be used to test methods of reproducing the brain's processing. In addition, these chips are significantly more energy-efficient than conventional ones. [15]

Research at HP Labs on Mott memristors has shown that while they can be non-volatile, the volatile behavior exhibited at temperatures significantly below the phase transition temperature can be exploited to fabricate a neuristor, [16] a biologically-inspired device that mimics behavior found in neurons. [16] In September 2013, they presented models and simulations that show how the spiking behavior of these neuristors can be used to form the components required for a Turing machine. [17]

Neurogrid, built by Brains in Silicon at Stanford University, [18] is an example of hardware designed using neuromorphic engineering principles. The circuit board is composed of 16 custom-designed chips, referred to as NeuroCores. Each NeuroCore's analog circuitry is designed to emulate neural elements for 65536 neurons, maximizing energy efficiency. The emulated neurons are connected using digital circuitry designed to maximize spiking throughput. [19] [20]

A research project with implications for neuromorphic engineering is the Human Brain Project, which is attempting to simulate a complete human brain in a supercomputer using biological data. It is made up of a group of researchers in neuroscience, medicine, and computing. [21] Henry Markram, the project's co-director, has stated that the project proposes to establish a foundation to explore and understand the brain and its diseases, and to use that knowledge to build new computing technologies. The three primary goals of the project are to better understand how the pieces of the brain fit and work together, to understand how to objectively diagnose and treat brain diseases, and to use the understanding of the human brain to develop neuromorphic computers. Because simulating a complete human brain will require a powerful supercomputer, the project encourages the current focus on neuromorphic computers. [22] The European Commission has allocated $1.3 billion to the project. [23]

Other research with implications for neuromorphic engineering involves the BRAIN Initiative [24] and the TrueNorth chip from IBM. [25] Neuromorphic devices have also been demonstrated using nanocrystals, nanowires, and conducting polymers. [26] There is also development of a memristive device for quantum neuromorphic architectures. [27] In 2022, researchers at MIT reported the development of brain-inspired artificial synapses, using protons (H+), for 'analog deep learning'. [28] [29]

Intel unveiled its neuromorphic research chip, called "Loihi", in October 2017. The chip uses an asynchronous spiking neural network (SNN) to implement adaptive, self-modifying, event-driven, fine-grained parallel computations for learning and inference with high efficiency. [30] [31]

IMEC, a Belgium-based nanoelectronics research center, demonstrated the world's first self-learning neuromorphic chip. The brain-inspired chip, based on OxRAM technology, is capable of self-learning and has been demonstrated composing music. [32] IMEC released the 30-second tune composed by the prototype. The chip was sequentially loaded with songs in the same time signature and style, old Belgian and French flute minuets, from which it learned the rules at play and then applied them. [33]

The Blue Brain Project, led by Henry Markram, aims to build biologically detailed digital reconstructions and simulations of the mouse brain. It has created in silico models of rodent brains while attempting to replicate as much of their biological detail as possible. The supercomputer-based simulations offer new perspectives on understanding the structure and functions of the brain.

The European Union funded a series of projects at the University of Heidelberg, which led to the development of BrainScaleS (brain-inspired multiscale computation in neuromorphic hybrid systems), a hybrid analog neuromorphic supercomputer located at Heidelberg University, Germany. It was developed as part of the Human Brain Project neuromorphic computing platform and is the complement to the SpiNNaker supercomputer, which is based on digital technology. The architecture used in BrainScaleS mimics biological neurons and their connections on a physical level; additionally, since the components are made of silicon, these model neurons operate on average 864 times faster than their biological counterparts: 24 hours of biological time is simulated in 100 seconds of machine time (86,400 s / 100 s = 864). [34]

In 2019, the European Union funded the project "Neuromorphic quantum computing" [35] exploring the use of neuromorphic computing to perform quantum operations. Neuromorphic quantum computing [36] (abbreviated 'n.quantum computing') is an unconventional type of computing that uses neuromorphic computing to perform quantum operations. [37] [38] It has been suggested that quantum algorithms, i.e., algorithms that run on a realistic model of quantum computation, can be computed equally efficiently with neuromorphic quantum computing. [39] [40] [41] [42] [43] Both traditional quantum computing and neuromorphic quantum computing are physics-based unconventional approaches to computation that do not follow the von Neumann architecture: both construct a system (a circuit) that represents the physical problem at hand and then exploit the physical properties of that system to seek the "minimum". Neuromorphic quantum computing and quantum computing share similar physical properties during computation. [43] [44]
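
As a purely classical illustration of this "settle into a minimum" style of computation (and not a model of either neuromorphic or quantum hardware), the sketch below runs a small Hopfield-style network whose asynchronous updates never increase an energy function, so the state relaxes into a local minimum that encodes a stored pattern; the sizes, weights, and patterns are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(2, 16))        # two stored +/-1 patterns
W = (patterns.T @ patterns) / patterns.shape[1]     # Hebbian weight matrix
np.fill_diagonal(W, 0.0)

def energy(s):
    """Hopfield energy; the asynchronous updates below can only lower or keep it."""
    return -0.5 * s @ W @ s

state = rng.choice([-1, 1], size=16)                # random initial state
for _ in range(200):
    i = rng.integers(16)                            # update one "neuron" at a time
    state[i] = 1 if W[i] @ state >= 0 else -1       # flip toward lower energy
print("final energy:", energy(state))
```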

BrainChip announced in October 2021 that it was taking orders for its Akida AI Processor Development Kits [45] and in January 2022 that it was taking orders for its Akida AI Processor PCIe boards, [46] making it the world's first commercially available neuromorphic processor.

Neuromemristive systems

Neuromemristive systems are a subclass of neuromorphic computing systems that focus on the use of memristors to implement neuroplasticity. While neuromorphic engineering focuses on mimicking biological behavior, neuromemristive systems focus on abstraction. [47] For example, a neuromemristive system may replace the details of a cortical microcircuit's behavior with an abstract neural network model. [48]

Several neuron-inspired threshold logic functions [5] have been implemented with memristors and have applications in high-level pattern recognition. Recently reported applications include speech recognition, [49] face recognition [50] and object recognition. [51] They are also used to replace conventional digital logic gates. [52] [53]
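
The basic idea behind memristive threshold logic can be sketched as a weighted sum followed by a comparison: binary input voltages drive memristors whose programmed conductances act as weights, the resulting currents add on a common node, and the gate fires when the total crosses a threshold. The conductances and thresholds below are made-up illustrative numbers, not values from the cited designs.

```python
import numpy as np

def memristive_threshold_gate(inputs, conductances, threshold):
    """Return 1 if the conductance-weighted sum of the inputs reaches the threshold."""
    current = np.dot(inputs, conductances)   # summed current on the output node
    return int(current >= threshold)

# With equal weights, the same structure realizes OR or AND just by moving the threshold.
g = np.array([1.0, 1.0, 1.0])                # three identical memristors (illustrative)
for x in [(0, 0, 0), (1, 0, 0), (1, 1, 1)]:
    v = np.array(x)
    print(x, "OR:", memristive_threshold_gate(v, g, 1.0),
             "AND:", memristive_threshold_gate(v, g, 3.0))
```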

For (quasi)ideal passive memristive circuits, the evolution of the memristive memories can be written in closed form (the Caravelli–Traversa–Di Ventra equation): [54] [55]

$$\frac{d\vec{W}}{dt} = -\alpha\,\vec{W} + \frac{1}{\beta}\left(I - \chi\,\Omega W\right)^{-1}\Omega\,\vec{S}$$

as a function of the properties of the physical memristive network and the external sources. The equation is valid for the Williams–Strukov original toy model, i.e., for ideal memristors, for which $\alpha = 0$. However, the hypothesis of the existence of an ideal memristor is debatable. [56] In the equation above, $\alpha$ is the "forgetting" time-scale constant, typically associated with memory volatility, $\chi$ is the ratio between the off and on values of the limit resistances of the memristors, $\vec{S}$ is the vector of the sources of the circuit, and $\Omega$ is a projector on the fundamental loops of the circuit. The constant $\beta$ has the dimension of a voltage and is associated with the properties of the memristor; its physical origin is the charge mobility in the conductor. The diagonal matrix $W = \operatorname{diag}(\vec{W})$ and the vector $\vec{W}$ contain the internal memory values of the memristors, which lie between 0 and 1; the equation therefore requires extra constraints on the memory values in order to be reliable.
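
As a rough numerical illustration (not a reproduction of the cited results), the dynamics above can be integrated with an explicit Euler scheme. In the sketch below, the projector, source vector, and constants are generated at random purely for demonstration, and the memory values are clipped to [0, 1] to enforce the extra constraints mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6                                        # number of memristors in the toy network
alpha, beta, chi = 0.1, 1.0, 0.8             # illustrative constants, not device values

# Illustrative projector onto a 3-dimensional subspace, standing in for the
# projector onto the fundamental loops of a real circuit graph.
A = rng.standard_normal((n, 3))
Omega = A @ np.linalg.inv(A.T @ A) @ A.T

S = rng.standard_normal(n)                   # external sources
W = np.full(n, 0.5)                          # internal memory values in [0, 1]

dt, steps = 1e-2, 2000
for _ in range(steps):
    rhs = -alpha * W + (1.0 / beta) * np.linalg.solve(
        np.eye(n) - chi * Omega @ np.diag(W), Omega @ S)
    W = np.clip(W + dt * rhs, 0.0, 1.0)      # Euler step plus the [0, 1] constraint
print("memory values after relaxation:", np.round(W, 3))
```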

It has recently been shown that the equation above exhibits tunneling phenomena and can be used to study Lyapunov functions. [57] [55]

Neuromorphic sensors

The concept of neuromorphic systems can be extended to sensors, not just to computation. An example of this applied to detecting light is the retinomorphic sensor or, when employed in an array, the event camera. In an event camera, each pixel individually registers changes in brightness, which makes these cameras comparable to human eyesight in their theoretical power consumption. [58] In 2022, researchers from the Max Planck Institute for Polymer Research reported an organic artificial spiking neuron that exhibits the signal diversity of biological neurons while operating in biological wetware, enabling in-situ neuromorphic sensing and biointerfacing applications. [59] [60]
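
A hedged sketch of the event-generation principle: each pixel keeps its own reference log-intensity and emits a timestamped event with a polarity whenever the change since its last event exceeds a threshold, rather than returning full frames at a fixed rate. The threshold, frame sizes, and synthetic moving spot below are illustrative assumptions only.

```python
import numpy as np

def events_from_frames(frames, threshold=0.2, eps=1e-6):
    """Emit (t, y, x, polarity) events when a pixel's log-intensity change exceeds a threshold."""
    log_ref = np.log(frames[0] + eps)                  # per-pixel reference level
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        log_cur = np.log(frame + eps)
        diff = log_cur - log_ref
        ys, xs = np.where(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            events.append((t, y, x, int(np.sign(diff[y, x]))))
            log_ref[y, x] = log_cur[y, x]              # reset only the pixels that fired
    return events

# Synthetic sequence: a bright spot moving across an otherwise static scene,
# so only a handful of pixels ever generate events.
frames = np.full((5, 8, 8), 0.1)
for t in range(5):
    frames[t, 4, t + 1] = 1.0
print(len(events_from_frames(frames)), "events generated")
```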

Military applications

The Joint Artificial Intelligence Center, a branch of the U.S. military, is a center dedicated to the procurement and implementation of AI software and neuromorphic hardware for combat use. Specific applications include smart headsets/goggles and robots. JAIC intends to rely heavily on neuromorphic technology to connect "every sensor (to) every shooter" within a network of neuromorphic-enabled units.

Ethical considerations

While the interdisciplinary concept of neuromorphic engineering is relatively new, many of the same ethical considerations apply to neuromorphic systems as apply to human-like machines and artificial intelligence in general. However, the fact that neuromorphic systems are designed to mimic a human brain gives rise to unique ethical questions surrounding their usage.

In practice, however, neuromorphic hardware and artificial "neural networks" are immensely simplified models of how the brain operates and processes information, with much lower complexity in size and functional technology and a much more regular connectivity structure. Comparing neuromorphic chips to the brain is a very crude comparison, similar to comparing a plane to a bird just because they both have wings and a tail. Biological neural cognitive systems are many orders of magnitude more energy- and compute-efficient than current state-of-the-art AI, and neuromorphic engineering is an attempt to narrow this gap by taking inspiration from the brain's mechanisms, just as many engineering designs have bio-inspired features.

Social concerns

Significant ethical limitations may be placed on neuromorphic engineering due to public perception. [61] Special Eurobarometer 382: Public Attitudes Towards Robots, a survey conducted by the European Commission, found that 60% of European Union citizens wanted a ban on robots in the care of children, the elderly, or the disabled. Furthermore, 34% were in favor of a ban on robots in education, 27% in healthcare, and 20% in leisure. The European Commission classifies these areas as notably "human". The report cites increased public concern with robots that are able to mimic or replicate human functions. Neuromorphic engineering, by definition, is designed to replicate the function of the human brain. [62]

The social concerns surrounding neuromorphic engineering are likely to become even more profound in the future. The European Commission found that EU citizens between the ages of 15 and 24 are more likely to think of robots as human-like (as opposed to instrument-like) than EU citizens over the age of 55. When presented with an image of a robot defined as human-like, 75% of EU citizens aged 15–24 said it corresponded with the idea they had of robots, while only 57% of EU citizens over the age of 55 responded the same way. The human-like nature of neuromorphic systems could therefore place them in the categories of robots many EU citizens would like to see banned in the future. [62]

Personhood

As neuromorphic systems have become increasingly advanced, some scholars have advocated for granting personhood rights to these systems. Daniel Lim, a critic of technology development in the Human Brain Project, which aims to advance brain-inspired computing, has argued that advancement in neuromorphic computing could lead to machine consciousness or personhood. [63] If these systems are to be treated as people, then many tasks humans perform using neuromorphic systems, including their termination, may be morally impermissible as these acts would violate their autonomy. [63]

Ownership and property rights

There is significant legal debate around property rights and artificial intelligence. In Acohs Pty Ltd v. Ucorp Pty Ltd, Justice Christopher Jessup of the Federal Court of Australia found that the source code for Material Safety Data Sheets could not be copyrighted as it was generated by a software interface rather than a human author. [64] The same question may apply to neuromorphic systems: if a neuromorphic system successfully mimics a human brain and produces a piece of original work, who, if anyone, should be able to claim ownership of the work? [65]

Related Research Articles

<span class="mw-page-title-main">Neural network (machine learning)</span> Computational model used in machine learning, based on connected, hierarchical functions

In machine learning, a neural network is a model inspired by the neuronal organization found in the biological neural networks in animal brains.

Computational neuroscience is a branch of neuroscience which employs mathematics, computer science, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.

An artificial neuron is a mathematical function conceived as a model of biological neurons in a neural network. Artificial neurons are the elementary units of artificial neural networks. The artificial neuron is a function that receives one or more inputs, applies weights to these inputs, and sums them to produce an output.

Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.

<span class="mw-page-title-main">Optical neural network</span>

An optical neural network is a physical implementation of an artificial neural network with optical components. Early optical neural networks used a photorefractive Volume hologram to interconnect arrays of input neurons to arrays of output with synaptic weights in proportion to the multiplexed hologram's strength. Volume holograms were further multiplexed using spectral hole burning to add one dimension of wavelength to space to achieve four dimensional interconnects of two dimensional arrays of neural inputs and outputs. This research led to extensive research on alternative methods using the strength of the optical interconnect for implementing neuronal communications.

A recurrent neural network (RNN) is one of the two broad types of artificial neural network, characterized by the direction of the flow of information between its layers. In contrast to the uni-directional feedforward neural network, it is a bi-directional artificial neural network, meaning that it allows the output from some nodes to affect subsequent input to the same nodes. Their ability to use internal state (memory) to process arbitrary sequences of inputs makes them applicable to tasks such as unsegmented, connected handwriting recognition or speech recognition. The term "recurrent neural network" is used to refer to the class of networks with an infinite impulse response, whereas "finite impulse recurrent network" refers to the class with a finite impulse response. Both classes of networks exhibit temporal dynamic behavior. A finite impulse recurrent network is a directed acyclic graph that can be unrolled and replaced with a strictly feedforward neural network, while an infinite impulse recurrent network is a directed cyclic graph that cannot be unrolled.

<span class="mw-page-title-main">Wetware computer</span> Computer composed of organic material

A wetware computer is an organic computer composed of organic material ("wetware") such as living neurons. Wetware computers composed of neurons are different from conventional computers because they use biological materials and offer the possibility of substantially more energy-efficient computing. While a wetware computer is still largely conceptual, there has been limited success with construction and prototyping, which has acted as a proof of the concept's realistic application to computing in the future. The most notable prototypes have stemmed from the research completed by biological engineer William Ditto during his time at the Georgia Institute of Technology. His work constructing a simple neurocomputer capable of basic addition from leech neurons in 1999 was a significant discovery for the concept. This research was a primary example driving interest in creating these artificially constructed, but still organic, brains.

<span class="mw-page-title-main">Quantum neural network</span> Quantum Mechanics in Neural Networks

Quantum neural networks are computational neural network models which are based on the principles of quantum mechanics. The first ideas on quantum neural computation were published independently in 1995 by Subhash Kak and Ron Chrisley, engaging with the theory of quantum mind, which posits that quantum effects play a role in cognitive function. However, typical research in quantum neural networks involves combining classical artificial neural network models with the advantages of quantum information in order to develop more efficient algorithms. One important motivation for these investigations is the difficulty of training classical neural networks, especially in big-data applications. The hope is that features of quantum computing such as quantum parallelism or the effects of interference and entanglement can be used as resources. Since the technological implementation of a quantum computer is still at an early stage, such quantum neural network models are mostly theoretical proposals that await their full implementation in physical experiments.

Unconventional computing is computing by any of a wide range of new or unusual methods. It is also known as alternative computing.

<span class="mw-page-title-main">Spiking neural network</span> Artificial neural network that mimics neurons

Spiking neural networks (SNNs) are artificial neural networks (ANN) that more closely mimic natural neural networks. In addition to neuronal and synaptic state, SNNs incorporate the concept of time into their operating model. The idea is that neurons in the SNN do not transmit information at each propagation cycle, but rather transmit information only when a membrane potential—an intrinsic quality of the neuron related to its membrane electrical charge—reaches a specific value, called the threshold. When the membrane potential reaches the threshold, the neuron fires, and generates a signal that travels to other neurons which, in turn, increase or decrease their potentials in response to this signal. A neuron model that fires at the moment of threshold crossing is also called a spiking neuron model.

Reservoir computing is a framework for computation derived from recurrent neural network theory that maps input signals into higher dimensional computational spaces through the dynamics of a fixed, non-linear system called a reservoir. After the input signal is fed into the reservoir, which is treated as a "black box," a simple readout mechanism is trained to read the state of the reservoir and map it to the desired output. The first key benefit of this framework is that training is performed only at the readout stage, as the reservoir dynamics are fixed. The second is that the computational power of naturally available systems, both classical and quantum mechanical, can be used to reduce the effective computational cost.

<span class="mw-page-title-main">Memristor</span> Nonlinear two-terminal fundamental circuit element

A memristor is a non-linear two-terminal electrical component relating electric charge and magnetic flux linkage. It was described and named in 1971 by Leon Chua, completing a theoretical quartet of fundamental electrical components which also comprises the resistor, capacitor and inductor.

A physical neural network is a type of artificial neural network in which an electrically adjustable material is used to emulate the function of a neural synapse or a higher-order (dendritic) neuron model. "Physical" neural network is used to emphasize the reliance on physical hardware used to emulate neurons as opposed to software-based approaches. More generally the term is applicable to other artificial neural networks in which a memristor or other electrically adjustable resistance material is used to emulate a neural synapse.

NOMFET is a nanoparticle organic memory field-effect transistor. The transistor is designed to mimic the feature of the human synapse known as plasticity, or the variation of the speed and strength of the signal going from neuron to neuron. The device uses gold nanoparticles of about 5–20 nm set with pentacene to emulate the change in voltages and speed within the signal. This device uses charge trapping/detrapping in an array of gold nanoparticles (NPs) at the SiO2/pentacene interface to design a SYNAPSTOR (synapse transistor) mimicking the dynamic plasticity of a biological synapse. This memristor-like device mimics short-term plasticity (STP) and spike-timing-dependent plasticity (STDP), two "functions" at the basis of learning processes. A compact model was developed, and these organic synapstors were used to demonstrate an associative memory, which can be trained to present a Pavlovian response. A recent report showed that these organic synapse-transistors (synapstors) work at 1 volt and with a typical plasticity response time in the range of 100–200 ms. The device also works in contact with an electrolyte (EGOS: electrolyte-gated organic synapstor) and can be interfaced with biological neurons.

<span class="mw-page-title-main">Massimiliano Versace</span>

Massimiliano Versace is the co-founder and CEO of Neurala Inc, a Boston-based company building artificial intelligence that emulates brain function in software, used in automating the process of visual inspection in manufacturing. He is also the founding director of the Boston University Neuromorphics Lab. Versace is a Fulbright scholar and holds two PhDs: one in Experimental Psychology from the University of Trieste, Italy, and one in Cognitive and Neural Systems from Boston University, USA. He obtained his BSc from the University of Trieste, Italy.

Kwabena Adu Boahen is a Ghanaian-born Professor of Bioengineering and Electrical Engineering at Stanford University. He previously taught at the University of Pennsylvania.

<span class="mw-page-title-main">SpiNNaker</span>

SpiNNaker is a massively parallel, manycore supercomputer architecture designed by the Advanced Processor Technologies Research Group (APT) at the Department of Computer Science, University of Manchester. It is composed of 57,600 processing nodes, each with 18 ARM9 processors and 128 MB of mobile DDR SDRAM, totalling 1,036,800 cores and over 7 TB of RAM. The computing platform is based on spiking neural networks, useful in simulating the human brain.

A cognitive computer is a computer that hardwires artificial intelligence and machine learning algorithms into an integrated circuit that closely reproduces the behavior of the human brain. It generally adopts a neuromorphic engineering approach. Synonyms include neuromorphic chip and cognitive chip.

<span class="mw-page-title-main">Quantum machine learning</span> Interdisciplinary research area at the intersection of quantum physics and machine learning

Quantum machine learning is the integration of quantum algorithms within machine learning programs.

The Caravelli–Traversa–Di Ventra (CTDV) equation is a closed-form equation for the evolution of networks of memristors. It was derived by F. Caravelli, F. Traversa and M. Di Ventra to study the exact evolution of complex circuits made of resistors with memory (memristors).

References

  1. Ham, Donhee; Park, Hongkun; Hwang, Sungwoo; Kim, Kinam (2021). "Neuromorphic electronics based on copying and pasting the brain". Nature Electronics. 4 (9): 635–644. doi:10.1038/s41928-021-00646-1. ISSN   2520-1131. S2CID   240580331.
  2. van de Burgt, Yoeri; Lubberman, Ewout; Fuller, Elliot J.; Keene, Scott T.; Faria, Grégorio C.; Agarwal, Sapan; Marinella, Matthew J.; Alec Talin, A.; Salleo, Alberto (April 2017). "A non-volatile organic electrochemical device as a low-voltage artificial synapse for neuromorphic computing". Nature Materials. 16 (4): 414–418. Bibcode:2017NatMa..16..414V. doi:10.1038/nmat4856. ISSN   1476-4660. PMID   28218920.
  3. Mead, Carver (1990). "Neuromorphic electronic systems" (PDF). Proceedings of the IEEE. 78 (10): 1629–1636. doi:10.1109/5.58356. S2CID   1169506.
  4. Rami A. Alzahrani; Alice C. Parker (July 2020). Neuromorphic Circuits With Neural Modulation Enhancing the Information Content of Neural Signaling. International Conference on Neuromorphic Systems 2020. pp. 1–8. doi: 10.1145/3407197.3407204 . S2CID   220794387.
  5. Maan, A. K.; Jayadevi, D. A.; James, A. P. (January 1, 2016). "A Survey of Memristive Threshold Logic Circuits". IEEE Transactions on Neural Networks and Learning Systems. PP (99): 1734–1746. arXiv: 1604.07121 . Bibcode:2016arXiv160407121M. doi:10.1109/TNNLS.2016.2547842. ISSN   2162-237X. PMID   27164608. S2CID   1798273.
  6. Zhou, You; Ramanathan, S. (August 1, 2015). "Mott Memory and Neuromorphic Devices". Proceedings of the IEEE. 103 (8): 1289–1310. doi:10.1109/JPROC.2015.2431914. ISSN   0018-9219. S2CID   11347598.
  7. Eshraghian, Jason K.; Ward, Max; Neftci, Emre; Wang, Xinxin; Lenz, Gregor; Dwivedi, Girish; Bennamoun, Mohammed; Jeong, Doo Seok; Lu, Wei D. (October 1, 2021). "Training Spiking Neural Networks Using Lessons from Deep Learning". arXiv: 2109.12894 .
  8. "Hananel-Hazan/bindsnet: Simulation of spiking neural networks (SNNs) using PyTorch". GitHub . March 31, 2020.
  9. Boddhu, S. K.; Gallagher, J. C. (2012). "Qualitative Functional Decomposition Analysis of Evolved Neuromorphic Flight Controllers". Applied Computational Intelligence and Soft Computing. 2012: 1–21. doi: 10.1155/2012/705483 .
  10. Mead, Carver A.; Mahowald, M. A. (January 1, 1988). "A silicon model of early visual processing". Neural Networks. 1 (1): 91–97. doi:10.1016/0893-6080(88)90024-X. ISSN   0893-6080.
  11. Furber, Steve (2016). "Large-scale neuromorphic computing systems". Journal of Neural Engineering. 13 (5): 1–15. Bibcode:2016JNEng..13e1001F. doi: 10.1088/1741-2560/13/5/051001 . PMID   27529195.
  12. Farquhar, Ethan; Hasler, Paul. (May 2006). "A Field Programmable Neural Array". 2006 IEEE International Symposium on Circuits and Systems. pp. 4114–4117. doi:10.1109/ISCAS.2006.1693534. ISBN   978-0-7803-9389-9. S2CID   206966013.
  13. "MIT creates "brain chip"" . Retrieved December 4, 2012.
  14. Poon, Chi-Sang; Zhou, Kuan (2011). "Neuromorphic silicon neurons and large-scale neural networks: challenges and opportunities". Frontiers in Neuroscience. 5: 108. doi: 10.3389/fnins.2011.00108 . PMC   3181466 . PMID   21991244.
  15. Sharad, Mrigank; Augustine, Charles; Panagopoulos, Georgios; Roy, Kaushik (2012). "Proposal For Neuromorphic Hardware Using Spin Devices". arXiv: 1206.3227 [cond-mat.dis-nn].
  16. Pickett, M. D.; Medeiros-Ribeiro, G.; Williams, R. S. (2012). "A scalable neuristor built with Mott memristors". Nature Materials. 12 (2): 114–7. Bibcode:2013NatMa..12..114P. doi:10.1038/nmat3510. PMID   23241533. S2CID   16271627.
  17. Matthew D Pickett & R Stanley Williams (September 2013). "Phase transitions enable computational universality in neuristor-based cellular automata". Nanotechnology. 24 (38). IOP Publishing Ltd. 384002. Bibcode:2013Nanot..24L4002P. doi:10.1088/0957-4484/24/38/384002. PMID   23999059. S2CID   9910142.
  18. Boahen, Kwabena (April 24, 2014). "Neurogrid: A Mixed-Analog-Digital Multichip System for Large-Scale Neural Simulations". Proceedings of the IEEE. 102 (5): 699–716. doi:10.1109/JPROC.2014.2313565. S2CID   17176371.
  19. Waldrop, M. Mitchell (2013). "Neuroelectronics: Smart connections". Nature. 503 (7474): 22–4. Bibcode:2013Natur.503...22W. doi: 10.1038/503022a . PMID   24201264.
  20. Benjamin, Ben Varkey; Peiran Gao; McQuinn, Emmett; Choudhary, Swadesh; Chandrasekaran, Anand R.; Bussat, Jean-Marie; Alvarez-Icaza, Rodrigo; Arthur, John V.; Merolla, Paul A.; Boahen, Kwabena (2014). "Neurogrid: A Mixed-Analog-Digital Multichip System for Large-Scale Neural Simulations". Proceedings of the IEEE. 102 (5): 699–716. doi:10.1109/JPROC.2014.2313565. S2CID   17176371.
  21. "Involved Organizations". Archived from the original on March 2, 2013. Retrieved February 22, 2013.
  22. "Human Brain Project" . Retrieved February 22, 2013.
  23. "The Human Brain Project and Recruiting More Cyberwarriors". January 29, 2013. Retrieved February 22, 2013.
  24. Neuromorphic computing: The machine of a new soul, The Economist, 2013-08-03
  25. Modha, Dharmendra (August 2014). "A million spiking-neuron integrated circuit with a scalable communication network and interface". Science. 345 (6197): 668–673. Bibcode:2014Sci...345..668M. doi:10.1126/science.1254642. PMID   25104385. S2CID   12706847.
  26. Fairfield, Jessamyn (March 1, 2017). "Smarter Machines" (PDF).
  27. Spagnolo, Michele; Morris, Joshua; Piacentini, Simone; Antesberger, Michael; Massa, Francesco; Crespi, Andrea; Ceccarelli, Francesco; Osellame, Roberto; Walther, Philip (April 2022). "Experimental photonic quantum memristor". Nature Photonics. 16 (4): 318–323. arXiv: 2105.04867 . Bibcode:2022NaPho..16..318S. doi:10.1038/s41566-022-00973-5. ISSN   1749-4893. S2CID   234358015.
    News article: "Erster "Quanten-Memristor" soll KI und Quantencomputer verbinden". DER STANDARD (in Austrian German). Retrieved April 28, 2022.
    Lay summary report: "Artificial neurons go quantum with photonic circuits". University of Vienna . Retrieved April 19, 2022.
  28. "'Artificial synapse' could make neural networks work more like brains". New Scientist. Retrieved August 21, 2022.
  29. Onen, Murat; Emond, Nicolas; Wang, Baoming; Zhang, Difei; Ross, Frances M.; Li, Ju; Yildiz, Bilge; del Alamo, Jesús A. (July 29, 2022). "Nanosecond protonic programmable resistors for analog deep learning" (PDF). Science. 377 (6605): 539–543. Bibcode:2022Sci...377..539O. doi:10.1126/science.abp8064. ISSN   0036-8075. PMID   35901152. S2CID   251159631.
  30. Davies, Mike; et al. (January 16, 2018). "Loihi: A Neuromorphic Manycore Processor with On-Chip Learning". IEEE Micro. 38 (1): 82–99. doi:10.1109/MM.2018.112130359. S2CID   3608458.
  31. Morris, John. "Why Intel built a neuromorphic chip". ZDNet. Retrieved August 17, 2018.
  32. "Imec demonstrates self-learning neuromorphic chip that composes music". IMEC International. Retrieved October 1, 2019.
  33. Bourzac, Katherine (May 23, 2017). "A Neuromorphic Chip That Makes Music". IEEE Spectrum. Retrieved October 1, 2019.
  34. "Beyond von Neumann, Neuromorphic Computing Steadily Advances". HPCwire. March 21, 2016. Retrieved October 8, 2021.
  35. "Neuromrophic Quantum Computing | Quromorphic Project | Fact Sheet | H2020". CORDIS | European Commission. doi:10.3030/828826 . Retrieved March 18, 2024.
  36. Pehle, Christian; Wetterich, Christof (March 30, 2021), Neuromorphic quantum computing, arXiv: 2005.01533 , retrieved March 18, 2024
  37. Wetterich, C. (November 1, 2019). "Quantum computing with classical bits". Nuclear Physics B. 948: 114776. arXiv: 1806.05960 . doi:10.1016/j.nuclphysb.2019.114776. ISSN   0550-3213.
  38. Pehle, Christian; Meier, Karlheinz; Oberthaler, Markus; Wetterich, Christof (October 24, 2018), Emulating quantum computation with artificial neural networks, arXiv: 1810.10335 , retrieved March 18, 2024
  39. Carleo, Giuseppe; Troyer, Matthias (February 10, 2017). "Solving the quantum many-body problem with artificial neural networks". Science. 355 (6325): 602–606. arXiv: 1606.02318 . doi:10.1126/science.aag2302. ISSN   0036-8075. PMID   28183973.
  40. Torlai, Giacomo; Mazzola, Guglielmo; Carrasquilla, Juan; Troyer, Matthias; Melko, Roger; Carleo, Giuseppe (May 2018). "Neural-network quantum state tomography". Nature Physics. 14 (5): 447–450. arXiv: 1703.05334 . doi:10.1038/s41567-018-0048-5. ISSN   1745-2481.
  41. Sharir, Or; Levine, Yoav; Wies, Noam; Carleo, Giuseppe; Shashua, Amnon (January 16, 2020). "Deep Autoregressive Models for the Efficient Variational Simulation of Many-Body Quantum Systems". Physical Review Letters. 124 (2): 020503. arXiv: 1902.04057 . doi:10.1103/PhysRevLett.124.020503. PMID   32004039.
  42. Broughton, Michael; Verdon, Guillaume; McCourt, Trevor; Martinez, Antonio J.; Yoo, Jae Hyeon; Isakov, Sergei V.; Massey, Philip; Halavati, Ramin; Niu, Murphy Yuezhen (August 26, 2021), TensorFlow Quantum: A Software Framework for Quantum Machine Learning, arXiv: 2003.02989 , retrieved March 18, 2024
  43. Di Ventra, Massimiliano (March 23, 2022), MemComputing vs. Quantum Computing: some analogies and major differences, arXiv: 2203.12031 , retrieved March 18, 2024
  44. Wilkinson, Samuel A.; Hartmann, Michael J. (June 8, 2020). "Superconducting quantum many-body circuits for quantum simulation and computing". Applied Physics Letters. 116 (23). arXiv: 2003.08838 . doi:10.1063/5.0008202. ISSN   0003-6951.
  45. "Taking Orders of Akida AI Processor Development Kits". October 21, 2021.
  46. "First mini PCIexpress board with spiking neural network chip". January 19, 2022.
  47. "002.08 N.I.C.E. Workshop 2014: Towards Intelligent Computing with Neuromemristive Circuits and Systems – Feb. 2014". digitalops.sandia.gov. Retrieved August 26, 2019.
  48. C. Merkel and D. Kudithipudi, "Neuromemristive extreme learning machines for pattern classification," ISVLSI, 2014.
  49. Maan, A.K.; James, A.P.; Dimitrijev, S. (2015). "Memristor pattern recogniser: isolated speech word recognition". Electronics Letters. 51 (17): 1370–1372. Bibcode:2015ElL....51.1370M. doi:10.1049/el.2015.1428. hdl: 10072/140989 . S2CID   61454815.
  50. Maan, Akshay Kumar; Kumar, Dinesh S.; James, Alex Pappachen (January 1, 2014). "Memristive Threshold Logic Face Recognition". Procedia Computer Science. 5th Annual International Conference on Biologically Inspired Cognitive Architectures, 2014 BICA. 41: 98–103. doi: 10.1016/j.procs.2014.11.090 . hdl: 10072/68372 .
  51. Maan, A.K.; Kumar, D.S.; Sugathan, S.; James, A.P. (October 1, 2015). "Memristive Threshold Logic Circuit Design of Fast Moving Object Detection". IEEE Transactions on Very Large Scale Integration (VLSI) Systems. 23 (10): 2337–2341. arXiv: 1410.1267 . doi:10.1109/TVLSI.2014.2359801. ISSN   1063-8210. S2CID   9647290.
  52. James, A.P.; Francis, L.R.V.J.; Kumar, D.S. (January 1, 2014). "Resistive Threshold Logic". IEEE Transactions on Very Large Scale Integration (VLSI) Systems. 22 (1): 190–195. arXiv: 1308.0090 . doi:10.1109/TVLSI.2012.2232946. ISSN   1063-8210. S2CID   7357110.
  53. James, A.P.; Kumar, D.S.; Ajayan, A. (November 1, 2015). "Threshold Logic Computing: Memristive-CMOS Circuits for Fast Fourier Transform and Vedic Multiplication". IEEE Transactions on Very Large Scale Integration (VLSI) Systems. 23 (11): 2690–2694. arXiv: 1411.5255 . doi:10.1109/TVLSI.2014.2371857. ISSN   1063-8210. S2CID   6076956.
  54. Caravelli; et al. (2017). "The complex dynamics of memristive circuits: analytical results and universal slow relaxation". Physical Review E. 95 (2): 022140. arXiv: 1608.08651 . Bibcode:2017PhRvE..95b2140C. doi:10.1103/PhysRevE.95.022140. PMID   28297937. S2CID   6758362.
  55. Caravelli; et al. (2021). "Global minimization via classical tunneling assisted by collective force field formation". Science Advances. 7 (52): 022140. arXiv: 1608.08651 . Bibcode:2021SciA....7.1542C. doi:10.1126/sciadv.abh1542. PMID   28297937. S2CID   231847346.
  56. Abraham, Isaac (July 20, 2018). "The case for rejecting the memristor as a fundamental circuit element". Scientific Reports. 8 (1): 10972. Bibcode:2018NatSR...810972A. doi:10.1038/s41598-018-29394-7. ISSN   2045-2322. PMC   6054652 . PMID   30030498.
  57. Sheldon, Forrest (2018). Collective Phenomena in Memristive Networks: Engineering phase transitions into computation. UC San Diego Electronic Theses and Dissertations.
  58. Skorka, Orit (July 1, 2011). "Toward a digital camera to rival the human eye". Journal of Electronic Imaging. 20 (3): 033009. doi:10.1117/1.3611015. ISSN   1017-9909.
  59. Sarkar, Tanmoy; Lieberth, Katharina; Pavlou, Aristea; Frank, Thomas; Mailaender, Volker; McCulloch, Iain; Blom, Paul W. M.; Torriccelli, Fabrizio; Gkoupidenis, Paschalis (November 7, 2022). "An organic artificial spiking neuron for in situ neuromorphic sensing and biointerfacing". Nature Electronics. 5 (11): 774–783. doi: 10.1038/s41928-022-00859-y . hdl: 10754/686016 . ISSN   2520-1131. S2CID   253413801.
  60. "Artificial neurons emulate biological counterparts to enable synergetic operation". Nature Electronics. 5 (11): 721–722. November 10, 2022. doi:10.1038/s41928-022-00862-3. ISSN   2520-1131. S2CID   253469402.
  61. 2015 Study Panel (September 2016). Artificial Intelligence and Life in 2030 (PDF). One Hundred Year Study on Artificial Intelligence (AI100) (Report). Stanford University. Archived from the original (PDF) on May 30, 2019. Retrieved December 26, 2019.
  62. European Commission (September 2012). "Special Eurobarometer 382: Public Attitudes Towards Robots" (PDF). European Commission.
  63. Lim, Daniel (June 1, 2014). "Brain simulation and personhood: a concern with the Human Brain Project". Ethics and Information Technology. 16 (2): 77–89. doi:10.1007/s10676-013-9330-5. ISSN   1572-8439. S2CID   17415814.
  64. Lavan. "Copyright in source code and digital products". Lavan. Retrieved May 10, 2019.
  65. Eshraghian, Jason K. (March 9, 2020). "Human Ownership of Artificial Creativity". Nature Machine Intelligence. 2 (3): 157–160. doi:10.1038/s42256-020-0161-x. S2CID   215991449.