Michael Anthony Arbib (born May 28, 1940) is an American computational neuroscientist. He is an Adjunct Professor of Psychology at the University of California, San Diego, and professor emeritus at the University of Southern California; before his 2016 retirement he was the Fletcher Jones Professor of Computer Science, as well as a professor of biological sciences,[1] biomedical engineering,[1] electrical engineering,[2] neuroscience and psychology.[1]
Arbib was born in England on May 28, 1940, the oldest of four children. His parents moved to New Zealand when he was about 7, and on to Australia when he was about 9.[3] Arbib was educated in New Zealand and at The Scots College in Sydney, Australia.[citation needed] In 1960 he received a BSc (Hons) from the University of Sydney,[2] with the University Medal in Pure Mathematics.[citation needed]
Arbib received his PhD in Mathematics from the Massachusetts Institute of Technology in 1963.[4] He was advised by Norbert Wiener, the founder of cybernetics, and Henry McKean.[3][4] As a student, he also worked with Warren McCulloch, the co-inventor of the artificial neural network and finite-state machine.[3]
Following his PhD, Arbib moved to Stanford for a postdoc with Rudolf E. Kálmán.[3][5] Arbib spent five years at Stanford before becoming the founding chairman of the Department of Computer and Information Science at the University of Massachusetts Amherst in 1970.[5] He remained in the Department until 1986, when he joined the University of Southern California.[5] He retired and was granted emeritus status in 2016.[6]
Arbib's collected papers from the period 1960 through 1985 are held by the University of Massachusetts Amherst.[7]
In machine learning, a neural network is a model inspired by the structure and function of biological neural networks in animal brains.
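As an illustrative sketch only (not drawn from the article), an artificial neural network of this kind can be reduced to a few matrix operations; the layer sizes, random weights, and sigmoid activation below are arbitrary choices made for the example.

# Minimal feedforward neural network: 2 inputs -> 3 hidden units -> 1 output.
# All weights are random and purely illustrative.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))      # input-to-hidden weights
W2 = rng.normal(size=(1, 3))      # hidden-to-output weights

def forward(x):
    h = sigmoid(W1 @ x)           # hidden-layer activations
    return sigmoid(W2 @ h)        # network output in (0, 1)

print(forward(np.array([0.5, -1.0])))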
Artificial consciousness (AC), also known as machine consciousness (MC), synthetic consciousness or digital consciousness, is the consciousness hypothesized to be possible in artificial intelligence. It is also the corresponding field of study, which draws insights from philosophy of mind, philosophy of artificial intelligence, cognitive science and neuroscience. The same terminology can be used with the term "sentience" instead of "consciousness" when specifically designating phenomenal consciousness.
Computational neuroscience is a branch of neuroscience which employs mathematics, computer science, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.
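To give a concrete flavor of such abstractions (a hedged sketch; the model choice and all parameter values are illustrative assumptions, not taken from the article), a classic textbook simplification of a neuron is the leaky integrate-and-fire model, simulated here with a simple Euler integration loop.

# Leaky integrate-and-fire neuron: tau * dV/dt = -(V - V_rest) + R * I
# Parameter values are nominal illustrations, not measured constants.
V_rest, V_thresh, V_reset = -65.0, -50.0, -65.0   # membrane potentials (mV)
tau, R, I = 10.0, 1.0, 20.0                       # time constant, resistance, input current
dt, V, spike_times = 0.1, V_rest, []              # time step in ms

for step in range(2000):                          # simulate 200 ms
    V += dt * (-(V - V_rest) + R * I) / tau       # Euler update of the membrane equation
    if V >= V_thresh:                             # threshold crossed: emit a spike
        spike_times.append(step * dt)
        V = V_reset                               # reset after the spike
print(len(spike_times), "spikes in 200 ms")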
Theoretical computer science is a subfield of computer science and mathematics that focuses on the abstract and mathematical foundations of computation, such as the theory of computation, formal language theory, the lambda calculus and type theory.
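For example (a hypothetical illustration, not from the article), a central object of the theory of computation and formal language theory is the finite automaton; the one sketched below accepts exactly the binary strings containing an even number of 1s.

# Deterministic finite automaton over {0, 1} accepting strings with an even number of 1s.
transitions = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def accepts(word):
    state = "even"                          # start state
    for symbol in word:
        state = transitions[(state, symbol)]
    return state == "even"                  # "even" is the accepting state

print(accepts("1001"))   # True: two 1s
print(accepts("1011"))   # False: three 1s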
Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.
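As one hedged illustration of biologically inspired problem solving (the fitness function, population size, and mutation scheme below are arbitrary choices for the example), a toy evolutionary algorithm can evolve bit strings toward a target by repeated selection and mutation.

# Toy evolutionary algorithm: evolve 10-bit strings toward all ones.
import random
random.seed(0)

def fitness(bits):
    return sum(bits)                          # fitness = number of 1s

population = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                 # selection: keep the fitter half
    children = []
    for parent in parents:
        child = parent[:]
        child[random.randrange(10)] ^= 1      # mutation: flip one random bit
        children.append(child)
    population = parents + children           # next generation

print(max(fitness(p) for p in population))    # typically reaches 10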
Generative science is an area of research that explores the natural world and its complex behaviours. It explores ways "to generate apparently unanticipated and infinite behaviour based on deterministic and finite rules and parameters reproducing or resembling the behavior of natural and social phenomena". By modelling such interactions, it can suggest that properties exist in the system that had not been noticed in the real world situation. An example field of study is how unintended consequences arise in social processes.
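A standard illustration of complex behaviour generated from deterministic, finite rules (chosen here purely as an example; the rule number and grid size are arbitrary) is an elementary cellular automaton such as Rule 30, whose simple local update rule produces intricate global patterns.

# Elementary cellular automaton, Rule 30: a simple deterministic local rule
# generating complex-looking global behaviour (illustrative example).
rule, width = 30, 31
cells = [0] * width
cells[width // 2] = 1                         # start from a single live cell

for _ in range(15):
    print("".join("#" if c else "." for c in cells))
    cells = [
        (rule >> (4 * cells[(i - 1) % width] + 2 * cells[i] + cells[(i + 1) % width])) & 1
        for i in range(width)
    ]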
The expression computational intelligence (CI) usually refers to the ability of a computer to learn a specific task from data or experimental observation. Even though it is commonly considered a synonym of soft computing, there is still no commonly accepted definition of computational intelligence.
An artificial brain is software and hardware with cognitive abilities similar to those of the animal or human brain.
Computational cognition is the study of the computational basis of learning and inference by mathematical modeling, computer simulation, and behavioral experiments. In psychology, it is an approach which develops computational models based on experimental results. It seeks to understand how humans process information. Early on, computational cognitive scientists sought to revive and create a scientific form of Brentano's psychology.
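One small worked example of this modeling style (a hypothetical illustration with made-up numbers, not drawn from the article) is Bayesian belief updating, which is often used as a computational account of how evidence changes the probability assigned to a hypothesis.

# Bayesian updating: P(H | E) = P(E | H) * P(H) / P(E).
# All probabilities are made-up illustrative values.
prior_h = 0.01            # prior probability of hypothesis H
p_e_given_h = 0.90        # likelihood of evidence E if H is true
p_e_given_not_h = 0.05    # likelihood of E if H is false

p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)   # total probability of E
posterior_h = p_e_given_h * prior_h / p_e
print(round(posterior_h, 3))   # about 0.154: the evidence raises belief in H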
Neuroinformatics is the emergent field that combines informatics and neuroscience. It is concerned with neuroscience data and with information processing by artificial neural networks. Neuroinformatics is applied along three main directions: the development of tools and databases for managing and sharing neuroscience data, the development of tools for analyzing and modeling those data, and the development of computational models of the nervous system.
In philosophy of mind, the computational theory of mind (CTM), also known as computationalism, is a family of views that hold that the human mind is an information processing system and that cognition and consciousness together are a form of computation. Warren McCulloch and Walter Pitts (1943) were the first to suggest that neural activity is computational. They argued that neural computations explain cognition. The theory was proposed in its modern form by Hilary Putnam in 1967, and developed by his PhD student, philosopher, and cognitive scientist Jerry Fodor in the 1960s, 1970s, and 1980s. It was vigorously disputed in analytic philosophy in the 1990s due to work by Putnam himself, John Searle, and others.
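As a hedged illustration of what it means to say that neural activity is computational (the weights and thresholds below are arbitrary choices, not McCulloch and Pitts's own formulation), a McCulloch–Pitts-style threshold unit can compute simple logical functions such as AND and OR.

# McCulloch-Pitts-style threshold unit computing logical functions (illustrative).
def threshold_unit(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0     # fire only if weighted input reaches threshold

def AND(x1, x2):
    return threshold_unit([x1, x2], [1, 1], threshold=2)

def OR(x1, x2):
    return threshold_unit([x1, x2], [1, 1], threshold=1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))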
Andrew G. Barto is an American computer scientist, currently Professor Emeritus of computer science at the University of Massachusetts Amherst. Barto is best known for his foundational contributions to the field of modern computational reinforcement learning.
The following outline is provided as an overview of and topical guide to artificial intelligence.
Pierre Baldi is a Distinguished Professor of computer science at the University of California, Irvine, and the director of its Institute for Genomics and Bioinformatics.
Informatics is the study of computational systems. According to the ACM Europe Council and Informatics Europe, informatics is synonymous with computer science and computing as a profession, in which the central notion is the transformation of information. In some cases, the term "informatics" may also be used with different meanings, e.g. in the context of social computing or in the context of library science.
This glossary of artificial intelligence is a list of definitions of terms and concepts relevant to the study of artificial intelligence, its sub-disciplines, and related fields. Related glossaries include Glossary of computer science, Glossary of robotics, and Glossary of machine vision.
Soft computing is an umbrella term used to describe types of algorithms that produce approximate solutions to high-level problems in computer science that cannot be solved exactly. Traditional hard-computing algorithms, by contrast, rely heavily on concrete data and mathematical models to produce solutions to problems. The term soft computing was coined in the late 20th century, a period during which revolutionary research in three fields greatly shaped it. Fuzzy logic is a computational paradigm that accommodates uncertainty in data by using degrees of truth rather than rigid binary 0s and 1s. Neural networks are computational models influenced by human brain functions. Finally, evolutionary computation is a term describing groups of algorithms that mimic natural processes such as evolution and natural selection.
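To make the fuzzy-logic idea concrete (an illustrative sketch; the variable and its breakpoints are arbitrary assumptions, not from the article), a quantity such as temperature can be given a degree of membership in the set "warm" between 0 and 1, rather than a rigid binary label.

# Fuzzy membership for a "warm" temperature: degrees of truth in [0, 1]
# instead of a binary classification (breakpoints are illustrative).
def warm_membership(temp_c):
    if temp_c <= 15 or temp_c >= 35:
        return 0.0                       # definitely not warm
    if 20 <= temp_c <= 30:
        return 1.0                       # fully warm
    if temp_c < 20:
        return (temp_c - 15) / 5         # ramp up between 15 and 20 degrees
    return (35 - temp_c) / 5             # ramp down between 30 and 35 degrees

for t in (10, 18, 25, 33):
    print(t, "C ->", warm_membership(t))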
Alice Cline Parker is an American electrical engineer. Her early research studied electronic design automation; later in her career, her interests shifted to neuromorphic engineering, biomimetic architecture for computer vision, analog circuits, carbon nanotube field-effect transistors, and nanotechnology. She is Dean's Professor of Electrical and Computer Engineering in the USC Viterbi School of Engineering of the University of Southern California.
A neural network is a group of interconnected units called neurons that send signals to one another. Neurons can be either biological cells or mathematical models. While individual neurons are simple, many of them together in a network can perform complex tasks. There are two main types of neural network: biological neural networks, made up of nerve cells, and artificial neural networks, which are mathematical models used in machine learning.