George V. Cybenko is the Dorothy and Walter Gramm Professor of Engineering at Dartmouth and a fellow of the IEEE and SIAM. [1]
Cybenko obtained his BA in mathematics from the University of Toronto in 1974 and received his PhD from Princeton in 1978, where he worked on the applied mathematics of electrical and computer engineering under the supervision of Bede Liu. [2]
Cybenko served as an advisor for the Defense Science Board and the Air Force Scientific Advisory Board, among several other government panels. He was the founding editor-in-chief of Security & Privacy and also of Computing in Science & Engineering, both IEEE technical magazines. His current research interests are distributed information, control systems, and signal processing, with a focus on applications to security and infrastructure protection. He is known for proving the universal approximation theorem for artificial neural networks with sigmoid activation functions. [3]
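The result admits a concise statement; the following is a standard formulation (the notation here is chosen for illustration): for any continuous sigmoidal function \(\sigma\), finite sums of the form

\[
G(x) \;=\; \sum_{i=1}^{N} \alpha_i \, \sigma\!\left(w_i^{\mathsf{T}} x + b_i\right),
\qquad \alpha_i, b_i \in \mathbb{R}, \; w_i \in \mathbb{R}^{n},
\]

are dense in \(C([0,1]^n)\); that is, every continuous function on the n-dimensional unit cube can be approximated uniformly, to any desired accuracy, by a network with a single hidden layer of sigmoid units.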
Computer science is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. One well known subject classification system for computer science is the ACM Computing Classification System devised by the Association for Computing Machinery.
Theoretical computer science is a subfield of computer science and mathematics that focuses on the abstract and mathematical foundations of computation, such as the theory of computation, formal language theory, the lambda calculus and type theory.
Bio-inspired computing, short for biologically inspired computing, is a field of study which seeks to solve computer science problems using models of biology. It relates to connectionism, social behavior, and emergence. Within computer science, bio-inspired computing relates to artificial intelligence and machine learning. Bio-inspired computing is a major subset of natural computation.
Hsiang-Tsung Kung is a Taiwanese-born American computer scientist. He is the William H. Gates Professor of Computer Science at Harvard University. His early research in parallel computing produced the systolic array in 1979, which has since become a core computational component of hardware accelerators for artificial intelligence, including Google's Tensor Processing Unit (TPU). Similarly, he proposed optimistic concurrency control in 1981, now a key principle in memory and database transaction systems, including MySQL, Apache CouchDB, Google's App Engine, and Ruby on Rails. He remains an active researcher, with ongoing contributions to computational complexity theory, hardware design, parallel computing, routing, wireless communication, signal processing, and artificial intelligence.
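To illustrate the idea behind optimistic concurrency control, the sketch below shows the read, compute, validate, write pattern in Python; the class and method names are illustrative, and the example validates only a single record's version number, whereas real transaction systems such as those listed above validate entire read and write sets.

import threading

class VersionedRecord:
    """Toy record for optimistic concurrency control: writers read a version,
    compute without holding locks, then commit only if the version is unchanged."""

    def __init__(self, value):
        self.value = value
        self.version = 0
        self._lock = threading.Lock()  # held only for the brief validate-and-write step

    def read(self):
        with self._lock:
            return self.value, self.version

    def try_commit(self, new_value, expected_version):
        """Validation phase: succeed only if no other transaction committed in between."""
        with self._lock:
            if self.version != expected_version:
                return False  # conflict detected: caller should retry with fresh data
            self.value = new_value
            self.version += 1
            return True

def increment(record):
    while True:
        value, version = record.read()              # read phase
        new_value = value + 1                       # compute phase (no locks held)
        if record.try_commit(new_value, version):   # validate/write phase
            return

# Illustrative usage: eight concurrent increments, each retried until it commits.
record = VersionedRecord(0)
threads = [threading.Thread(target=increment, args=(record,)) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(record.value)  # 8: every increment eventually commits despite conflicts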
The expression computational intelligence (CI) usually refers to the ability of a computer to learn a specific task from data or experimental observation. Even though it is commonly considered a synonym of soft computing, there is still no commonly accepted definition of computational intelligence.
In artificial intelligence, artificial immune systems (AIS) are a class of computationally intelligent, rule-based machine learning systems inspired by the principles and processes of the vertebrate immune system. The algorithms are typically modeled after the immune system's characteristics of learning and memory for use in problem-solving.
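As a concrete example of this style of problem-solving, the following Python sketch implements a toy negative-selection scheme, one of the classic AIS techniques; the bit-string representation, Hamming-distance matching rule, and threshold values are illustrative assumptions rather than a specific published algorithm.

import random

def hamming(a, b):
    """Number of positions at which two equal-length bit tuples differ."""
    return sum(x != y for x, y in zip(a, b))

def generate_detectors(self_set, n_detectors, length, threshold, max_tries=100_000):
    """Negative selection: keep random candidates that match no 'self' pattern.

    A candidate 'matches' a pattern when their Hamming distance is below `threshold`,
    mimicking how immune cells that react to self are eliminated during maturation.
    """
    detectors = []
    for _ in range(max_tries):
        if len(detectors) == n_detectors:
            break
        candidate = tuple(random.randint(0, 1) for _ in range(length))
        if all(hamming(candidate, s) >= threshold for s in self_set):
            detectors.append(candidate)
    return detectors

def is_anomalous(sample, detectors, threshold):
    """Flag a sample as non-self if any detector matches it."""
    return any(hamming(sample, d) < threshold for d in detectors)

# Illustrative usage: the 'self' patterns are the one-hot 8-bit strings.
random.seed(0)
self_set = [tuple(int(i == j) for i in range(8)) for j in range(8)]
detectors = generate_detectors(self_set, n_detectors=20, length=8, threshold=3)
print(is_anomalous((1, 1, 1, 1, 0, 1, 1, 0), detectors, threshold=3))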
Vasant G. Honavar is an Indian-American computer scientist and professor whose research spans artificial intelligence, machine learning, big data, data science, causal inference, knowledge representation, bioinformatics, and health informatics.
Eduardo Daniel Sontag is an Argentine-American mathematician and distinguished university professor at Northeastern University who works in the fields of control theory, dynamical systems, systems molecular biology, cancer and immunology, theoretical computer science, neural networks, and computational biology.
Michael George Luby is a mathematician and computer scientist, CEO of BitRipple, senior research scientist at the International Computer Science Institute (ICSI), former VP Technology at Qualcomm, and co-founder and former chief technology officer of Digital Fountain. In coding theory he is known for leading the invention of the Tornado codes and the LT codes. In cryptography he is known for his contributions showing that any one-way function can be used as the basis for private key cryptography, and for his analysis, in collaboration with Charles Rackoff, of the Feistel cipher construction. His distributed algorithm to find a maximal independent set in a computer network has also been influential.
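The maximal-independent-set algorithm mentioned above proceeds in randomized rounds; the sketch below is a sequential Python simulation of that round structure (the original algorithm runs the rounds in parallel across processors, and the example graph and names here are illustrative).

import random

def luby_mis(adj, rng=None):
    """Sequential simulation of Luby's randomized maximal-independent-set algorithm.

    adj maps each vertex to the set of its neighbours. In every round each still-active
    vertex draws a random value; vertices whose value beats those of all active
    neighbours join the independent set, and they and their neighbours drop out.
    """
    rng = rng or random.Random(0)
    active = set(adj)
    mis = set()
    while active:
        r = {v: rng.random() for v in active}
        winners = {v for v in active
                   if all(r[v] < r[u] for u in adj[v] if u in active)}
        mis |= winners
        removed = winners | {u for v in winners for u in adj[v]}
        active -= removed
    return mis

# Illustrative usage on a 5-cycle: the result is independent and maximal.
cycle = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
print(luby_mis(cycle))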
Martin Vetterli was president of École polytechnique fédérale de Lausanne (EPFL) in Switzerland, succeeding Patrick Aebischer. He is a professor of engineering and was formerly the president of the National Research Council of the Swiss National Science Foundation.
Ali Naci Akansu is a Turkish-American professor of electrical & computer engineering and scientist in applied mathematics.
Robert Jackson Marks II is an American electrical engineer, computer scientist and Distinguished Professor at Baylor University. His contributions include the Zhao–Atlas–Marks (ZAM) time-frequency distribution in the field of signal processing, the Cheung–Marks theorem in Shannon sampling theory and the Papoulis–Marks–Cheung (PMC) approach in multidimensional sampling. He was instrumental in defining the field of computational intelligence and co-edited the first book using computational intelligence in the title. A Christian and an old earth creationist, he is a subject of the 2008 pro-intelligent design motion picture, Expelled: No Intelligence Allowed.
The activation function of a node in an artificial neural network is a function that calculates the output of the node based on its individual inputs and their weights. Nontrivial problems can be solved using only a few nodes if the activation function is nonlinear. Modern activation functions include the GELU, a smooth version of the ReLU that was used in the 2018 BERT model; the logistic (sigmoid) function, used in the 2012 speech recognition model developed by Hinton et al.; and the ReLU, used in the 2012 AlexNet computer vision model and the 2015 ResNet model.
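For concreteness, the following short Python sketch implements the three activation functions named above directly from their standard definitions (it uses the exact GELU, x times the standard normal CDF, rather than the common tanh approximation):

import math

def sigmoid(x):
    """Logistic (sigmoid) activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """Rectified linear unit: passes positive inputs through, zeroes out the rest."""
    return max(0.0, x)

def gelu(x):
    """Gaussian error linear unit, a smooth ReLU variant: x * Phi(x),
    where Phi is the standard normal cumulative distribution function."""
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

# Illustrative comparison of the three nonlinearities at a few points.
for x in (-2.0, 0.0, 2.0):
    print(f"x={x:+.1f}  sigmoid={sigmoid(x):.4f}  relu={relu(x):.4f}  gelu={gelu(x):.4f}")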
Artificial neural networks are combinations of multiple simple mathematical functions that implement more complicated functions from (typically) real-valued vectors to real-valued vectors. The spaces of multivariate functions that can be implemented by a network are determined by the structure of the network, the set of simple functions, and its multiplicative parameters. A great deal of theoretical work has gone into characterizing these function spaces.
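A minimal Python sketch (dimensions and parameter names chosen for illustration) makes this concrete: a single-hidden-layer network maps a real vector to a real number by composing weighted sums with a sigmoid nonlinearity, which is exactly the family of functions addressed by the universal approximation theorem stated earlier; a vector-valued output would simply stack several such units.

import math
import random

def sigmoid(x):
    """Logistic nonlinearity used by each hidden unit."""
    return 1.0 / (1.0 + math.exp(-x))

def one_hidden_layer(x, W, b, alpha, c):
    """Single-hidden-layer network: maps x in R^n to the real number
    sum_i alpha[i] * sigmoid(<W[i], x> + b[i]) + c."""
    hidden = [sigmoid(sum(w_ij * x_j for w_ij, x_j in zip(w_i, x)) + b_i)
              for w_i, b_i in zip(W, b)]
    return sum(a_i * h_i for a_i, h_i in zip(alpha, hidden)) + c

# Illustrative parameters: 3 hidden units on 2-dimensional inputs, random weights.
rng = random.Random(0)
n, hidden_units = 2, 3
W = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(hidden_units)]
b = [rng.uniform(-1, 1) for _ in range(hidden_units)]
alpha = [rng.uniform(-1, 1) for _ in range(hidden_units)]
print(one_hidden_layer([0.5, -0.25], W, b, alpha, c=0.0))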
Informatics is the study of computational systems. According to the ACM Europe Council and Informatics Europe, informatics is synonymous with computer science and computing as a profession, in which the central notion is transformation of information. In some cases, the term "informatics" may also be used with different meanings, e.g. in the context of social computing, or in context of library science.
Anil Kumar Jain is an Indian-American computer scientist and University Distinguished Professor in the Department of Computer Science & Engineering at Michigan State University, known for his contributions in the fields of pattern recognition, computer vision and biometric recognition. He is among the most highly cited researchers in computer science and has received various high honors and recognitions from institutions such as ACM, IEEE, AAAS, IAPR, SPIE, the U.S. National Academy of Engineering, the Indian National Academy of Engineering and the Chinese Academy of Sciences.
This glossary of artificial intelligence is a list of definitions of terms and concepts relevant to the study of artificial intelligence, its sub-disciplines, and related fields. Related glossaries include Glossary of computer science, Glossary of robotics, and Glossary of machine vision.