Bart Andrew Kosko
Born: February 7, 1960, Kansas City, Kansas
Occupation: Writer and professor of electrical engineering
Notable works: Fuzzy Thinking, Nanotime, Noise
Bart Andrew Kosko (born February 7, 1960) is a writer and professor of electrical engineering and law at the University of Southern California (USC). He is a researcher and popularizer of fuzzy logic, neural networks, and noise, and the author of several trade books and textbooks on these and related subjects of machine intelligence. He was awarded the 2022 Donald O. Hebb Award for neural learning by the International Neural Network Society. [1] [2]
Kosko holds bachelor's degrees in philosophy and in economics from USC (1982), a master's degree in applied mathematics from UC San Diego (1983), a PhD in electrical engineering from UC Irvine (1987) under Allen Stubberud, [3] and a J.D. from Concord Law School. He is an attorney licensed in California and federal court, and worked part-time as a law clerk for the Los Angeles District Attorney's Office.
Kosko is a political and religious skeptic. He is a contributing editor of the libertarian periodical Liberty , where he has published essays on "Palestinian vouchers". [4]
Kosko's most popular book to date is the international best-seller Fuzzy Thinking, about man and machines thinking in shades of gray, and his most recent book is Noise. He has also published short fiction and the cyber-thriller novel Nanotime, about a possible World War III that takes place in two days of the year 2030. The novel's title coins the term "nanotime" to describe the time speed-up that occurs when fast computer chips, rather than slow brains, house minds.
Kosko has a minimalist prose style, not even using commas in his book Noise. [5]
Kosko's technical contributions have been in three main areas: fuzzy logic, neural networks, and noise.
In fuzzy logic, he introduced fuzzy cognitive maps, [6] [7] fuzzy subsethood, [8] additive fuzzy systems, [9] fuzzy approximation theorems, [10] optimal fuzzy rules, [11] fuzzy associative memories, various neural-based adaptive fuzzy systems, [9] ratio measures of fuzziness, [8] the shape of fuzzy sets, [12] the conditional variance of fuzzy systems, [13] and the geometric view of (finite) fuzzy sets as points in hypercubes and its relationship to the ongoing debate of fuzziness versus probability.
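One of these ideas can be made concrete in a few lines. The sketch below implements a min-based subsethood ratio, the degree to which one fuzzy set (a point in the unit hypercube) is contained in another; the particular sets and the exact normalization are illustrative assumptions, not a transcription of Kosko's papers.

```python
def subsethood(a, b):
    """Degree to which fuzzy set a is contained in fuzzy set b:
    |a AND b| / |a|, with AND as pointwise min and cardinality as the sum."""
    inter = sum(min(x, y) for x, y in zip(a, b))
    card = sum(a)
    return inter / card if card else 1.0

# two fuzzy sets over the same three-element universe (invented values)
A = [0.2, 0.8, 0.5]
B = [0.6, 0.9, 0.4]
s_ab = subsethood(A, B)  # partial containment, a value in [0, 1]
```

Because containment comes in degrees, subsethood(A, B) and subsethood(B, A) generally differ, unlike the crisp subset relation.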
In neural networks, Kosko introduced the unsupervised technique of differential Hebbian learning, [14] sometimes called the "differential synapse," and most famously the BAM or bidirectional associative memory [15] family of feedback neural architectures, with corresponding global stability theorems. [14]
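The differential-Hebbian idea correlates changes in signals rather than signal levels. A minimal one-synapse sketch, assuming discrete-time differences and an invented learning rate:

```python
def differential_hebbian(x, y, w=0.0, eta=0.1):
    """Update one synaptic weight from the *changes* in the pre- and
    post-synaptic signals: dw ~ eta * dx * dy at each time step."""
    for t in range(1, len(x)):
        dx = x[t] - x[t - 1]
        dy = y[t] - y[t - 1]
        w += eta * dx * dy   # correlate signal velocities, not levels
    return w

rising_together = differential_hebbian([0, 1, 2], [0, 1, 2])   # positive
moving_apart = differential_hebbian([0, 1, 2], [2, 1, 0])      # negative
```

A classical Hebbian rule would strengthen the synapse whenever both signals are large; the differential form responds only when both signals are changing in the same direction.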
In noise, Kosko introduced the concept of adaptive stochastic resonance, [16] using neural-like learning algorithms to find the optimal level of noise to add to many nonlinear systems to improve their performance. He proved many versions of the so-called "forbidden interval theorem," which guarantees that noise will benefit a system if the average level of noise does not fall in an interval of values. [17] He also showed that noise can speed up the convergence of Markov chains to equilibrium. [18]
Fuzzy logic is a form of many-valued logic in which the truth value of variables may be any real number between 0 and 1. It is employed to handle the concept of partial truth, where the truth value may range between completely true and completely false. By contrast, in Boolean logic, the truth values of variables may only be the integer values 0 or 1.
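The standard Zadeh operators make this concrete: conjunction as minimum, disjunction as maximum, negation as complement. A small sketch with invented membership degrees:

```python
def f_and(a, b): return min(a, b)   # fuzzy AND (Zadeh min)
def f_or(a, b):  return max(a, b)   # fuzzy OR (Zadeh max)
def f_not(a):    return 1.0 - a     # fuzzy NOT (complement)

warm, humid = 0.7, 0.4              # degrees of truth, not probabilities
muggy = f_and(warm, humid)          # partially true: 0.4
# unlike Boolean logic, "warm AND NOT warm" need not be 0:
borderline = f_and(warm, f_not(warm))
```

The nonzero value of `borderline` is the hallmark of fuzziness: a borderline-warm day is somewhat warm and somewhat not-warm at the same time.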
In computational science, particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. It solves a problem by having a population of candidate solutions, here dubbed particles, and moving these particles around in the search-space according to simple mathematical formulae over the particle's position and velocity. Each particle's movement is influenced by its local best known position, but is also guided toward the best known positions in the search-space, which are updated as better positions are found by other particles. This is expected to move the swarm toward the best solutions.
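The update loop can be sketched in a few lines. The inertia and attraction coefficients below are common textbook defaults, not prescribed values, and the sphere function is an invented test objective:

```python
import random

def pso(f, dim=2, n=20, iters=100, w=0.5, c1=1.5, c2=1.5, seed=0):
    """Minimize f by moving n particles under inertia (w), a pull toward
    each particle's personal best (c1), and a pull toward the swarm's
    global best (c2)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:          # update personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:         # and possibly the global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda p: sum(x * x for x in p)    # minimum 0 at the origin
best, best_val = pso(sphere)
```

On this smooth unimodal objective the swarm collapses quickly onto the origin; harder multimodal objectives are where the pbest/gbest balance matters.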
Stochastic resonance (SR) is a phenomenon in which a signal that is normally too weak to be detected by a sensor can be boosted by adding white noise, which contains a wide spectrum of frequencies, to the signal. The frequencies in the white noise that correspond to the original signal's frequencies will resonate with each other, amplifying the original signal while not amplifying the rest of the white noise, thereby increasing the signal-to-noise ratio and making the original signal more prominent. Further, the added white noise can be strong enough for the combined signal to be detectable by the sensor, which can then filter out the noise to effectively detect the original, previously undetectable signal.
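A minimal sketch of the threshold-sensor version of this effect, with invented signal amplitude, threshold, and noise level: a subthreshold sine wave is invisible to a hard threshold until moderate Gaussian noise is added, after which the detector's firing pattern tracks the signal.

```python
import math, random

def detect(signal, noise_std, threshold=0.5, seed=1):
    """Hard-threshold sensor: fire (1.0) when signal plus noise crosses it."""
    rng = random.Random(seed)
    return [1.0 if s + rng.gauss(0, noise_std) > threshold else 0.0
            for s in signal]

def correlation(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb) if va and vb else 0.0

# amplitude 0.4 never reaches the 0.5 threshold on its own
signal = [0.4 * math.sin(2 * math.pi * t / 50) for t in range(1000)]
quiet = detect(signal, 0.0)   # no noise: the sensor stays silent
noisy = detect(signal, 0.3)   # moderate noise: firing tracks the signal
```

Too little noise leaves the sensor silent and too much noise swamps the signal; the benefit appears only at intermediate noise levels, which is what the adaptive-SR learning algorithms search for.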
The expression computational intelligence (CI) usually refers to the ability of a computer to learn a specific task from data or experimental observation. Even though it is commonly considered a synonym of soft computing, there is still no commonly accepted definition of computational intelligence.
A recurrent neural network (RNN) is one of the two broad types of artificial neural network, characterized by the direction of the flow of information between its layers. In contrast to the other type, the uni-directional feedforward neural network, it is a bi-directional artificial neural network, meaning that it allows the output from some nodes to affect subsequent input to the same nodes. The ability to use internal state (memory) to process arbitrary sequences of inputs makes RNNs applicable to tasks such as unsegmented, connected handwriting recognition or speech recognition. The term "recurrent neural network" refers to the class of networks with an infinite impulse response, whereas "convolutional neural network" refers to the class with a finite impulse response. Both classes of networks exhibit temporal dynamic behavior. A finite impulse recurrent network is a directed acyclic graph that can be unrolled and replaced with a strictly feedforward neural network, while an infinite impulse recurrent network is a directed cyclic graph that cannot be unrolled.
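The role of internal state can be shown with a single scalar recurrent unit; the weights here are invented for illustration. The same inputs presented in a different order leave the unit in a different final state, which is exactly the order-sensitive memory a feedforward map of individual inputs lacks.

```python
import math

def rnn_step(x, h, w_x=0.5, w_h=0.9):
    """One step of a scalar recurrent unit: the new hidden state depends
    on the current input *and* the previous hidden state."""
    return math.tanh(w_x * x + w_h * h)

def run(inputs, h=0.0):
    """Feed a sequence through the unit, recording the hidden state."""
    states = []
    for x in inputs:
        h = rnn_step(x, h)
        states.append(h)
    return states

# identical multisets of inputs, different order, different final state
a = run([1.0, 0.0, 0.0])
b = run([0.0, 0.0, 1.0])
```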
In the field of artificial intelligence, the designation neuro-fuzzy refers to combinations of artificial neural networks and fuzzy logic.
In computer science and operations research, a memetic algorithm (MA) is an extension of the traditional genetic algorithm (GA) or, more generally, of an evolutionary algorithm (EA). It may provide a sufficiently good solution to an optimization problem. It uses a suitable heuristic or local search technique to improve the quality of solutions generated by the EA and to reduce the likelihood of premature convergence.
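A minimal sketch of the idea, assuming a one-max bitstring fitness and a greedy bit-flip refinement as the local search; all parameters are illustrative.

```python
import random

def one_max(bits):                      # fitness: number of 1 bits
    return sum(bits)

def local_search(bits):
    """Greedy bit-flip improvement: the 'memetic' refinement step."""
    for i in range(len(bits)):
        flipped = bits[:]
        flipped[i] ^= 1
        if one_max(flipped) > one_max(bits):
            bits = flipped
    return bits

def memetic(n_bits=20, pop_size=10, gens=15, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=one_max, reverse=True)
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]          # one-point crossover
            if rng.random() < 0.2:               # occasional mutation
                child[rng.randrange(n_bits)] ^= 1
            children.append(local_search(child)) # refine each child
        pop = parents + children
    return max(pop, key=one_max)

best = memetic()
```

On one-max the local search alone solves the problem, which makes the example trivial but shows the division of labor: the EA explores, the local search exploits.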
Dr. Lawrence Jerome Fogel was a pioneer in evolutionary computation and human factors analysis. He is known as the inventor of active noise cancellation and the father of evolutionary programming. His scientific career spanned nearly six decades and included electrical engineering, aerospace engineering, communication theory, human factors research, information processing, cybernetics, biotechnology, artificial intelligence, and computer science.
Computational neurogenetic modeling (CNGM) is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biology, as well as engineering.
A fuzzy cognitive map (FCM) is a cognitive map within which the relations between the elements of a "mental landscape" can be used to compute the "strength of impact" of these elements. Fuzzy cognitive maps were introduced by Bart Kosko. Robert Axelrod had earlier introduced cognitive maps as a formal way of representing social scientific knowledge and modeling decision making in social and political systems; Kosko then brought in the fuzzy computation.
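A minimal FCM inference sketch, with an invented three-concept map and logistic squashing; the concepts, weights, and squashing choice are assumptions for illustration. Each step pushes every concept's activation toward a squashed, weighted sum of the others'.

```python
import math

def step(state, weights):
    """One FCM inference step: concept j's new activation is the logistic
    squashing of the weighted sum of incoming activations."""
    n = len(state)
    return [1.0 / (1.0 + math.exp(-sum(weights[i][j] * state[i]
                                       for i in range(n))))
            for j in range(n)]

# invented causal map: rainfall -> flooding -> crop damage
W = [[0.0, 0.8, 0.0],
     [0.0, 0.0, 0.9],
     [0.0, 0.0, 0.0]]
state = [1.0, 0.0, 0.0]   # clamp-free start: only rainfall is active
for _ in range(10):       # iterate until the map settles
    state = step(state, W)
```

Iterating the map is the "computation": the activations settle into a fixed point (or a limit cycle for other weight matrices), and that attractor is read off as the map's inference.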
Bidirectional associative memory (BAM) is a type of recurrent neural network. BAM was introduced by Bart Kosko in 1988. There are two types of associative memory, auto-associative and hetero-associative. BAM is hetero-associative, meaning given a pattern it can return another pattern which is potentially of a different size. It is similar to the Hopfield network in that they are both forms of associative memory. However, Hopfield nets return patterns of the same size.
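A minimal sketch of Hebbian outer-product encoding and bipolar recall, using invented pattern pairs; real BAM implementations iterate the forward and backward passes until the pattern pair stabilizes.

```python
def sign(v):
    """Bipolar threshold; ties resolve to +1 in this sketch."""
    return [1 if s >= 0 else -1 for s in v]

def matvec(x, W):
    return [sum(x[i] * W[i][j] for i in range(len(x)))
            for j in range(len(W[0]))]

def train(pairs, n, m):
    """Hebbian outer-product encoding: W = sum of x y^T over all pairs."""
    W = [[0] * m for _ in range(n)]
    for x, y in pairs:
        for i in range(n):
            for j in range(m):
                W[i][j] += x[i] * y[j]
    return W

# hetero-associative: 4-element patterns map to 3-element ones
pairs = [([1, -1, 1, -1], [1, 1, -1]),
         ([-1, 1, -1, 1], [-1, 1, 1])]
W = train(pairs, 4, 3)
forward = sign(matvec([1, -1, 1, -1], W))            # recall y from x
backward = sign(matvec([1, 1, -1], list(zip(*W))))   # recall x from y
```

The same weight matrix serves both directions: the forward pass uses W and the backward pass uses its transpose, which is what makes the memory bidirectional.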
The IEEE Systems, Man, and Cybernetics Society is a professional society of the IEEE. It aims "to serve the interests of its members and the community at large by promoting the theory, practice, and interdisciplinary aspects of systems science and engineering, human-machine systems, and cybernetics".
In computer science, an evolving intelligent system is a fuzzy logic system which improves its own performance by evolving rules. The technique is known from machine learning, in which external patterns are learned by an algorithm. Fuzzy-logic-based machine learning works with neuro-fuzzy systems.
An adaptive neuro-fuzzy inference system or adaptive network-based fuzzy inference system (ANFIS) is a kind of artificial neural network that is based on the Takagi–Sugeno fuzzy inference system. The technique was developed in the early 1990s. Since it integrates both neural networks and fuzzy logic principles, it has the potential to capture the benefits of both in a single framework.
An artificial neural network's learning rule or learning process is a method, mathematical logic or algorithm which improves the network's performance and/or training time. Usually, this rule is applied repeatedly over the network. It is done by updating the weights and bias levels of a network when a network is simulated in a specific data environment. A learning rule may accept existing conditions of the network and will compare the expected result and actual result of the network to give new and improved values for weights and bias. Depending on the complexity of actual model being simulated, the learning rule of the network can be as simple as an XOR gate or mean squared error, or as complex as the result of a system of differential equations.
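One of the simplest such rules is the perceptron error-correction update, sketched below on the logical AND function (which, unlike XOR, is linearly separable); the learning rate and epoch count are invented defaults.

```python
def predict(w, b, x):
    """Threshold unit: fire 1 if the weighted sum plus bias is positive."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(data, eta=0.1, epochs=20):
    """Perceptron learning rule: compare expected and actual output and
    nudge weights and bias by the signed error."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            err = target - predict(w, b, x)   # -1, 0, or +1
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
            b += eta * err
    return w, b

# learn logical AND from its truth table
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
```

Each update "accepts the existing conditions of the network" (the current weights), compares expected and actual output, and returns improved weights, exactly the loop described above.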
Fusion adaptive resonance theory (fusion ART) is a generalization of the self-organizing neural networks known as the original Adaptive Resonance Theory models for learning recognition categories across multiple pattern channels. A separate stream of work on fusion ARTMAP extends fuzzy ARTMAP, which consists of two fuzzy ART modules connected by an inter-ART map field, to an architecture consisting of multiple ART modules.
Javier Andreu-Perez is a British computer scientist and a Senior Lecturer and Chair in Smart Health Technologies at the University of Essex. He is also associate editor-in-chief of Neurocomputing for the area of Deep Learning and Machine Learning. Andreu-Perez's research is mainly focused on Human-Centered Artificial Intelligence (HCAI). He also chairs an interdisciplinary lab in this area, HCAI-Essex.
Fuzzy differential inclusion is the extension of differential inclusion to fuzzy sets introduced by Lotfi A. Zadeh.
Jerry M. Mendel is an engineer, academic, and author. He is professor emeritus of Electrical and Computer Engineering at the University of Southern California.