Adrian David Walker is an American computer scientist, born in London, England.
Walker attended Dartington Hall School, an experimental boarding school in England where attendance at classes was optional. He obtained a bachelor's degree in electrical engineering from the University of Sheffield (where he also chaired the Arts Society and edited a poetry magazine), and a master's degree in systems engineering from the University of Surrey. [1] He then obtained a PhD in computer science from the State University of New York. [2]
He was an assistant professor at Rutgers University in New Jersey, [3] then a Member of Technical Staff at Bell Labs. He moved to the IBM Almaden Research Center in California as a Research Staff Member, [4] and then to the IBM Thomas J. Watson Research Center in Yorktown Heights, NY, as manager of Principles and Applications of Logic Programming. [5] [6] After 17 years at IBM, he formed his own company, where he works on Internet Business Logic, [7] a system for social knowledge acquisition and use in executable English.
Selected work: Walker's early work [8] [9] established a novel correspondence between stable patterns in formalised biological systems and the well-known Chomsky hierarchy of languages (regular, context-free, and context-sensitive). He continued with grammar-based research [10] by showing how Bayes' theorem can be used to fit a stochastic regular grammar to a collection of data, a result that can be used to inductively infer hidden Markov models. Walker next showed [11] [12] [13] that, under certain practically useful assumptions, it is possible to compute the semantics of sets of syllogism-like rules written in open-vocabulary, largely open-syntax English, in such a way as to answer English questions put to databases. This relaxes an onerous assumption made in many computational natural-language understanding systems, namely that the vocabulary must be narrowly restricted to obtain a useful level of understanding. A logical theory of knowledge developed in [14] [15] is applied in a system on the Web [7] that combines three kinds of semantics: (a) data, as in SQL or the Resource Description Framework; (b) inference; and (c) English. The system answers questions over networked databases and explains the results in hypertexted English. The subject knowledge needed to do this (e.g. knowledge about the oil industry, or about energy independence) can be captured in social-network style, by typing executable English into browsers. This contrasts with other social media, such as Twitter and Facebook, in which knowledge written in English is readable but cannot be executed as a computer program.
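The use of Bayes' theorem to choose a stochastic grammar that fits observed data can be illustrated with a minimal sketch. This is not Walker's original construction: the two candidate "grammars" below are hypothetical first-order Markov chains over symbols, and the posterior is computed by direct application of Bayes' theorem.

```python
# Illustrative sketch (assumed example, not Walker's method [10]):
# Bayes' theorem used to compare two candidate stochastic regular
# grammars, each modeled as a Markov chain of symbol transitions.

def likelihood(string, transitions):
    """P(string | grammar): product of the transition probabilities."""
    p = 1.0
    for a, b in zip(string, string[1:]):
        p *= transitions.get((a, b), 0.0)
    return p

def posterior(string, grammars, priors):
    """Bayes: P(g | s) = P(s | g) P(g) / sum over g' of P(s | g') P(g')."""
    joint = {g: likelihood(string, t) * priors[g] for g, t in grammars.items()}
    total = sum(joint.values())
    return {g: p / total for g, p in joint.items()}

# g1 strongly prefers alternation a<->b; g2 is indifferent.
g1 = {("a", "b"): 0.9, ("b", "a"): 0.9, ("a", "a"): 0.1, ("b", "b"): 0.1}
g2 = {("a", "b"): 0.5, ("b", "a"): 0.5, ("a", "a"): 0.5, ("b", "b"): 0.5}

post = posterior("ababab", {"g1": g1, "g2": g2}, {"g1": 0.5, "g2": 0.5})
```

Given the alternating observation "ababab", the posterior mass concentrates on g1; iterating this kind of update over a collection of strings is one route to inferring the transition structure of a hidden Markov model.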
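The idea of executing syllogism-like rules written as English sentences can also be sketched in miniature. The rules, facts, and question below are hypothetical examples, not taken from the Internet Business Logic system [7]: English sentence templates with ?-prefixed variables are matched word by word against facts, and forward chaining derives new English sentences that a question template can then match.

```python
# Toy sketch (assumed example) of syllogism-like rules in English:
# templates with ?Var placeholders are unified against fact sentences,
# and forward chaining derives new sentences until a fixed point.

facts = {
    "Acme supplies oil to Northeast",
    "oil is an energy source",
}

# Each rule: a list of English premise templates implies a conclusion.
rules = [
    (["?C supplies ?P to ?R", "?P is an energy source"],
     "?R gets an energy source from ?C"),
]

def match(template, sentence, binding):
    """Unify a template against a fact, word by word; None on failure."""
    t, s = template.split(), sentence.split()
    if len(t) != len(s):
        return None
    b = dict(binding)
    for tw, sw in zip(t, s):
        if tw.startswith("?"):
            if tw in b and b[tw] != sw:
                return None
            b[tw] = sw
        elif tw != sw:
            return None
    return b

def prove(premises, facts, binding):
    """Yield every variable binding that satisfies all premises."""
    if not premises:
        yield binding
        return
    for f in facts:
        b = match(premises[0], f, binding)
        if b is not None:
            yield from prove(premises[1:], facts, b)

def substitute(template, binding):
    return " ".join(binding.get(w, w) for w in template.split())

def forward_chain(facts, rules):
    """Apply rules until no new English sentences are derived."""
    derived, changed = set(facts), True
    while changed:
        changed = False
        for premises, conclusion in rules:
            for b in list(prove(premises, derived, {})):
                new = substitute(conclusion, b)
                if new not in derived:
                    derived.add(new)
                    changed = True
    return derived

derived = forward_chain(facts, rules)
# The question is itself an English template matched against derived facts.
answers = [b for f in sorted(derived)
           for b in [match("?R gets an energy source from ?C", f, {})] if b]
```

Here the rule fires once, deriving "Northeast gets an energy source from Acme", and the question template recovers the bindings ?R = Northeast and ?C = Acme. The open-vocabulary aspect is visible in that neither the matcher nor the chainer has any built-in English lexicon.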