| Clark Glymour | |
| --- | --- |
| Born | 1942 |
| Academic background | |
| Alma mater | University of New Mexico; Indiana University Bloomington (Ph.D., 1969) |
| Academic work | |
| Institutions | Carnegie Mellon University |
Clark N. Glymour (born 1942) is the Alumni University Professor Emeritus in the Department of Philosophy at Carnegie Mellon University. He is also a senior research scientist at the Florida Institute for Human and Machine Cognition. [1]
Glymour earned undergraduate degrees in chemistry and philosophy at the University of New Mexico. He did graduate work in chemical physics and obtained a Ph.D. in History and Philosophy of Science from Indiana University Bloomington in 1969. [2]
Glymour is the founder of the Philosophy Department at Carnegie Mellon University, a Guggenheim Fellow, a Fellow of the Center for Advanced Study in the Behavioral Sciences, [3] a Phi Beta Kappa lecturer, [4] and a Fellow of the statistics section of the AAAS. [5] Glymour and his collaborators created the causal interpretation of Bayes nets. [6] His areas of interest include epistemology [7] (particularly android epistemology), machine learning, automated reasoning, the psychology of judgment, and mathematical psychology. [8]

One of Glymour's main contributions to the philosophy of science is in the area of Bayesian probability, particularly his analysis of the Bayesian "problem of old evidence". [9] [10]

Glymour, in collaboration with Peter Spirtes and Richard Scheines, also developed an automated causal inference algorithm implemented in software named TETRAD. [11] Taking multivariate statistical data as input, TETRAD rapidly searches among possible causal models and returns the most plausible ones, based on the conditional independence relationships among the variables. The algorithm draws on principles from statistics, graph theory, the philosophy of science, and artificial intelligence. [12] The PC algorithm, used for learning the structure of Bayesian networks, is named after the first names of its inventors, Peter Spirtes and Clark Glymour.
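The PC procedure is simple to state in outline. The sketch below shows only its skeleton phase, which starts from a complete undirected graph and deletes an edge whenever the two endpoints test as conditionally independent given some subset of their current neighbours. The `indep_test` oracle, the variable names, and the data structures are assumptions made for illustration, not TETRAD's actual implementation.

```python
from itertools import combinations

def pc_skeleton(variables, indep_test):
    """Skeleton phase of the PC algorithm.

    Starts from a complete undirected graph over `variables` and removes
    the edge X - Y whenever X and Y test as conditionally independent
    given some subset S of X's current neighbours. `indep_test(x, y, s)`
    is assumed to return True when X is independent of Y given S.
    """
    adj = {v: set(variables) - {v} for v in variables}   # complete graph
    sepset = {}   # separating set recorded for each removed edge
    d = 0         # size of the conditioning sets tried in this pass
    while any(len(adj[x] - {y}) >= d for x in variables for y in adj[x]):
        for x in variables:
            for y in list(adj[x]):
                if y not in adj[x]:          # edge removed earlier this pass
                    continue
                # try every conditioning set of size d drawn from
                # x's neighbours other than y
                for s in combinations(adj[x] - {y}, d):
                    if indep_test(x, y, set(s)):
                        adj[x].discard(y)
                        adj[y].discard(x)
                        sepset[frozenset((x, y))] = set(s)
                        break
        d += 1
    return adj, sepset
```

A full implementation follows the skeleton with an orientation phase (collider detection plus propagation rules) to recover the Markov equivalence class of DAGs; production systems such as TETRAD also contend with sample noise and latent variables, which this sketch ignores.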
Bayesian probability is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief.
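To make the "degree of belief updated by evidence" reading concrete, here is a minimal worked example of Bayes' theorem; the numbers are invented for illustration.

```python
def bayes_update(prior, likelihood, likelihood_given_not):
    """Posterior P(H | E) from prior P(H) via Bayes' theorem."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# belief in a hypothesis H before and after seeing evidence E:
posterior = bayes_update(prior=0.01, likelihood=0.95, likelihood_given_not=0.05)
print(round(posterior, 3))  # ~0.161: strong evidence, but the low prior dominates
```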
Epistemology is the branch of philosophy concerned with knowledge. It studies the nature, origin, and scope of knowledge, epistemic justification, the rationality of belief, and various related issues. Debates in contemporary epistemology are generally clustered around four core areas: the analysis of the nature of knowledge, the potential sources of knowledge and justified belief, the structure of knowledge and justification, and philosophical skepticism.
Causality is an influence by which one event, process, state, or object (a cause) contributes to the production of another event, process, state, or object (an effect) where the cause is partly responsible for the effect, and the effect is partly dependent on the cause. In general, a process has many causes, which are also said to be causal factors for it, and all lie in its past. An effect can in turn be a cause of, or causal factor for, many other effects, which all lie in its future. Some writers have held that causality is metaphysically prior to notions of time and space.
Scientific evidence is evidence that serves to either support or counter a scientific theory or hypothesis, although scientists also use evidence in other ways, such as when applying theories to practical problems. Such evidence is expected to be empirical evidence and interpretable in accordance with the scientific method. Standards for scientific evidence vary according to the field of inquiry, but the strength of scientific evidence is generally based on the results of statistical analysis and the strength of scientific controls.
A Bayesian network is a probabilistic graphical model that represents a set of variables and their conditional dependencies via a directed acyclic graph (DAG). Bayesian networks are one of several forms of causal notation; causal networks are special cases of Bayesian networks in which the edges carry a causal interpretation. Bayesian networks are well suited to taking an event that occurred and computing the likelihood that any one of several possible known causes was the contributing factor. For example, a Bayesian network could represent the probabilistic relationships between diseases and symptoms. Given symptoms, the network can be used to compute the probabilities of the presence of various diseases.
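The disease–symptom example can be made concrete with a three-node network. The sketch below uses invented probabilities and does exact inference by brute-force enumeration of the joint distribution, which the DAG factorises into local conditional tables.

```python
# Toy network: Disease -> Fever, Disease -> Cough (all numbers invented).
P_D = {True: 0.01, False: 0.99}   # P(Disease)
P_F = {True: 0.90, False: 0.05}   # P(Fever = yes | Disease)
P_C = {True: 0.70, False: 0.10}   # P(Cough = yes | Disease)

def joint(d, f, c):
    """P(D=d, F=f, C=c), factorised along the DAG as P(D) * P(F|D) * P(C|D)."""
    pf = P_F[d] if f else 1 - P_F[d]
    pc = P_C[d] if c else 1 - P_C[d]
    return P_D[d] * pf * pc

# P(Disease | Fever = yes, Cough = yes) by enumeration:
num = joint(True, True, True)
posterior = num / (num + joint(False, True, True))
print(round(posterior, 3))   # 0.56 with these numbers
```

Real networks replace the brute-force sum with algorithms such as variable elimination, since enumeration is exponential in the number of variables.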
Understanding is a cognitive process related to an abstract or physical object, such as a person, situation, or message, whereby one is able to use concepts to model that object. Understanding is a relation between the knower and an object of understanding. Understanding implies abilities and dispositions with respect to an object of knowledge that are sufficient to support intelligent behavior.
Minimum Description Length (MDL) is a model selection principle where the shortest description of the data is the best model. MDL methods learn through a data compression perspective and are sometimes described as mathematical applications of Occam's razor. The MDL principle can be extended to other forms of inductive inference and learning, for example to estimation and sequential prediction, without explicitly identifying a single model of the data.
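As a crude illustration of the two-part form of the principle, the total code length L(model) + L(data | model) can be used to compare fits of different complexity. All constants below are illustrative assumptions, not a standard coding scheme.

```python
import math

def two_part_code_length(residuals, n_params, bits_per_param=32):
    """Crude two-part MDL score: L(model) + L(data | model).

    L(model) charges a fixed number of bits per parameter; L(data | model)
    approximates the Gaussian negative log-likelihood of the residuals
    in bits. Both choices are illustrative assumptions.
    """
    n = len(residuals)
    rss = sum(r * r for r in residuals) or 1e-12   # guard against log(0)
    data_bits = 0.5 * n * math.log2(rss / n)
    model_bits = n_params * bits_per_param
    return model_bits + data_bits

# the model giving the shorter total description of the data wins
line_fit_residuals  = [0.4, -0.3, 0.5, -0.6, 0.2]   # 2-parameter model
cubic_fit_residuals = [0.1, -0.1, 0.1, -0.1, 0.0]   # 4-parameter model
print(two_part_code_length(line_fit_residuals, 2))   # ~57.8 bits
print(two_part_code_length(cubic_fit_residuals, 4))  # ~110.6 bits
```

With these numbers the two-parameter line wins despite its larger residuals, which is the Occam's-razor behaviour the principle is meant to capture.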
A graphical model or probabilistic graphical model (PGM) or structured probabilistic model is a probabilistic model for which a graph expresses the conditional dependence structure between random variables. They are commonly used in probability theory, statistics—particularly Bayesian statistics—and machine learning.
Trygve Magnus Haavelmo, born in Skedsmo, Norway, was an economist whose research interests centered on econometrics. He received the Nobel Memorial Prize in Economic Sciences in 1989.
Computational epistemology is a subdiscipline of formal epistemology that studies the intrinsic complexity of inductive problems for ideal and computationally bounded agents. In short, computational epistemology is to induction what recursion theory is to deduction. It has been applied to problems in philosophy of science.
Formal epistemology uses formal methods from decision theory, logic, probability theory and computability theory to model and reason about issues of epistemological interest. Work in this area spans several academic fields, including philosophy, computer science, economics, and statistics. The focus of formal epistemology has tended to differ somewhat from that of traditional epistemology, with topics like uncertainty, induction, and belief revision garnering more attention than the analysis of knowledge, skepticism, and issues with justification.
Android epistemology is an approach to epistemology considering the space of possible machines and their capacities for knowledge, beliefs, attitudes, desires and for action in accord with their mental states. Thus, android epistemology incorporates artificial intelligence, computational cognitive psychology, computability theory and other related disciplines.
Sandra D. Mitchell is an American philosopher of science and historian of ideas. She is a distinguished professor in the Department of History and Philosophy of Science at the University of Pittsburgh, the top-rated school in the world for the subject according to the 2011 Philosophical Gourmet Report. Her research focuses on the philosophy of biology and the philosophy of social science, and the connections between the two.
Alison Gopnik is an American professor of psychology and affiliate professor of philosophy at the University of California, Berkeley. She is known for her work in the areas of cognitive and language development, specializing in the effect of language on thought, the development of a theory of mind, and causal learning. Her writing on psychology and cognitive science has appeared in Science, Scientific American, The Times Literary Supplement, The New York Review of Books, The New York Times, New Scientist, Slate and others. Her body of work also includes four books and over 100 journal articles.
Causal analysis is the field of experimental design and statistics pertaining to establishing cause and effect. Typically it involves establishing four elements: correlation, sequence in time, a plausible physical or information-theoretical mechanism for an observed effect to follow from a possible cause, and eliminating the possibility of common and alternative ("special") causes. Such analysis usually involves one or more artificial or natural experiments.
Wolfgang Konrad Spohn is a German philosopher. He is professor of philosophy and philosophy of science at the University of Konstanz.
Richard Eugene Neapolitan was an American scientist, best known for his role in establishing the use of probability theory in artificial intelligence and in the development of the field of Bayesian networks.
Exploratory causal analysis (ECA), also known as data causality or causal discovery, is the use of statistical algorithms to infer associations in observed data sets that are potentially causal under strict assumptions. ECA is a type of causal inference distinct from causal modeling and treatment effects in randomized controlled trials. It is exploratory research, usually preceding more formal causal research in the same way that exploratory data analysis often precedes statistical hypothesis testing in data analysis.
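Under the strong assumption of Gaussian data, a common independence test behind causal-discovery algorithms is a Fisher z test on (partial) correlations. The sketch below covers only the unconditional case; a function like this could serve as the `indep_test` oracle in the PC skeleton shown earlier (with the empty conditioning set), and extends to partial correlations for nonempty conditioning sets.

```python
import math

def pearson(a, b):
    """Sample Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def looks_independent(a, b):
    """Fisher z test of H0: zero correlation, at the 0.05 level.

    Under Gaussian assumptions, zero correlation coincides with
    independence, which is what makes the test usable as a CI oracle.
    """
    n = len(a)
    r = max(min(pearson(a, b), 0.9999), -0.9999)   # keep atanh finite
    z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n - 3)
    return abs(z) < 1.96   # two-sided critical value for alpha = 0.05
```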
Frederick Eberhardt is an American philosopher and professor of philosophy at the California Institute of Technology. Previously he was a faculty member in the Philosophy-Neuroscience-Psychology program at Washington University in St. Louis. Eberhardt is known for his work in the philosophy of science.