Gregory Wheeler

Born: 1968, Virginia, United States
Nationality: American
Alma mater: University of Rochester
Known for: Works on imprecise probabilities

Scientific career
Fields: Philosophy, Computer Science
Institutions: Frankfurt School of Finance & Management; Ludwig Maximilian University; Carnegie Mellon University; Max Planck Institute for Human Development; New University of Lisbon
Doctoral advisor: Henry E. Kyburg, Jr.

Gregory Wheeler (born 1968) is an American logician, philosopher, and computer scientist who specializes in formal epistemology.[1] Much of his work has focused on imprecise probability. He is currently Professor of Philosophy and Computer Science at the Frankfurt School of Finance and Management,[2] and has held positions at LMU Munich,[3] Carnegie Mellon University,[4] the Max Planck Institute for Human Development in Berlin,[5] and the New University of Lisbon.[6] He is a member of the PROGIC steering committee[7] and of the editorial boards of Synthese[8] and Minds and Machines, and was editor-in-chief of Minds and Machines from 2011 to 2016. In 2019 he co-founded Exaloan AG, a financial technology company based in Frankfurt.[9] He obtained a Ph.D. in philosophy and computer science from the University of Rochester under Henry Kyburg.[10]

Related Research Articles

Bayesian probability is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief.

The word probability has been used in a variety of ways since it was first applied to the mathematical study of games of chance. Does probability measure the real, physical tendency of something to occur, or is it a measure of how strongly one believes it will occur, or does it draw on both these elements? In answering such questions, mathematicians interpret the probability values of probability theory.

Scientific evidence is evidence that serves to either support or counter a scientific theory or hypothesis, although scientists also use evidence in other ways, such as when applying theories to practical problems. Such evidence is expected to be empirical evidence and interpretable in accordance with the scientific method. Standards for scientific evidence vary according to the field of inquiry, but the strength of scientific evidence is generally based on the results of statistical analysis and the strength of scientific controls.

A Bayesian network is a probabilistic graphical model that represents a set of variables and their conditional dependencies via a directed acyclic graph (DAG). While it is one of several forms of causal notation, causal networks are special cases of Bayesian networks. Bayesian networks are ideal for taking an event that occurred and predicting the likelihood that any one of several possible known causes was the contributing factor. For example, a Bayesian network could represent the probabilistic relationships between diseases and symptoms. Given symptoms, the network can be used to compute the probabilities of the presence of various diseases.
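
As a minimal illustration of the disease–symptom example, the following sketch computes the posterior probability of a single disease given an observed symptom via Bayes' theorem. All numbers are assumed for illustration; a realistic Bayesian network would have many nodes and typically use a dedicated inference library.

    # Two-node Bayesian network sketch: disease D -> symptom S.
    # The probabilities below are illustrative assumptions, not from any source.
    p_d = 0.10               # prior probability of the disease
    p_s_given_d = 0.90       # probability of the symptom given the disease
    p_s_given_not_d = 0.20   # probability of the symptom without the disease

    # Bayes' theorem: P(D | S) = P(S | D) * P(D) / P(S)
    p_s = p_s_given_d * p_d + p_s_given_not_d * (1 - p_d)
    p_d_given_s = p_s_given_d * p_d / p_s
    print(round(p_d_given_s, 3))   # 0.333

Observing the symptom raises the disease's probability from 0.10 to about 0.33, which is the kind of query such networks are used to answer.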

Computational epistemology is a subdiscipline of formal epistemology that studies the intrinsic complexity of inductive problems for ideal and computationally bounded agents. In short, computational epistemology is to induction what recursion theory is to deduction. It has been applied to problems in philosophy of science.

Imprecise probability generalizes probability theory to allow for partial probability specifications, and is applicable when information is scarce, vague, or conflicting, in which case a unique probability distribution may be hard to identify. The theory thereby aims to represent the available knowledge more accurately. Imprecision is also useful in expert elicitation, since experts may be unable or unwilling to commit to a single precise probability.
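
A minimal sketch of the idea, with assumed numbers: instead of a single probability for an event, an agent keeps a set of candidate probabilities and works with the resulting lower and upper values.

    # Imprecise probability sketch; the candidate values are hypothetical.
    candidate_p_rain = [0.2, 0.3, 0.4]      # conflicting expert assessments

    lower_p = min(candidate_p_rain)         # lower probability of rain: 0.2
    upper_p = max(candidate_p_rain)         # upper probability of rain: 0.4

    # Lower and upper expectation of a gamble paying 10 if it rains, 0 otherwise.
    expectations = [10 * p + 0 * (1 - p) for p in candidate_p_rain]
    print(lower_p, upper_p, min(expectations), max(expectations))  # 0.2 0.4 2.0 4.0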

Formal epistemology uses formal methods from decision theory, logic, probability theory and computability theory to model and reason about issues of epistemological interest. Work in this area spans several academic fields, including philosophy, computer science, economics, and statistics. The focus of formal epistemology has tended to differ somewhat from that of traditional epistemology, with topics like uncertainty, induction, and belief revision garnering more attention than the analysis of knowledge, skepticism, and issues with justification.

Probabilistic logic involves the use of probability and logic to deal with uncertain situations. Probabilistic logic extends traditional logic truth tables with probabilistic expressions. A difficulty of probabilistic logics is their tendency to multiply the computational complexities of their probabilistic and logical components. Other difficulties include the possibility of counter-intuitive results, such as in the case of belief fusion in Dempster–Shafer theory. Source trust and epistemic uncertainty about the probabilities sources provide, as modeled in subjective logic, are additional elements to consider. The need to deal with a broad variety of contexts and issues has led to many different proposals.
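
As a small worked example of attaching probabilities to logical connectives (the input values are assumptions chosen for illustration): when P(A) and P(B) are known but their dependence is not, a conjunction or disjunction is only constrained to an interval given by the Fréchet bounds, rather than assigned a single truth value.

    # Frechet bounds for connectives under unknown dependence; inputs are assumed.
    p_a, p_b = 0.7, 0.8

    and_lower = max(0.0, p_a + p_b - 1.0)   # P(A and B) >= 0.5
    and_upper = min(p_a, p_b)               # P(A and B) <= 0.7
    or_lower = max(p_a, p_b)                # P(A or B)  >= 0.8
    or_upper = min(1.0, p_a + p_b)          # P(A or B)  <= 1.0
    print(and_lower, and_upper, or_lower, or_upper)   # 0.5 0.7 0.8 1.0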

The lottery paradox arises from Henry E. Kyburg Jr.'s consideration of a fair 1,000-ticket lottery with exactly one winning ticket. If that much is known about the execution of the lottery, it is rational to accept that some ticket will win. Yet it also seems rational to accept, of each individual ticket, that it will not win, since its chance of winning is only 1 in 1,000; accepting all 1,000 of these claims together entails that no ticket will win, contradicting the first conclusion.
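
The arithmetic behind the paradox can be written out directly; the acceptance threshold below is a hypothetical choice made for illustration.

    # Lottery paradox arithmetic; the 0.99 acceptance threshold is an assumption.
    tickets = 1000
    threshold = 0.99

    p_ticket_i_loses = 1 - 1 / tickets     # 0.999 for each individual ticket
    print(p_ticket_i_loses > threshold)    # True: each "ticket i will lose" is acceptable

    # Conjoining all 1,000 accepted claims entails that no ticket wins, yet the
    # lottery is known to have exactly one winner, so that conjunction has
    # probability 0 and contradicts the accepted claim that some ticket wins.
    p_no_ticket_wins = 0.0                 # by construction: exactly one ticket wins
    print(p_no_ticket_wins)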

Isaac Levi was an American philosopher who served as the John Dewey Professor of Philosophy at Columbia University. He is noted for his work in epistemology and decision theory.

Colin Howson was a British philosopher. He was Professor of Philosophy at the University of Toronto, where he joined the faculty on 1 July 2008. Previously, he was Professor of Logic at the London School of Economics. He completed a PhD on the philosophy of probability in 1981. In the late 1960s he had been a research assistant of Imre Lakatos at LSE. He died on Sunday 5 January 2020.

Henry E. Kyburg Jr. (1928–2007) was Gideon Burbank Professor of Moral Philosophy and Professor of Computer Science at the University of Rochester, New York, and Pace Eminent Scholar at the Institute for Human and Machine Cognition, Pensacola, Florida. His first faculty posts were at Rockefeller Institute, University of Denver, Wesleyan College, and Wayne State University.

Statistical relational learning (SRL) is a subdiscipline of artificial intelligence and machine learning that is concerned with domain models that exhibit both uncertainty and complex, relational structure. Typically, the knowledge representation formalisms developed in SRL use first-order logic to describe relational properties of a domain in a general manner and draw upon probabilistic graphical models to model the uncertainty; some also build upon the methods of inductive logic programming. Significant contributions to the field have been made since the late 1990s.

Donald Angus Gillies is a British philosopher and historian of science and mathematics. He is an Emeritus Professor in the Department of Science and Technology Studies at University College London.

John L. Pollock (1940–2009) was an American philosopher known for influential work in epistemology, philosophical logic, cognitive science, and artificial intelligence.

Clark N. Glymour is the Alumni University Professor Emeritus in the Department of Philosophy at Carnegie Mellon University. He is also a senior research scientist at the Florida Institute for Human and Machine Cognition.


Stephan Hartmann is a German philosopher and Professor of Philosophy of Science at Ludwig Maximilian University of Munich, known for his contributions to formal epistemology.

Timothy Joel McGrew is a professor of philosophy at Western Michigan University, and the chair of the department of philosophy there. His research interests include epistemology, the history and philosophy of science, and the philosophy of religion. He is a specialist in the philosophical applications of probability theory.

Bayesian epistemology is a formal approach to various topics in epistemology that has its roots in Thomas Bayes' work in the field of probability theory. One advantage of its formal method in contrast to traditional epistemology is that its concepts and theorems can be defined with a high degree of precision. It is based on the idea that beliefs can be interpreted as subjective probabilities. As such, they are subject to the laws of probability theory, which act as the norms of rationality. These norms can be divided into static constraints, governing the rationality of beliefs at any moment, and dynamic constraints, governing how rational agents should change their beliefs upon receiving new evidence. The most characteristic Bayesian expression of these principles is found in the form of Dutch books, which illustrate irrationality in agents through a series of bets that lead to a loss for the agent no matter which of the probabilistic events occurs. Bayesians have applied these fundamental principles to various epistemological topics but Bayesianism does not cover all topics of traditional epistemology. The problem of confirmation in the philosophy of science, for example, can be approached through the Bayesian principle of conditionalization by holding that a piece of evidence confirms a theory if it raises the likelihood that this theory is true. Various proposals have been made to define the concept of coherence in terms of probability, usually in the sense that two propositions cohere if the probability of their conjunction is higher than if they were neutrally related to each other. The Bayesian approach has also been fruitful in the field of social epistemology, for example, concerning the problem of testimony or the problem of group belief. Bayesianism still faces various theoretical objections that have not been fully solved.
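
As a toy illustration of the Dutch book idea (all credence values and stakes below are assumptions, not drawn from the article): an agent whose credences in a proposition and its negation sum to more than 1 can be sold a pair of bets, each priced at the corresponding credence, that guarantee a net loss however the proposition turns out.

    # Dutch book sketch against incoherent credences; numbers are illustrative.
    credence_a = 0.6
    credence_not_a = 0.6     # incoherent: credences in A and not-A sum to 1.2

    stake = 1.0              # each bet pays 1.0 if its proposition is true
    total_cost = (credence_a + credence_not_a) * stake   # agent pays 1.2 up front

    for a_is_true in (True, False):
        payout = stake if a_is_true else 0.0      # the bet on A
        payout += 0.0 if a_is_true else stake     # the bet on not-A
        print(a_is_true, round(payout - total_cost, 2))   # -0.2 either way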


Richard Eugene Neapolitan was an American scientist, best known for his role in establishing the use of probability theory in artificial intelligence and in the development of the field of Bayesian networks.

References

  1. People with online papers in philosophy
  2. "Prof. Dr. Gregory Wheeler | Frankfurt School".
  3. "People - Munich Center for Mathematical Philosophy (MCMP) - LMU Munich".
  4. "Carnegie Mellon Department Of Philosophy: Faculty". www.hss.cmu.edu. Archived from the original on 2005-10-13.
  5. "Research".
  6. "Centre for Artificial Intelligence Research (CENTRIA)".
  7. "The Progic Conference Series | Jon Williamson".
  8. "Synthese".
  9. "Das Fintech, das direkt neben dem Moseleck sitzt: Exaloan". 28 June 2019.
  10. AI, NLU, and KR at the University of Rochester