Verisimilitude

In philosophy, verisimilitude (or truthlikeness) is the notion that some propositions are closer to being true than other propositions. The problem of verisimilitude is the problem of articulating what it takes for one false theory to be closer to the truth than another false theory. [1]

This problem was central to the philosophy of Karl Popper, largely because Popper was among the first to affirm that truth is the aim of scientific inquiry while acknowledging that most of the greatest scientific theories in the history of science are, strictly speaking, false. If this long string of purportedly false theories is to constitute progress with respect to the goal of truth, then it must be at least possible for one false theory to be closer to the truth than others.

Karl Popper

Beginning in 1974, Popper's formal definition of verisimilitude was challenged by Pavel Tichý, [2] [3] John Henry Harris, [4] and David Miller, [5] who argued that the definition has an unintended consequence: no false theory can be closer to the truth than any other. Popper himself stated: "I accepted the criticism of my definition within minutes of its presentation, wondering why I had not seen the mistake before." [6] This result prompted a search for an account of verisimilitude that did not deem progress towards the truth an impossibility.
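Popper's qualitative definition, and the objection that undid it, can be stated compactly. Writing $A_T$ for the set of true consequences and $A_F$ for the set of false consequences of a theory $A$ (standard notation in the later literature, not Popper's own):

```latex
% Popper's qualitative definition of "A is closer to the truth than B":
A \succ B \iff B_T \subseteq A_T \ \text{and}\ A_F \subseteq B_F,
\ \text{with at least one inclusion proper.}

% Tichý–Miller result: if A is false, then A \succ B holds for no theory B.
% Sketch of one half: suppose A is false and exceeds B in truth content,
% i.e. there is some true t \in A_T \setminus B_T. Pick any false f \in A_F
% (one exists, since A is false). Then t \wedge f is a false consequence
% of A; were it a consequence of B, t would be too, contradicting
% t \notin B_T. Hence A_F \not\subseteq B_F, and the definition's second
% clause fails. A symmetric argument blocks the falsity-content clause.
```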

Post-Popperian theories

Some of the newer theories (e.g. those proposed by David Miller and by Theo Kuipers) build on Popper's approach, guided by the notion that truthlikeness is a function of a truth factor and a content factor. Others (e.g. those advanced by Gerhard Schurz in collaboration with Paul Weingartner, by Mortensen, and by Ken Gemes) are also inspired by Popper's approach but locate what they believe to be the error of Popper's proposal in his overly generous notion of content, or consequence, proposing instead that the consequences that contribute to closeness to the truth must be, in a technical sense, "relevant". A different approach (already proposed by Tichý and Risto Hilpinen and developed especially by Ilkka Niiniluoto and Graham Oddie) takes the "likeness" in truthlikeness literally, holding that a proposition's likeness to the truth is a function of the overall likeness to the actual world of the possible worlds in which the proposition would be true. Giangiacomo Gerla has proposed an approach that uses the notion of point-free metric space. [7] There is an ongoing debate about whether, or to what extent, these different approaches to the concept are compatible. [8] [9] [10]
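The likeness approach can be made concrete with Tichý's well-known weather example (three atoms: hot, rainy, windy). The sketch below measures a proposition's truthlikeness as the average closeness of the worlds in which it holds to the actual world, with closeness taken as one minus the normalized Hamming distance; this simple averaging measure is just one candidate from the likeness literature, and the atom names serve only the example.

```python
from itertools import product

ATOMS = ("hot", "rainy", "windy")   # propositional atoms of the toy language
ACTUAL = (True, True, True)         # the actual world: it is hot, rainy and windy

def distance(world, actual=ACTUAL):
    """Normalized Hamming distance: fraction of atoms on which two worlds disagree."""
    return sum(a != b for a, b in zip(world, actual)) / len(actual)

def truthlikeness(proposition):
    """Average closeness (1 - distance) to the actual world of the worlds
    in which `proposition` holds. `proposition` is a predicate over worlds."""
    worlds = [w for w in product([True, False], repeat=len(ATOMS)) if proposition(w)]
    return 1 - sum(distance(w) for w in worlds) / len(worlds)

# Two false theories about the weather:
t1 = lambda w: w == (False, True, True)    # "cold, rainy, windy" -- one atom wrong
t2 = lambda w: w == (False, False, False)  # "cold, dry, still"   -- every atom wrong

print(truthlikeness(t1))  # 2/3: closer to the truth, despite being false
print(truthlikeness(t2))  # 0.0: maximally far from the truth
```

On this measure the two false theories are ranked exactly as intuition demands, which is what Popper's own definition could not deliver.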

Related Research Articles

Falsifiability

Falsifiability is a deductive standard of evaluation of scientific theories and hypotheses, introduced by the philosopher of science Karl Popper in his book The Logic of Scientific Discovery (1934). A theory or hypothesis is falsifiable if it can be logically contradicted by an empirical test.

Karl Popper (1902–1994)

Sir Karl Raimund Popper was an Austrian–British philosopher, academic and social commentator. One of the 20th century's most influential philosophers of science, Popper is known for his rejection of the classical inductivist views on the scientific method in favour of empirical falsification. According to Popper, a theory in the empirical sciences can never be proven, but it can be falsified, meaning that it can be scrutinised with decisive experiments. Popper was opposed to the classical justificationist account of knowledge, which he replaced with critical rationalism, namely "the first non-justificational philosophy of criticism in the history of philosophy".

In its most common sense, philosophical methodology is the field of inquiry studying the methods used to do philosophy. But the term can also refer to the methods themselves. It may be understood in a wide sense as the general study of principles used for theory selection, or in a more narrow sense as the study of ways of conducting one's research and theorizing with the goal of acquiring philosophical knowledge. Philosophical methodology investigates both descriptive issues, such as which methods actually have been used by philosophers, and normative issues, such as which methods should be used or how to do good philosophy.

In philosophy, physicalism is the metaphysical thesis that "everything is physical", that there is "nothing over and above" the physical, or that everything supervenes on the physical. Physicalism is a form of ontological monism—a "one substance" view of the nature of reality as opposed to a "two-substance" or "many-substance" (pluralism) view. Both the definition of "physical" and the meaning of physicalism have been debated.

In logic, the semantic principle of bivalence states that every declarative sentence expressing a proposition has exactly one truth value, either true or false. A logic satisfying this principle is called a two-valued logic or bivalent logic.

Truth, or verity, is the property of being in accord with fact or reality. In everyday language, truth is typically ascribed to things that aim to represent reality or otherwise correspond to it, such as beliefs, propositions, and declarative sentences.

Raven paradox

The raven paradox, also known as Hempel's paradox, Hempel's ravens, or rarely the paradox of indoor ornithology, is a paradox arising from the question of what constitutes evidence for the truth of a statement. Observing objects that are neither black nor ravens may formally increase the likelihood that all ravens are black even though, intuitively, these observations are unrelated.

Dialectic, also known as the dialectical method, refers originally to dialogue between people holding different points of view about a subject but wishing to arrive at the truth through reasoned argumentation. Dialectic resembles debate, but the concept excludes subjective elements such as emotional appeal and rhetoric. It has its origins in ancient philosophy and continued to be developed in the Middle Ages.

Alfred Tarski (1901–1983)

Alfred Tarski was a Polish-American logician and mathematician. A prolific author best known for his work on model theory, metamathematics, and algebraic logic, he also contributed to abstract algebra, topology, geometry, measure theory, mathematical logic, set theory, and analytic philosophy.

Deductive reasoning is the mental process of drawing deductive inferences. An inference is deductively valid if its conclusion follows logically from its premises, i.e. it is impossible for the premises to be true and the conclusion to be false.

Truthmaker theory is "the branch of metaphysics that explores the relationships between what is true and what exists". The basic intuition behind truthmaker theory is that truth depends on being. For example, a perceptual experience of a green tree may be said to be true because there actually is a green tree; but if there were no tree there, it would be false. So the experience by itself does not ensure its truth or falsehood; that depends on something else. Expressed more generally, truthmaker theory is the thesis that "the truth of truthbearers depends on the existence of truthmakers". A perceptual experience is the truthbearer in the example above. Various representational entities, like beliefs, thoughts or assertions, can act as truthbearers. Truthmaker theorists are divided about what type of entity plays the role of truthmaker; popular candidates include states of affairs and tropes.

In analytic philosophy, actualism is the view that everything there is is actual. Another phrasing of the thesis is that the domain of unrestricted quantification ranges over all and only actual existents.

Commensurability is a concept in the philosophy of science whereby scientific theories are said to be "commensurable" if scientists can discuss the theories using a shared nomenclature that allows direct comparison of them to determine which one is more valid or useful. On the other hand, theories are incommensurable if they are embedded in starkly contrasting conceptual frameworks whose languages do not overlap sufficiently to permit scientists to directly compare the theories or to cite empirical evidence favoring one theory over the other. Discussed by Ludwik Fleck in the 1930s, and popularized by Thomas Kuhn in the 1960s, the problem of incommensurability results in scientists talking past each other, as it were, while comparison of theories is muddled by confusions about terms, contexts and consequences.

Fallibilism

Originally, fallibilism is the philosophical principle that propositions can be accepted even though they cannot be conclusively proven or justified, or that neither knowledge nor belief is certain. The term was coined in the late nineteenth century by the American philosopher Charles Sanders Peirce, as a response to foundationalism. Theorists, following Austrian-British philosopher Karl Popper, may also refer to fallibilism as the notion that knowledge might turn out to be false. Furthermore, fallibilism is said to imply corrigibilism, the principle that propositions are open to revision. Fallibilism is often juxtaposed with infallibilism.

Logical truth is one of the most fundamental concepts in logic. Broadly speaking, a logical truth is a statement which is true regardless of the truth or falsity of its constituent propositions. In other words, a logical truth is a statement which is not only true, but one which is true under all interpretations of its logical components. Thus, logical truths such as "if p, then p" can be considered tautologies. Logical truths are thought to be the simplest case of statements which are analytically true. All of philosophical logic can be thought of as providing accounts of the nature of logical truth, as well as logical consequence.

The analytic–synthetic distinction is a semantic distinction used primarily in philosophy to distinguish between propositions that are of two types: analytic propositions and synthetic propositions. Analytic propositions are true or not true solely by virtue of their meaning, whereas synthetic propositions' truth, if any, derives from how their meaning relates to the world.

The propensity theory of probability is a probability interpretation in which the probability is thought of as a physical propensity, disposition, or tendency of a given type of situation to yield an outcome of a certain kind, or to yield a long-run relative frequency of such an outcome.

Philosophy of logic is the area of philosophy that studies the scope and nature of logic. It investigates the philosophical problems raised by logic, such as the presuppositions often implicitly at work in theories of logic and in their application. This involves questions about how logic is to be defined and how different logical systems are connected to each other. It includes the study of the nature of the fundamental concepts used by logic and the relation of logic to other disciplines. According to a common characterization, philosophical logic is the part of the philosophy of logic that studies the application of logical methods to philosophical problems, often in the form of extended logical systems like modal logic. But other theorists draw the distinction between the philosophy of logic and philosophical logic differently or not at all. Metalogic is closely related to the philosophy of logic as the discipline investigating the properties of formal logical systems, like consistency and completeness.

Graham Oddie

Graham Oddie is a New Zealand philosopher who lives and works in the United States. He has been Professor of Philosophy at the University of Colorado since 1994.

Definitions of knowledge try to determine the essential features of knowledge. Closely related terms are conception of knowledge, theory of knowledge, and analysis of knowledge. Some general features of knowledge are widely accepted among philosophers, for example, that it constitutes a cognitive success or an epistemic contact with reality and that propositional knowledge involves true belief. Most definitions of knowledge in analytic philosophy focus on propositional knowledge or knowledge-that, as in knowing that Dave is at home, in contrast to knowledge-how (know-how) expressing practical competence. However, despite the intense study of knowledge in epistemology, the disagreements about its precise nature are still both numerous and deep. Some of those disagreements arise from the fact that different theorists have different goals in mind: some try to provide a practically useful definition by delineating its most salient feature or features, while others aim at a theoretically precise definition of its necessary and sufficient conditions. Further disputes are caused by methodological differences: some theorists start from abstract and general intuitions or hypotheses, others from concrete and specific cases, and still others from linguistic usage. Additional disagreements arise concerning the standards of knowledge: whether knowledge is something rare that demands very high standards, like infallibility, or whether it is something common that requires only the possession of some evidence.

References

  1. "Truthlikeness". Stanford Encyclopedia of Philosophy. Retrieved 2019-10-11.
  2. Pavel Tichý (June 1974). "On Popper's Definitions of Verisimilitude". The British Journal for the Philosophy of Science. Oxford University Press. 25 (2): 155–160. doi:10.1093/bjps/25.2.155. JSTOR 686819.
  3. Pavel Tichý (March 1976). "Verisimilitude Redefined". The British Journal for the Philosophy of Science. 27 (1): 25–42. doi:10.1093/bjps/27.1.25. JSTOR 686376.
  4. John H. Harris (June 1974). "Popper's Definitions of 'Verisimilitude'". The British Journal for the Philosophy of Science. 25 (2): 160–166. doi:10.1093/bjps/25.2.160. JSTOR 686820.
  5. David Miller (June 1974). "Popper's Qualitative Theory of Verisimilitude". The British Journal for the Philosophy of Science. 25 (2): 166–177. doi:10.1093/bjps/25.2.166. JSTOR 686821.
  6. Karl Popper (2013) [1983]. W. W. Bartley III (ed.). Realism and the Aim of Science. From the Postscript to the Logic of Scientific Discovery. Abingdon-on-Thames: Routledge. p. xxxvi. ISBN 978-1-1358-5895-7.
  7. Oddie, Graham (2016). "Truthlikeness". Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University.
  8. Zwart, S. D.; Franssen, M. (2007). "An impossibility theorem for verisimilitude". Synthese. 158 (1): 75–92. doi:10.1007/s11229-006-9051-y. S2CID 28812992.
  9. Oddie, Graham (2013). "The content, consequence and likeness approaches to verisimilitude: compatibility, trivialization, and underdetermination". Synthese. 190 (9): 1647–1687. doi:10.1007/s11229-011-9930-8. S2CID 15527839.
  10. Gerla, G. (2007). "Point-free geometry and verisimilitude of theories". Journal of Philosophical Logic. 36 (6): 707–733. doi:10.1007/s10992-007-9059-x. S2CID 29922810.