Conventionalism


Conventionalism is the philosophical attitude that fundamental principles of a certain kind are grounded on (explicit or implicit) agreements in society, rather than on external reality. Unspoken rules, in the form of such implicit agreements, thus play a key role in conventionalist accounts. Although this attitude is commonly held with respect to the rules of grammar, its application to the propositions of ethics, law, science, biology, mathematics, and logic is more controversial.


Linguistics

The debate on linguistic conventionalism goes back to Plato's Cratylus and the philosophy of Kumārila Bhaṭṭa.[citation needed] It has been the standard position of modern linguistics since Ferdinand de Saussure's l'arbitraire du signe, but there have always been dissenting positions of phonosemantics, recently defended by Margaret Magnus and Vilayanur S. Ramachandran.[citation needed]

Philosophy of mathematics

The French mathematician Henri Poincaré was among the first to articulate a conventionalist view. Poincaré's use of non-Euclidean geometries in his work on differential equations convinced him that Euclidean geometry should not be regarded as an a priori truth. He held that axioms in geometry should be chosen for the results they produce, not for their apparent coherence with – possibly flawed – human intuitions about the physical world.

Epistemology

Conventionalism was adopted by logical positivists, chiefly A. J. Ayer and Carl Hempel, and extended to both mathematics and logic. To deny rationalism, Ayer saw two options for empiricism regarding the necessary truth of formal logic (and mathematics): (1) deny that they actually are necessary, and then account for why they merely appear so, or (2) claim that the truths of logic and mathematics lack factual content – they are not "truths about the world" – and then explain how they are nevertheless true and informative.[1] John Stuart Mill adopted the former option, which Ayer criticized; Ayer himself opted for the latter. Ayer's argument relies primarily on the analytic/synthetic distinction.

The French philosopher Pierre Duhem espoused a broader conventionalist view encompassing all of science.[2] Duhem was skeptical that human perceptions are sufficient to understand the "true," metaphysical nature of reality and argued that scientific laws should be valued mainly for their predictive power and correspondence with observations.

Karl Popper broadened the meaning of conventionalism still further. In The Logic of Scientific Discovery, he defined a "conventionalist stratagem" as any technique used by a theorist to evade the consequences of a falsifying observation or experiment. Popper identified four such stratagems: introducing an ad hoc hypothesis that makes the refuting evidence seem irrelevant; modifying the ostensive definitions so as to alter the content of the theory; doubting the reliability of the experimenter whose observations threaten the theory; and casting doubt on the acumen of the theorist.

Popper argued that it was crucial to avoid conventionalist stratagems if the falsifiability of a theory was to be preserved. It has been argued that the standard model of cosmology is built upon a set of conventionalist stratagems.[3]

In the 1930s, the Polish philosopher Kazimierz Ajdukiewicz proposed a view that he called radical conventionalism, as opposed to the moderate conventionalism developed by Henri Poincaré and Pierre Duhem. Radical conventionalism was originally outlined in "The World-Picture and the Conceptual Apparatus", an article published in Erkenntnis in 1934. The theory can be characterized by the following theses: (1) there are languages or, as Ajdukiewicz used to say, conceptual apparatuses (schemes) which are not intertranslatable; (2) any knowledge must be articulated in one of those languages; (3) the choice of a language is arbitrary, and it is possible to change from one language to another.[4] There is therefore a conventional or decisional element in all knowledge, including perceptual knowledge. In his later writings, under the influence of Alfred Tarski, Ajdukiewicz rejected radical conventionalism in favour of a semantic epistemology.

Legal philosophy

Conventionalism, as applied to legal philosophy, is one of the three rival conceptions of law constructed by the American legal philosopher Ronald Dworkin in his work Law's Empire. The other two conceptions of law are legal pragmatism and law as integrity.

According to conventionalism as defined by Dworkin, a community's legal institutions must contain clear social conventions upon which the promulgation of rules relies. Such rules serve as the sole source of guidance for members of the community because they demarcate clearly all the circumstances in which state coercion will and will not be exercised.

Dworkin nonetheless argued that this justification fails to fit the facts, since there are many occasions on which clear, applicable legal rules are absent. It follows, he maintained, that conventionalism can provide no valid ground for state coercion. Dworkin himself favored law as integrity as the best justification of state coercion.

One famous criticism of Dworkin's idea comes from Stanley Fish, who argues that Dworkin, like the Critical Legal Studies movement, Marxists, and adherents of feminist jurisprudence, was guilty of a false 'Theory Hope'. Fish claims that this mistake stems from their shared belief that there exists a general or higher 'theory' that explains or constrains all fields of activity, such as state coercion.

Another criticism is based on Dworkin's assertion that positivists' claims amount to conventionalism. H. L. A. Hart, as a soft positivist, denied this claim, pointing out that citizens cannot always discover the law as a plain matter of fact. It is unclear, however, whether Joseph Raz, an avowed hard positivist, can be classified as a conventionalist, since Raz claimed that law is composed "exclusively" of social facts, which may be complex and thus difficult to discover.

In particular, Dworkin has characterized law as having the main function of restraining the state's coercion.[citation needed] Nigel Simmonds has rejected Dworkin's disapproval of conventionalism, claiming that his characterization of law is too narrow.



References

  1. Ayer, Alfred Jules. Language, Truth and Logic. New York: Dover Publications, 1952. p. 73.
  2. Ben-Menahem, Yemima. Conventionalism: From Poincaré to Quine. Cambridge University Press, 2006. p. 39.
  3. Merritt, David (2017). "Cosmology and convention". Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics. 57: 41–52. arXiv:1703.02389. Bibcode:2017SHPMP..57...41M. doi:10.1016/j.shpsb.2016.12.002. S2CID 119401938.
  4. See: J. Giedymin, Editor's Introduction, in: K. Ajdukiewicz, The Scientific World-Perspective and Other Essays 1961–1963, ed. by J. Giedymin, Synthese Library, vol. 108, Dordrecht 1978, pp. XIX–XX. To this brief characterization Giedymin adds that, according to Ajdukiewicz, the nature of changes in science throughout its history is discontinuous.
