Quasi-empiricism in mathematics

Quasi-empiricism in mathematics is the attempt in the philosophy of mathematics to direct philosophers' attention to mathematical practice, in particular its relations with physics, the social sciences, and computational mathematics, rather than solely to issues in the foundations of mathematics. The discussion touches several topics: the relationship of empiricism to mathematics (see Penelope Maddy), questions of realism, the importance of culture, and the necessity of application.

Primary arguments

A primary argument for quasi-empiricism is that although mathematics and physics are frequently considered closely linked fields of study, this perception may reflect human cognitive bias. It is claimed that rigorous application of appropriate empirical methods or mathematical practice in either field would nonetheless be insufficient to rule out alternative approaches.

Eugene Wigner (1960) [1] noted that this culture need not be restricted to mathematics, physics, or even humans. He stated further that "The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve. We should be grateful for it and hope that it will remain valid in future research and that it will extend, for better or for worse, to our pleasure, even though perhaps also to our bafflement, to wide branches of learning." Wigner used several examples to demonstrate why 'bafflement' is an appropriate description, such as showing how mathematics adds to situational knowledge in ways that are either not possible otherwise or are so far outside normal thought as to attract little notice. Another example is the predictive ability that a mathematical system can support, in the sense of describing potential phenomena before they are observed.

Following up on Wigner, Richard Hamming (1980) [2] took applications of mathematics as a central theme and suggested that successful use can sometimes trump proof, in the following sense: where a theorem has evident veracity through applicability, later evidence that its proof is problematic would result more in attempts to firm up the theorem than in attempts to redo the applications or to deny the results obtained to date. Hamming offered four explanations for the 'effectiveness' we see with mathematics, and clearly saw the topic as worthy of discussion and study:

  1. "We see what we look for." This is one reason 'quasi' is apropos in this discussion.
  2. "We select the kind of mathematics to use." Our use and modification of mathematics are essentially situational and goal-driven.
  3. "Science in fact answers comparatively few problems." The set of problems still to be examined is far larger.
  4. "The evolution of man provided the model." There may be limits attributable to the human element.

For Willard Van Orman Quine (1960), [3] existence is only existence in a structure. This position is relevant to quasi-empiricism because Quine held that the evidence supporting theorizing about the structure of the world is the same evidence that supports theorizing about mathematical structures. [4]

Hilary Putnam (1975) [5] stated that mathematics had accepted informal proofs and proof by authority, and had made and corrected errors all through its history. He also stated that Euclid's system of proving geometry theorems was unique to the classical Greeks and did not evolve similarly in the mathematical cultures of China, India, and Arabia. This and other evidence led many mathematicians to reject the label of Platonist, along with Plato's ontology, which, together with the methods and epistemology of Aristotle, had served as a foundational ontology for the Western world since its beginnings. A truly international culture of mathematics would, Putnam and others (1983) [6] argued, necessarily be at least 'quasi'-empirical (embracing 'the scientific method' for consensus if not experiment).

Imre Lakatos (1976), [7] who did his original work on this topic for his dissertation (1961, Cambridge), argued for 'research programs' as a means to support a basis for mathematics and considered thought experiments as appropriate to mathematical discovery. Lakatos may have been the first to use 'quasi-empiricism' in the context of this subject.

Operational aspects

Several recent works pertain to this topic. Gregory Chaitin's and Stephen Wolfram's work applies here, though their positions may be considered controversial. Chaitin (1997/2003) [8] suggests an underlying randomness to mathematics, and Wolfram (A New Kind of Science, 2002) [9] argues that undecidability may have practical relevance, that is, may be more than an abstraction.

Another relevant addition would be the discussions concerning interactive computation, especially those related to the meaning and use of Turing's model (Church-Turing thesis, Turing machines, etc.).
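As a concrete reminder of the model under discussion, here is a minimal single-tape Turing machine simulator; this is an illustrative sketch, not code drawn from the cited works, and the rule table (a binary-increment machine, a standard textbook example) and function names are my own:

```python
# A minimal Turing machine simulator. The machine defined below increments
# a binary number: it scans right to the end of the input, then carries
# left until it can write a 1.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Simulate a single-tape Turing machine.

    rules maps (state, symbol) -> (new_symbol, move, new_state), where
    move is -1 (left) or +1 (right). The machine stops in state 'halt'.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[pos] = new_symbol
        pos += move
    out = [cells.get(i, blank) for i in range(min(cells), max(cells) + 1)]
    return "".join(out).strip(blank)

# Rules for binary increment.
rules = {
    ("start", "0"): ("0", +1, "start"),   # scan right over the digits
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "carry"),   # hit the end; start carrying
    ("carry", "1"): ("0", -1, "carry"),   # 1 + carry = 0, carry continues
    ("carry", "0"): ("1", -1, "halt"),    # 0 + carry = 1, done
    ("carry", "_"): ("1", -1, "halt"),    # overflow: prepend a 1
}

print(run_turing_machine(rules, "1011"))  # prints 1100 (11 + 1 = 12)
```

The Church–Turing thesis asserts that this kind of device captures everything effectively calculable; the interactive-computation literature mentioned above questions whether such closed, input-to-output runs are the right model for all of computational practice.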

These works are heavily computational and raise another set of issues. To quote Chaitin (1997/2003):

Now everything has gone topsy-turvy. It's gone topsy-turvy, not because of any philosophical argument, not because of Gödel's results or Turing's results or my own incompleteness results. It's gone topsy-turvy for a very simple reason—the computer! [8]:96

The collection of "Undecidables" in Wolfram (A New Kind of Science, 2002) [9] is another example.
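The kind of machine-generated evidence behind Chaitin's remark can be sketched with a classic example of experimental mathematics: checking Goldbach's conjecture (every even number greater than 2 is a sum of two primes) up to a bound. This is an illustrative sketch of quasi-empirical practice, not code from the cited works; no finite computation proves the conjecture, but the accumulated evidence shapes mathematicians' confidence in it:

```python
# Experimental check of Goldbach's conjecture for small even numbers.

def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def goldbach_witness(n):
    """Return a prime pair (p, n - p) for even n > 2, or None if none exists."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Evidence, not proof: every even number in this range has a witness.
assert all(goldbach_witness(n) for n in range(4, 10000, 2))
print(goldbach_witness(100))  # prints (3, 97)
```

A failure of the assertion would refute the conjecture outright; its success merely extends the empirical record, which is precisely the asymmetry the quasi-empiricists emphasize.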

Wegner and Goldin's 2006 paper "Principles of Problem Solving" [10] suggests that interactive computation can give mathematics a more appropriate framework (an empirical one) than can be founded on rationalism alone. A related argument is that the function (even one defined recursively ad infinitum) is too simple a construct to handle the reality of entities that resolve, via computation or some type of analog process, n-dimensional systems (in the general sense of the word).

See also

  - Church–Turing thesis
  - Fallibilism
  - Foundations of mathematics
  - Gödel's incompleteness theorems
  - Gregory Chaitin
  - Halting problem
  - Hilary Putnam
  - Hypercomputation
  - Imre Lakatos
  - Mathematical proof
  - A New Kind of Science
  - Philosophy of mathematics
  - Proof of impossibility
  - "The Unreasonable Effectiveness of Mathematics in the Natural Sciences"
  - The unreasonable ineffectiveness of mathematics
  - Undecidable problem
  - Unknowability
  - Willard Van Orman Quine

References

  1. Eugene Wigner, 1960, "The Unreasonable Effectiveness of Mathematics in the Natural Sciences", Communications on Pure and Applied Mathematics 13.
  2. R. W. Hamming, 1980, "The Unreasonable Effectiveness of Mathematics", The American Mathematical Monthly 87 (2), February 1980.
  3. Willard Van Orman Quine, 1960, Word and Object, MIT Press, p. 22.
  4. Paul Ernest (ed.), Mathematics Education and Philosophy: An International Perspective, Routledge, 2003, p. 45.
  5. Putnam, Hilary, 1975, Mind, Language, and Reality: Philosophical Papers, Volume 2, Cambridge University Press, Cambridge, UK. ISBN 88-459-0257-9
  6. Benacerraf, Paul, and Putnam, Hilary (eds.), 1983, Philosophy of Mathematics: Selected Readings, 1st edition, Prentice–Hall, Englewood Cliffs, NJ, 1964; 2nd edition, Cambridge University Press, Cambridge, UK, 1983.
  7. Lakatos, Imre, 1976, Proofs and Refutations, Cambridge University Press, Cambridge, UK. ISBN 0-521-29038-4
  8. Chaitin, Gregory J., 1997/2003, The Limits of Mathematics, Springer-Verlag, New York, NY. ISBN 1-85233-668-4
  9. Wolfram, Stephen, 2002, A New Kind of Science ("Undecidables"), Wolfram Media, Champaign, IL. ISBN 1-57955-008-8
  10. Peter Wegner and Dina Goldin, 2006, "Principles of Problem Solving", Communications of the ACM 49, pp. 27–29.