| Patrick Grim | |
| --- | --- |
| Born | Patrick Neal Grim, October 29, 1950 |
| Citizenship | United States |
| Alma mater | Boston University (PhD, 1976); University of St Andrews (BPhil, 1975); University of California, Santa Cruz (A.B., Anthropology and Philosophy, 1971) |
| Era | Contemporary philosophy |
| Region | Western philosophy |
| Institutions | State University of New York at Stony Brook; University of Michigan at Ann Arbor |
| Main interests | Philosophy of religion, computational philosophy, philosophy of logic, philosophy of science |
Patrick Grim is an American philosopher. He has published on epistemic questions in philosophy of religion,[1][2] as well as topics in philosophy of science,[3][4] philosophy of logic,[5][6][7] computational philosophy,[8][9] and agent-based modeling.[10] He is the author, co-author, or editor of seven books in philosophical logic, philosophy of mind, philosophy of science, and computational philosophy. He is currently editor of the American Philosophical Quarterly and founding co-editor of over forty volumes of The Philosopher's Annual, an attempt to collect the ten best philosophy articles of the year. Grim's popular work includes four video lecture series on value theory, informal logic, and philosophy of mind for The Great Courses.[11] Grim's academic posts have included Distinguished Teaching Professor of Philosophy (Emeritus) at the State University of New York at Stony Brook,[12][13] Distinguished Visiting Professor in Philosophy at the University of Michigan at Ann Arbor, and fellowships and lectureships at the Center for Complex Systems at the University of Michigan at Ann Arbor and at the Center for Philosophy of Science at the University of Pittsburgh.[12][14][15]
The classic picture of the philosopher is of an individual working alone with quill and paper, formal philosophy taking the shape of theorems. Grim departs from that picture in two respects: he is known as an innovator in computational philosophy, working extensively with computer modeling, and his work has often been conducted by research teams.

Starting with work leading up to The Philosophical Computer, Grim, Paul St. Denis, and Gary Mar used results from chaos theory and fractal geometry as an inspiration for modeling self-reference in infinite-valued logics, and embodied game theory within cellular automata to obtain results on the evolution of cooperation and on the computational universality and formal undecidability of the spatialized prisoner's dilemma. In later work with other research teams he developed models of meaning, language acquisition, and Gricean pragmatics using simple agents embedded in a spatialized cellular automaton environment of predators and prey, with learning techniques including simple imitation, localized genetic algorithms, and neural nets. Similar tools were applied with another team to questions of prejudice reduction, with an eye to the contact hypothesis in social psychology and using graphical analyses of model robustness.

Leading a strongly cross-disciplinary team under the auspices of the Modeling Infectious Disease Agent Study, Grim developed network models of health-care belief dynamics and polarization in Black and White communities based on data from the Greater Pittsburgh Random Household Health Survey. Grim then turned to models of scientific communication on epistemic landscapes, which branched out into models of differences in network information transfer by way of 'germs, genes, and memes.' With a particularly long-lasting research team, his work has used agent-based modeling to focus on issues of opinion polarization, with implications for political representation structures, the dynamics of jury deliberations, and formal measures of polarization; the role of expertise in group deliberations has also been part of the picture. Most recently, with a further research group, Grim has developed Bayesian network models of scientific theories as 'webs of belief,' drawing implications regarding theory sensitivity to evidence at different points and a Kuhnian punctuated equilibrium of scientific change.
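The flavor of the spatialized prisoner's dilemma work can be conveyed with a minimal sketch. The Python program below is an illustration only: the payoff values, grid size, and imitate-the-best-neighbor update rule are generic textbook choices, not the parameters of Grim's published models.

```python
import random

# Prisoner's dilemma payoffs: (my move, neighbor's move) -> my score,
# with the standard ordering T > R > P > S (5 > 3 > 1 > 0).
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

SIZE = 20  # agents live on a 20 x 20 toroidal grid

def neighbors(cell):
    """Moore neighborhood of a cell on the wrapped grid."""
    i, j = cell
    return [((i + di) % SIZE, (j + dj) % SIZE)
            for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if (di, dj) != (0, 0)]

# Start from a random mix of unconditional cooperators and defectors.
grid = {(i, j): random.choice("CD") for i in range(SIZE) for j in range(SIZE)}

for generation in range(50):
    # Each agent plays the dilemma against all eight neighbors.
    score = {cell: sum(PAYOFF[(strategy, grid[n])] for n in neighbors(cell))
             for cell, strategy in grid.items()}
    # Spatialized imitation dynamics: each agent adopts the strategy of the
    # highest-scoring agent in its neighborhood (itself included).
    grid = {cell: grid[max(neighbors(cell) + [cell], key=score.get)]
            for cell in grid}

coop = sum(1 for s in grid.values() if s == "C") / len(grid)
print(f"fraction cooperating after 50 generations: {coop:.2f}")
```

Even toy versions of such models show how spatial clustering can let pockets of cooperators persist against defectors, the kind of phenomenon the team investigated in far greater detail.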
Within philosophy of religion, Grim is known for a Cantorian argument against the possibility of omniscience. In its simplest and original set-theoretic form (elaborated and buttressed in later work):
There can be no set of all truths. Given any set of truths T, there will be a power set PT of all subsets of that set. For each element of that power set there will be a unique truth: that a chosen truth t is, or is not, a member of that subset, for example. But by Cantor's theorem the power set PT of any set T is larger than the set T itself: any one-to-one mapping of elements of PT to elements of T is bound to leave some element of PT out. Any set of truths will therefore leave some truth out: there can be no set of all truths. But what an omniscient being would have to know would appear to be precisely a set of all truths. There can therefore be no omniscient being.
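The core of the argument can be put in compact set-theoretic notation. The following is a reconstruction in standard notation, not a quotation of Grim's own formulation:

```latex
% Reconstruction of the Cantorian argument against a set of all truths.
% Suppose, for reductio, that T is a set of all truths.
\begin{align*}
&\text{For each } S \in \mathcal{P}(T) \text{ there is a distinct truth } t_S,\\
&\quad\text{e.g.\ the truth that } t \in S \text{ or that } t \notin S
  \text{, for some fixed truth } t,\\
&\text{so the map } S \mapsto t_S \text{ would give an injection }
  \mathcal{P}(T) \hookrightarrow T.\\
&\text{But Cantor's theorem gives } |\mathcal{P}(T)| > |T|,
  \text{ so no such injection exists.}\\
&\text{Hence } T \text{ omits some truth, and there is no set of all truths.}
\end{align*}
```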
● Patrick Grim, Frank Seidl, Calum McNamara, Isabell N. Astor, and Caroline Diaso, "The Punctuated Equilibrium of Scientific Change: A Bayesian Network Model," Synthese 200 (2022): 1-25.
● Patrick Grim, Trina Kokalis, Ali Alai-Tafti, Nick Kilb, and Paul St. Denis, "Making Meaning Happen," Journal of Experimental & Theoretical Artificial Intelligence 16 (2004): 209-244.
● Patrick Grim, "Simulating Grice: Emergent Pragmatics in Spatialized Game Theory," in Anton Benz, Christian Ebert, and Robert van Rooij (eds.), Language, Games, and Evolution (Springer-Verlag, 2011).
● Patrick Grim, Daniel J. Singer, Aaron Bramson, Bennett Holman, Sean McGeehan, and William J. Berger, "Diversity, Ability, and Expertise in Epistemic Communities," Philosophy of Science 86 (2019): 98-123.
● Patrick Grim, Daniel J. Singer, Steven Fisher, Aaron Bramson, William J. Berger, Christopher Reade, Carissa Flocken, and Adam Sales, "Scientific Networks on Data Landscapes: Question Difficulty, Epistemic Success, and Convergence," Episteme 10 (2013): 441-464.
● Patrick Grim, Daniel J. Singer, Christopher Reade, and Steven Fisher, "Germs, Genes, and Memes: Functional and Fitness Dynamics on Information Networks," Philosophy of Science 82 (2015).
● Patrick Grim, "Threshold Phenomena in Epistemic Networks," Proceedings, AAAI Fall Symposium on Complex Adaptive Systems and the Threshold Effect, FS-09-03, AAAI Press, 2009.
● Patrick Grim, Evan Selinger, William Braynen, Robert Rosenberger, Randy Au, Nancy Louie, and John Connolly, "Modeling Prejudice Reduction: Spatialized Game Theory and the Contact Hypothesis," Public Affairs Quarterly 19 (2005): 95-126.
● Aaron Bramson, Patrick Grim, Daniel J. Singer, William J. Berger, Graham Sack, Steven Fisher, Carissa Flocken, and Bennett Holman, "Understanding Polarization: Meanings, Measures, and Model Evaluation," Philosophy of Science 84 (2017): 115-159.
● Patrick Grim, "The Undecidability of the Spatialized Prisoner's Dilemma," Theory and Decision 42 (1997): 53-80.
Cognitive science is the interdisciplinary, scientific study of the mind and its processes. It examines the nature, the tasks, and the functions of cognition. Mental faculties of concern to cognitive scientists include language, perception, memory, attention, reasoning, and emotion; to understand these faculties, cognitive scientists borrow from fields such as linguistics, psychology, artificial intelligence, philosophy, neuroscience, and anthropology. The typical analysis of cognitive science spans many levels of organization, from learning and decision-making to logic and planning; from neural circuitry to modular brain organization. One of the fundamental concepts of cognitive science is that "thinking can best be understood in terms of representational structures in the mind and computational procedures that operate on those structures."
Consciousness Explained is a 1991 book by the American philosopher Daniel Dennett, in which the author offers an account of how consciousness arises from the interaction of physical and cognitive processes in the brain. Dennett describes consciousness as an account of the various calculations occurring in the brain at close to the same time. He compares consciousness to an academic paper being developed or edited in the hands of multiple people at one time, the "multiple drafts" model of consciousness. In this analogy, "the paper" exists even though there is no single, unified paper. When people report on their inner experiences, Dennett considers their reports to be more like theorizing than like describing. These reports may be informative, he says, but a psychologist is not to take them at face value. Dennett describes several phenomena that show that perception is more limited and less reliable than we perceive it to be.
In philosophical epistemology, there are two types of coherentism: the coherence theory of truth, and the coherence theory of justification.
In logic, temporal logic is any system of rules and symbolism for representing, and reasoning about, propositions qualified in terms of time. It is sometimes also used to refer to tense logic, a modal logic-based system of temporal logic introduced by Arthur Prior in the late 1950s, with important contributions by Hans Kamp. It has been further developed by computer scientists, notably Amir Pnueli, and logicians.
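For illustration, the standard operators of linear temporal logic, in conventional notation (this rendering is a textbook summary, not drawn from this article's sources):

```latex
% Standard temporal operators, read over a sequence of moments:
\mathbf{G}\,\varphi \;\; (\varphi \text{ holds always}), \qquad
\mathbf{F}\,\varphi \;\; (\varphi \text{ holds eventually}), \qquad
\mathbf{X}\,\varphi \;\; (\varphi \text{ holds at the next moment}), \qquad
\varphi\,\mathbf{U}\,\psi \;\; (\varphi \text{ holds until } \psi \text{ does})
```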
Modal logic is a kind of logic used to represent statements about necessity and possibility. It plays a major role in philosophy and related fields as a tool for understanding concepts such as knowledge, obligation, and causation. For instance, in epistemic modal logic, the formula □P can be used to represent the statement that P is known. In deontic modal logic, that same formula can represent that P is a moral obligation.
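The possible-worlds reading of the box operator can be made precise with the usual Kripke truth condition (a standard textbook formulation, given here only as illustration):

```latex
% \Box P holds at world w iff P holds at every world v accessible from w:
M, w \models \Box P
  \iff
\forall v \,( w\,R\,v \rightarrow M, v \models P )
```

Reading R as epistemic accessibility makes □P say that P is known; reading R as deontic accessibility makes the same formula say that P is obligatory.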
Jerry Alan Fodor was an American philosopher and the author of many crucial works in the fields of philosophy of mind and cognitive science. His writings in these fields laid the groundwork for the modularity of mind and the language of thought hypotheses, and he is recognized as having had "an enormous influence on virtually every portion of the philosophy of mind literature since 1960." At the time of his death in 2017, he held the position of State of New Jersey Professor of Philosophy, Emeritus, at Rutgers University, and had taught previously at the City University of New York Graduate Center and MIT.
Applied philosophy is a branch of philosophy that studies philosophical problems of practical concern. The topic covers a broad spectrum of issues in environment, medicine, science, engineering, policy, law, politics, economics, and education. The term was popularised in 1982 by the founding of the Society for Applied Philosophy by Brenda Almond and the subsequent publication of its journal, the Journal of Applied Philosophy, edited by Elizabeth Brake. Methods of applied philosophy are similar to other philosophical methods, including questioning, dialectic, critical discussion, rational argument, systematic presentation, thought experiments, and logical argumentation.
Nicholas Rescher was a German-born American philosopher, polymath, and author, who was a professor of philosophy at the University of Pittsburgh from 1961. He was chairman of the Center for Philosophy of Science and chairman of the philosophy department.
Perspectivism is the epistemological principle that perception of and knowledge of something are always bound to the interpretive perspectives of those observing it. While perspectivism does not regard all perspectives and interpretations as being of equal truth or value, it holds that no one has access to an absolute view of the world cut off from perspective. Instead, all such viewing occurs from some point of view which in turn affects how things are perceived. Rather than attempt to determine truth by correspondence to things outside any perspective, perspectivism thus generally seeks to determine truth by comparing and evaluating perspectives among themselves. Perspectivism may be regarded as an early form of epistemological pluralism, though on some accounts it also includes treatment of value theory, moral psychology, and realist metaphysics.
Naturalized epistemology is a collection of philosophic views concerned with the theory of knowledge that emphasize the role of natural scientific methods. This shared emphasis on scientific methods of studying knowledge shifts focus to the empirical processes of knowledge acquisition and away from many traditional philosophical questions. There are noteworthy distinctions within naturalized epistemology. Replacement naturalism maintains that traditional epistemology should be abandoned and replaced with the methodologies of the natural sciences. The general thesis of cooperative naturalism is that traditional epistemology can benefit in its inquiry by using the knowledge we have gained from the cognitive sciences. Substantive naturalism focuses on an asserted equality of facts of knowledge and natural facts.
Laurence BonJour is an American philosopher and Professor Emeritus of Philosophy at the University of Washington.
Philosophy and economics studies topics such as public economics, behavioural economics, rationality, justice, history of economic thought, rational choice, the appraisal of economic outcomes, institutions and processes, the status of highly idealized economic models, the ontology of economic phenomena and the possibilities of acquiring knowledge of them.
Computational epistemology is a subdiscipline of formal epistemology that studies the intrinsic complexity of inductive problems for ideal and computationally bounded agents. In short, computational epistemology is to induction what recursion theory is to deduction. It has been applied to problems in philosophy of science.
Originally, fallibilism is the philosophical principle that propositions can be accepted even though they cannot be conclusively proven or justified, or that neither knowledge nor belief is certain. The term was coined in the late nineteenth century by the American philosopher Charles Sanders Peirce, as a response to foundationalism. Theorists, following Austrian-British philosopher Karl Popper, may also refer to fallibilism as the notion that knowledge might turn out to be false. Furthermore, fallibilism is said to imply corrigibilism, the principle that propositions are open to revision. Fallibilism is often juxtaposed with infallibilism.
Formal epistemology uses formal methods from decision theory, logic, probability theory and computability theory to model and reason about issues of epistemological interest. Work in this area spans several academic fields, including philosophy, computer science, economics, and statistics. The focus of formal epistemology has tended to differ somewhat from that of traditional epistemology, with topics like uncertainty, induction, and belief revision garnering more attention than the analysis of knowledge, skepticism, and issues with justification.
Probabilistic logic involves the use of probability and logic to deal with uncertain situations. Probabilistic logic extends traditional logic truth tables with probabilistic expressions. A difficulty of probabilistic logics is their tendency to multiply the computational complexities of their probabilistic and logical components. Other difficulties include the possibility of counter-intuitive results, such as in the case of belief fusion in Dempster–Shafer theory. Source trust and epistemic uncertainty about the probabilities sources provide, as treated for example in subjective logic, are additional elements to consider. The need to deal with a broad variety of contexts and issues has led to many different proposals.
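A small example of the truth-table extension: given only the marginal probabilities of two propositions, their conjunction is constrained to an interval rather than fixed to a single value. The Python sketch below uses the classical Fréchet bounds; it is an illustration of the general idea, not a fragment of any particular probabilistic-logic system named above.

```python
# Fréchet bounds: knowing only P(A) and P(B), the probability of the
# conjunction "A and B" is confined to an interval, illustrating how
# probabilistic logic replaces a single truth-table entry with a range.
def conj_bounds(p_a: float, p_b: float) -> tuple[float, float]:
    lower = max(0.0, p_a + p_b - 1.0)  # tightest when A and B anti-correlate
    upper = min(p_a, p_b)              # tightest when one proposition implies the other
    return lower, upper

print(conj_bounds(0.7, 0.8))  # (0.5, 0.7)
```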
Epistemology or theory of knowledge is the branch of philosophy concerned with the nature and scope (limitations) of knowledge. It addresses the questions "What is knowledge?", "How is knowledge acquired?", "What do people know?", "How do we know what we know?", and "Why do we know what we know?". Much of the debate in this field has focused on analyzing the nature of knowledge and how it relates to similar notions such as truth, belief, and justification. It also deals with the means of production of knowledge, as well as skepticism about different knowledge claims.
Gary R. Mar is an American philosopher and logician specializing in logic, the philosophy of logic, the philosophy of mathematics, analytic philosophy, philosophy of language and linguistics, philosophy of science, computational philosophy, the philosophy of religion, and Asian American philosophy. Professor Mar is a member of the Philosophy Department at Stony Brook University. Gary Mar was the last student to have a Ph.D. directed by Alonzo Church. He is co-author with Donald Kalish and Richard Montague of the second edition of Logic: Techniques of Formal Reasoning.
In logic, a finite-valued logic is a propositional calculus in which truth values are discrete. Traditionally, in Aristotle's logic, bivalent logic, also known as binary logic, was the norm, as the law of the excluded middle precluded more than two possible values for any proposition. Modern three-valued logics allow for an additional possible truth value.
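As a minimal sketch, here is one standard three-valued system, strong Kleene logic, in which the third value reads as "unknown" (a common textbook choice; other three-valued logics assign the connectives differently):

```python
from fractions import Fraction

# Strong Kleene three-valued logic: True = 1, Unknown = 1/2, False = 0.
T, U, F = Fraction(1), Fraction(1, 2), Fraction(0)

def neg(a):      # negation reflects the value around 1/2
    return 1 - a

def conj(a, b):  # conjunction takes the minimum value
    return min(a, b)

def disj(a, b):  # disjunction takes the maximum value
    return max(a, b)

# The law of the excluded middle fails for the third value:
# "P or not-P" comes out Unknown when P is Unknown.
print(disj(U, neg(U)))  # 1/2, not 1
```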
Computational philosophy or digital philosophy is the use of computational techniques in philosophy. It includes concepts such as computational models, algorithms, simulations, games, etc. that help in the research and teaching of philosophical concepts, as well as specialized online encyclopedias and graphical visualizations of relationships among philosophers and concepts. The use of computers in philosophy has gained momentum as computer power and the availability of data have increased greatly. This, along with the development of many new techniques that use those computers and data, has opened many new ways of doing philosophy that were not available before. It has also led to new insights in philosophy.