Relevance

Relevance is the concept of one topic being connected to another topic in a way that makes it useful to consider the second topic when considering the first. The concept of relevance is studied in many different fields, including the cognitive sciences, logic, and library and information science. Most fundamentally, however, it is studied in epistemology (the theory of knowledge). Different theories of knowledge have different implications for what is considered relevant, and these fundamental views have implications for all other fields as well.

Definition

"Something (A) is relevant to a task (T) if it increases the likelihood of accomplishing the goal (G), which is implied by T." (Hjørland & Sejer Christensen, 2002). [1]

A thing, a document, or a piece of information may be relevant. The basic understanding of relevance does not depend on whether we speak of "things" or "information". For example, the Gandhian principles are of great relevance in today's world.

Epistemology

If you believe that schizophrenia is caused by bad communication between mother and child, then family interaction studies become relevant. If, on the other hand, you subscribe to a genetic theory of schizophrenia, then the study of genes becomes relevant. If you subscribe to the epistemology of empiricism, then only intersubjectively controlled observations are relevant. If, on the other hand, you subscribe to feminist epistemology, then the sex of the observer becomes relevant.

Epistemology is not just one domain among others. Epistemological views are always at play in any domain, and those views determine or influence what is regarded as relevant.

Logic


In formal reasoning, relevance has proved an important but elusive concept. It is important because the solution of any problem requires the prior identification of the relevant elements from which the solution can be constructed. It is elusive because the meaning of relevance appears to be difficult or impossible to capture within conventional logical systems. The obvious suggestion, that q is relevant to p if q is implied by p, breaks down because under standard definitions of material implication a false proposition implies every other proposition. Thus, although 'iron is a metal' may be implied by 'cats lay eggs', the two do not seem relevant to each other in the way that 'cats are mammals' and 'mammals give birth to living young' are relevant to each other. If one person states "I love ice cream" and another responds "I have a friend named Brad Cook", the two statements are not relevant to each other. If the response is instead "I have a friend named Brad Cook who also likes ice cream", the statement becomes relevant because it relates to the first person's idea.
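The breakdown described above can be seen directly from the truth table of material implication. The following minimal sketch (an illustration, not from the source) defines p → q as ¬p ∨ q and shows that a false antecedent makes the conditional true regardless of any connection between the propositions:

```python
def implies(p: bool, q: bool) -> bool:
    """Material implication: p -> q is false only when p is true and q is false."""
    return (not p) or q

cats_lay_eggs = False      # a false proposition
iron_is_a_metal = True     # a true but unrelated proposition

# With a false antecedent, the conditional holds whatever the consequent says:
print(implies(cats_lay_eggs, iron_is_a_metal))        # True
print(implies(cats_lay_eggs, not iron_is_a_metal))    # also True
```

Because both conditionals come out true, "implied by" cannot serve as a test of relevance: it fails to distinguish connected propositions from unconnected ones.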

Another proposal defines relevance or, more accurately, irrelevance information-theoretically. [2] It is easiest to state in terms of variables, which might reflect the values of measurable hypotheses or observation statements. The conditional entropy of an observation variable e conditioned on a variable h characterizing alternative hypotheses provides a measure of the irrelevance of the observation variable e to the set of competing hypotheses characterized by h. It is useful combined with measures of the information content of the variable e in terms of its entropy. One can then subtract the content of e that is irrelevant to h (given by its conditional entropy conditioned on h) from the total information content of e (given by its entropy) to calculate the amount of information the variable e contains about the set of hypotheses characterized by h. Relevance (via the concept of irrelevance) and information content then characterize the observation variable and can be used to measure its sensitivity and specificity (respectively) as a test for alternative hypotheses.
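The subtraction described above can be sketched numerically. In this hedged example, the joint distribution over a binary hypothesis variable h and a binary observation variable e is invented purely for illustration; the conditional entropy H(e|h) is obtained via the chain rule H(e|h) = H(h,e) − H(h) and subtracted from H(e):

```python
import math

# Illustrative joint distribution P(h, e); the numbers are not from the source.
joint = {
    ("h0", "e0"): 0.30, ("h0", "e1"): 0.20,
    ("h1", "e0"): 0.10, ("h1", "e1"): 0.40,
}

def entropy(dist):
    """Shannon entropy in bits of a probability table."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions of e and h.
p_e, p_h = {}, {}
for (h, e), p in joint.items():
    p_e[e] = p_e.get(e, 0.0) + p
    p_h[h] = p_h.get(h, 0.0) + p

H_e = entropy(p_e)                           # total information content of e
H_e_given_h = entropy(joint) - entropy(p_h)  # H(e|h): the part of e irrelevant to h
relevant_info = H_e - H_e_given_h            # information e carries about h
```

Here `relevant_info` is the mutual information between e and h: the total content of e minus the part that remains uncertain even once the hypothesis is known.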

More recently a number of theorists[ who? ] have sought to account for relevance in terms of "possible world logics" in intensional logic. Roughly, the idea is that necessary truths are true in all possible worlds, contradictions (logical falsehoods) are true in no possible worlds, and contingent propositions can be ordered in terms of the number of possible worlds in which they are true. Relevance is argued to depend upon the "remoteness relationship" between an actual world in which relevance is being evaluated and the set of possible worlds within which it is true.

Application

Cognitive science and pragmatics

In 1986, Dan Sperber and Deirdre Wilson drew attention to the central importance of relevance decisions in reasoning and communication. They proposed an account of the process of inferring relevant information from any given utterance. To do this work, they used what they called the "Principle of Relevance": namely, the position that any utterance addressed to someone automatically conveys the presumption of its own optimal relevance. The central idea of Sperber and Wilson's theory is that all utterances are encountered in some context, and the correct interpretation of a particular utterance is the one that allows most new implications to be made in that context on the basis of the least amount of information necessary to convey it. For Sperber and Wilson, relevance is conceived as relative or subjective, as it depends upon the state of knowledge of a hearer when they encounter an utterance.

Sperber and Wilson stress that this theory is not intended to account for every intuitive application of the English word "relevance". Relevance, as a technical term, is restricted to relationships between utterances and interpretations, and so the theory cannot account for intuitions such as the one that relevance relationships obtain in problems involving physical objects. If a plumber needs to fix a leaky faucet, for example, some objects and tools are relevant (e.g. a wrench) and others are not (e.g. a waffle iron). And, moreover, the latter seems to be irrelevant in a manner which does not depend upon the plumber's knowledge, or the utterances used to describe the problem.

A theory of relevance that seems to be more readily applicable to such instances of physical problem solving has been suggested by Gorayska and Lindsay in a series of articles published during the 1990s. The key feature of their theory is the idea that relevance is goal-dependent. An item (e.g., an utterance or object) is relevant to a goal if and only if it can be an essential element of some plan capable of achieving the desired goal. This theory embraces both propositional reasoning and the problem-solving activities of people such as plumbers, and defines relevance in such a way that what is relevant is determined by the real world (because what plans will work is a matter of empirical fact) rather than the state of knowledge or belief of a particular problem solver.

Economics

The economist John Maynard Keynes saw the importance of defining relevance to the problem of calculating risk in economic decision-making. He suggested that the relevance of a piece of evidence, such as a true proposition, should be defined in terms of the changes it produces in estimates of the probability of future events. Specifically, Keynes proposed that new evidence e is irrelevant to a proposition x, given old evidence q, if and only if P(x | e ∧ q) = P(x | q); otherwise, the evidence is relevant.

There are technical problems with this definition; for example, the relevance of a piece of evidence can be sensitive to the order in which other pieces of evidence are received.
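Keynes's criterion can be illustrated with a toy probability model (the distribution below is hypothetical, chosen so that e happens to be irrelevant to x given q): evidence is irrelevant exactly when conditioning on it leaves the probability of the proposition unchanged.

```python
# Worlds are (x, e, q) truth-value triples mapped to probabilities (illustrative).
joint = {
    (True,  True,  True ): 0.10, (True,  True,  False): 0.15,
    (True,  False, True ): 0.10, (True,  False, False): 0.15,
    (False, True,  True ): 0.15, (False, True,  False): 0.10,
    (False, False, True ): 0.15, (False, False, False): 0.10,
}

def prob(pred):
    """Probability of the set of worlds satisfying pred."""
    return sum(p for world, p in joint.items() if pred(world))

def cond(pred, given):
    """Conditional probability P(pred | given)."""
    return prob(lambda w: pred(w) and given(w)) / prob(given)

# Keynes's test: e is irrelevant to x given q iff P(x | e, q) == P(x | q).
p_x_given_eq = cond(lambda w: w[0], lambda w: w[1] and w[2])
p_x_given_q = cond(lambda w: w[0], lambda w: w[2])
irrelevant = abs(p_x_given_eq - p_x_given_q) < 1e-9
```

In this particular model both conditional probabilities equal 0.4, so the new evidence e is irrelevant to x given q; changing the numbers so that e and x are dependent given q would make the test come out relevant instead.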

Law

The meaning of "relevance" in U.S. law is reflected in Rule 401 of the Federal Rules of Evidence, which defines relevant evidence as evidence "having any tendency to make the existence of any fact that is of consequence to the determination of the action more probable or less probable than it would be without the evidence." In other words, if a fact has no bearing on the truth or falsity of a conclusion, it is legally irrelevant.

Library and information science

This field has considered when documents (or document representations) retrieved from databases are relevant or non-relevant. Given a conception of relevance, two measures have been applied: precision and recall.

Recall = a / (a + c) × 100%, where a = number of retrieved, relevant documents and c = number of non-retrieved, relevant documents (sometimes termed "silence"). Recall is thus an expression of how exhaustive a search for documents is.

Precision = a / (a + b) × 100%, where a = number of retrieved, relevant documents and b = number of retrieved, non-relevant documents (often termed "noise").

Precision is thus a measure of the amount of noise in document retrieval.
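The two formulas translate directly into code. A minimal sketch using the article's own symbols a, b, and c, with an invented worked example:

```python
def recall(a: int, c: int) -> float:
    """Recall in percent: retrieved relevant / all relevant documents."""
    return a / (a + c) * 100

def precision(a: int, b: int) -> float:
    """Precision in percent: retrieved relevant / all retrieved documents."""
    return a / (a + b) * 100

# Example: 30 relevant documents retrieved, 20 non-relevant retrieved ("noise"),
# 10 relevant documents missed ("silence").
print(recall(30, 10))     # 75.0
print(precision(30, 20))  # 60.0
```

Note the trade-off the two measures capture: retrieving more documents tends to raise recall (less silence) while lowering precision (more noise).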

In the literature, relevance itself has often been framed in terms of "the system's view" and "the user's view". Hjørland (2010) criticizes these two views and defends a "subject knowledge view" of relevance.

Politics

During the 1960s, relevance became a fashionable buzzword, meaning roughly 'relevance to social concerns', such as racial equality, poverty, social justice, world hunger, world economic development, and so on. The implication was that some subjects, e.g., the study of medieval poetry and the practice of corporate law, were not worthwhile because they did not address pressing social issues.[ citation needed ]

References

  1. Hjørland, B. & Sejer Christensen, F. (2002). Work tasks and socio-cognitive relevance: a specific example. Journal of the American Society for Information Science and Technology, 53(11), 960-965.
  2. Apgar, David (2006). Risk Intelligence. Cambridge, MA: Harvard Business Publishing.