Acceptability


Acceptability is the characteristic of a thing being subject to acceptance for some purpose. A thing is acceptable if it is sufficient to serve the purpose for which it is provided, even if it is far less usable for this purpose than the ideal example. A thing is unacceptable (or has the characteristic of unacceptability) if it deviates so far from the ideal that it is no longer sufficient to serve the desired purpose, or if it goes against that purpose.


Acceptability is an amorphous concept, being both highly subjective and circumstantial; a thing may be acceptable to one evaluator and unacceptable to another, or unacceptable for one purpose but acceptable for another. Furthermore, acceptability is not necessarily a logical or consistent exercise. A thing may be sufficient to serve a particular purpose but in the subjective view of the decision maker be unacceptable for that purpose. [1] :6 Philosopher Alex Michalos writes that "[t]he concept of acceptability is as ambiguous and troublesome as probability, confirmation, belief, justice, etc.", and assigns two potential meanings to the term with respect to the possible acceptability of hypotheses. [2] Acceptability is a fundamental concept in numerous fields, including economics, [3] medicine, [4] linguistics, [5] and biometrics. [6]

Acceptable risk and acceptable loss

Concepts of acceptability that have been widely studied include acceptable risk in situations affecting human health, and acceptable loss in particularly dire situations. The idea of not increasing lifetime risk by more than one in a million has become commonplace in public health discourse and policy. [7] This heuristic measure provides a numerical basis for establishing what counts as a negligible increase in risk. Comparable concepts include an acceptable level of violence, or an acceptable daily intake of hazardous substances.
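As a rough illustration of how such a heuristic is applied, the sketch below compares an incremental lifetime risk against the one-in-a-million threshold. The linear no-threshold approximation (risk = unit risk × dose) and the parameter names are assumptions of this sketch, not details from the source.

```python
DE_MINIMIS = 1e-6  # the "one in a million" lifetime-risk heuristic


def within_de_minimis(unit_risk, lifetime_dose):
    """Return True if the incremental lifetime risk is negligible
    under the one-in-a-million heuristic.

    Uses a linear no-threshold approximation (risk = unit_risk *
    dose), which is an assumption of this sketch.
    """
    incremental_risk = unit_risk * lifetime_dose
    return incremental_risk <= DE_MINIMIS
```

For example, an exposure contributing an incremental risk of 5 × 10⁻⁷ would pass this screen, while one contributing 10⁻⁵ would not.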

Environmental decision making allows some discretion for deeming individual risks potentially "acceptable" if there is less than a one in ten thousand chance of increased lifetime risk. Low-risk criteria such as these provide some protection in cases where individuals may be exposed to multiple chemicals, e.g. pollutants, food additives, or other chemicals. In practice, a true zero risk is possible only with the suppression of the risk-causing activity.

A stringent requirement of one in a million may not be technologically feasible, or may be so prohibitively expensive as to render the risk-causing activity unsustainable; the optimal degree of intervention is then a balance between risk and benefit. For example, emissions from hospital incinerators result in a certain number of deaths per year, but this risk must be balanced against the alternatives, since there are public health risks, as well as economic costs, associated with all options. The risk associated with no incineration is the potential spread of infectious diseases, or even the absence of hospitals. Further investigation identifies options such as separating noninfectious from infectious wastes, or installing air pollution controls on a medical incinerator.

Acceptable variance

Acceptable variance is the range of variance in any direction from the ideal value that remains acceptable. In project management, variance can be defined as "the difference between what is planned and what is actually achieved". [8] Degrees of variance "can be classified into negative variance, zero variance, acceptable variance, and unacceptable variance". [9] In software testing, for example, "[g]enerally 0-5% is considered as acceptable variance" from an ideal value. [9]
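The four-way classification quoted above can be sketched as follows. The sign convention (negative variance meaning the actual value fell below plan) and the default 5% band, taken from the software-testing rule of thumb cited in the text, are assumptions of this sketch; real projects set their own thresholds.

```python
def classify_variance(planned, actual, acceptable_pct=5.0):
    """Classify the variance between a planned and an actual value
    into the four bands described above: negative, zero, acceptable,
    or unacceptable variance.

    The 0-5% default band follows the software-testing rule of thumb
    quoted in the text; the threshold is a parameter because what is
    acceptable varies by project.
    """
    if planned == 0:
        raise ValueError("planned value must be nonzero")
    pct = (actual - planned) / planned * 100
    if pct < 0:
        return "negative variance"
    if pct == 0:
        return "zero variance"
    if pct <= acceptable_pct:
        return "acceptable variance"
    return "unacceptable variance"
```

For instance, an actual value of 103 against a plan of 100 is a 3% deviation and falls in the acceptable band, while 110 (10%) does not.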

Acceptance testing is a practice used in chemical and engineering fields, intended to check ahead of time whether or not a thing will be acceptable. [10]

Logic and argumentation

From a logical perspective, a thing can be said to be acceptable if it has no characteristics that make it unacceptable. Various logic formulations of this principle have been developed, for example, that "a theory Δ is acceptable if for any wff α, Δ does not prove both α and ¬α", [11] and that "the acceptability of a proposition P in a system S depends on its coherence with the propositions in S". [12] Notably, Dov Gabbay, et al., have observed that something that is logically acceptable may not be subjectively acceptable to a given individual, and vice versa:
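The first criterion quoted above can be illustrated with a minimal sketch that treats a "theory" as a set of propositional literals and checks only membership rather than derivability. A real consistency check would need a theorem prover; the ¬-prefix string encoding is an assumption of this sketch.

```python
def is_acceptable(theory):
    """A minimal syntactic version of the criterion quoted above:
    a set of propositional literals is acceptable if it does not
    contain both a literal and its negation.

    This checks only literal membership, not derivability, so it
    covers only the simplest case of the quoted definition.
    """
    literals = set(theory)
    positives = {p for p in literals if not p.startswith("¬")}
    return not any("¬" + p in literals for p in positives)
```

On this sketch, {p, q} is acceptable, while {p, ¬p} is not.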

Humans may not tolerate certain theories (finding them unacceptable) even though these may be logically consistent. The human notion is more that of acceptability rather than consistency. ... To a human, resolving inconsistencies or regaining acceptability is not necessarily done by immediately 'restoring' consistency and/or acceptability but by supplying rules telling one how to act when the inconsistency or unacceptability arises. [1] :6

"The main approaches which have been developed for reasoning within an argumentation system rely on the idea of differentiating arguments with a notion of acceptability". [13] Two models of acceptability have been developed for this purpose, one in which "[a]n acceptability level is assigned to a given argument depending on the existence of direct defeaters, or defeaters", and another in which "[a]cceptability with respect to a rational agent relies upon a notion of defense", with the complete set of arguments that a rational agent may accept being required to defend itself against any defeater. [13] Hungarian mathematician Imre Lakatos developed a concept of acceptability "taken as a measure of the approximation to the truth". [14] This concept was criticized in its applicability to philosophy as requiring that better theories first be eliminated. [14]
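The "defense" notion of acceptability can be sketched as a Dung-style admissibility check: a set of arguments is acceptable to a rational agent when it is conflict-free and counter-attacks every defeater of its members. Encoding the attack relation as (attacker, attacked) pairs, and the example arguments used below, are illustrative assumptions of this sketch.

```python
def attacks(attack_rel, S, a):
    """True if some argument in the set S attacks argument a."""
    return any((b, a) in attack_rel for b in S)


def is_admissible(args, attack_rel, S):
    """Check Dung-style admissibility, one formalization of the
    'defense' notion of acceptability described above: S must be
    conflict-free and must defend each member against every attacker.
    """
    # Conflict-free: no member of S attacks another member of S.
    if any((a, b) in attack_rel for a in S for b in S):
        return False
    # Defense: every attacker of a member of S is itself attacked by S.
    for a in S:
        for b in args:
            if (b, a) in attack_rel and not attacks(attack_rel, S, b):
                return False
    return True
```

With arguments {a, b, c} where b attacks a and c attacks b, the set {a, c} is admissible (c defends a against b), while {a} alone is not, since nothing in it answers b's attack.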

Despite such efforts to formulate parameters, acceptability "is a subjective construct that varies between users and in time". [15] Philosopher James B. Freeman defines acceptability as a "ternary relation between a statement, a person, and a point in time", distinguishing this view of acceptability from one according to which it is a property of statements only. [16] Philosopher David M. Godden, discussing when propositions may be acceptable to interlocutors in argument, characterizes the consensus among philosophers as follows: "[c]ommon knowledge generally provides good grounds for the acceptability of a claim, whereas popular opinion does not." [17]

Negotiation

Acceptability is a key premise of negotiation, wherein opposing sides each begin from a point of seeking their ideal solution, and compromise until they reach a solution that both sides find acceptable:

When a proposal or counter-proposal is received by an agent, it has to decide whether it is acceptable. If it is, the agent can agree to it; if not, an alternative that is acceptable to the receiving agent needs to be generated. Acceptability is determined by searching the hierarchy. If the proposal is a specification of at least one acceptable goal, the proposal is acceptable. If it is the specification of at least one unacceptable goal, the proposal is clearly unacceptable. [18]

Where an unacceptable proposal has been made, "a counterproposal is generated if there are any acceptable ones that have not already been explored". [18] Since the acceptability of a proposal to a participant in a negotiation is known only to that participant, the participant may act as though a proposal that is actually acceptable to them is not, in order to obtain a more favorable one.
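The accept/counter decision described above can be sketched as follows. The goal hierarchy of the cited system is flattened here into a list of acceptable goals, and "specification of a goal" is modelled as simple membership; both simplifications are assumptions of this sketch, not details of the cited system.

```python
def respond(proposal, acceptable_goals, explored):
    """Sketch of the accept/counter decision described above.

    Accept a proposal that specifies an acceptable goal (modelled
    here as list membership); otherwise counter with an acceptable
    goal that has not already been explored, or reject if none
    remain. The 'explored' set records past counterproposals.
    """
    if proposal in acceptable_goals:
        return ("accept", proposal)
    for goal in acceptable_goals:
        if goal not in explored:
            explored.add(goal)
            return ("counter", goal)
    return ("reject", None)
```

Repeated unacceptable proposals thus walk through the agent's acceptable goals one at a time until none remain unexplored, at which point the agent rejects.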


References

  1. Dov M. Gabbay, Odinaldo T. Rodrigues, Alessandra Russo, Revision, Acceptability and Context: Theoretical and Algorithmic Aspects (Springer, 2010), ISBN 9783642141584.
  2. Alex C. Michalos, "Acceptability and Logical Improbability", The Popper-Carnap Controversy (2012), p. 3.
  3. Abraham Leo Gitlow, Economics (1962), p. 23: "General acceptability is fundamental. A material cannot serve as a medium of exchange if it does not possess this characteristic".
  4. Sumeet Dua, U. Rajendra Acharya, Prerna Dua, Machine Learning in Healthcare Informatics (2013), p. 133: "Acceptability. Models need to be accepted by their potential users. While partially related to transparency, acceptability requires that the models that do not contradict the knowledge of existing experts are otherwise 'reasonably' congruent with what is currently being done, and correspond to existing workflows. Acceptability is a key issue in healthcare, more than in any other industry".
  5. Abraham, W. & Braunmüller, K., "Towards a Theory of Style and Metaphor", Poetics 7 (1973), pp. 103-147: "Acceptability is a fundamental factor in the interpretation of dynamic competence, which is responsible for the strategies of stylistic classification".
  6. International Conference on Audio- and Video-based Biometric Person Authentication (2003), p. 802: "Acceptability: acceptability is a fundamental factor for qualifying any biometric approach".
  7. Hunter, Paul R.; Fewtrell, Lorna (2001). "Acceptable Risk". World Health Organization.
  8. Guy L. De Furia, Project Management Recipes for Success (2008), p. 172.
  9. Srinivasan Desikan, Software Testing: Principles and Practice (2006), p. 431.
  10. Black, Rex (August 2009). Managing the Testing Process: Practical Tools and Techniques for Managing Hardware and Software Testing. Hoboken, NJ: Wiley. ISBN 978-0-470-40415-7.
  11. Dov M. Gabbay, Odinaldo T. Rodrigues, Alessandra Russo, Revision, Acceptability and Context: Theoretical and Algorithmic Aspects (2010), p. 255.
  12. Frederick F. Schmitt, "Epistemology and Cognitive Science", in Ilkka Niiniluoto, Matti Sintonen, Jan Wolenski, Handbook of Epistemology (2004), p. 894.
  13. Leila Amgoud and Claudette Cayrol, On the Acceptability of Arguments in Preference-Based Argumentation, arXiv:1301.7358 (2018).
  14. W. Stegmüller, Collected Papers on Epistemology, Philosophy of Science and History of Philosophy, Volume 2 (2012), p. 104.
  15. Erwin R. van Veldhoven, Martijn H. Vastenburg, David V. Keyson, "Designing an Interactive Messaging and Reminder Display for Elderly", conference paper, European Conference on Ambient Intelligence (Springer, 2008), p. 127.
  16. Freeman 2005, p. 31.
  17. Godden, David M. (2008). "On Common Knowledge and Ad Populum: Acceptance as Grounds for Acceptability". Philosophy & Rhetoric. 41 (2): 103. doi:10.5325/philrhet.41.2.0101. ISSN 0031-8213. JSTOR 25655305.
  18. L. G. Bouma, H. Velthuijsen, Feature Interactions in Telecommunications Systems (1994), p. 227.
