Understanding is a psychological process, related to an abstract or physical object such as a person, situation, or message, whereby one is able to think about that object and use concepts to deal adequately with it. Understanding is a relation between the knower and an object of understanding. It implies abilities and dispositions with respect to an object of knowledge that are sufficient to support intelligent behaviour.
A person is a being that has certain capacities or attributes such as reason, morality, consciousness or self-consciousness, and being a part of a culturally established form of social relations such as kinship, ownership of property, or legal responsibility. The defining features of personhood and consequently what makes a person count as a person differ widely among cultures and contexts.
A message is a discrete unit of communication intended by the source for consumption by some recipient or group of recipients. A message may be delivered by various means, including courier, telegraphy, carrier pigeon and electronic bus. A message can be the content of a broadcast. An interactive exchange of messages forms a conversation.
Concepts are mental representations, abstract objects or abilities that make up the fundamental building blocks of thoughts and beliefs. They play an important role in all aspects of cognition.
Understanding is often, though not always, related to learning concepts, and sometimes also the theory or theories associated with those concepts. However, a person may have a good ability to predict the behaviour of an object, animal or system—and therefore may, in some sense, understand it—without necessarily being familiar with the concepts or theories associated with that object, animal or system in their culture. They may have developed their own distinct concepts and theories, which may be equivalent to, better than, or worse than the recognised standard concepts and theories of their culture. Thus, understanding is correlated with the ability to make inferences.
Inferences are steps in reasoning, moving from premises to logical consequences; etymologically, the word infer means to "carry forward". Inference is traditionally divided into deduction and induction, a distinction that in Europe dates at least to Aristotle. Deduction is inference deriving logical conclusions from premises known or assumed to be true, with the laws of valid inference being studied in logic. Induction is inference from particular premises to a universal conclusion. A third type of inference is sometimes distinguished, notably by Charles Sanders Peirce: abduction, or inference to the best explanation.
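The distinction between deduction and induction can be illustrated with a toy sketch; the premises, observations, and rules here are invented for illustration:

```python
# Deduction: from premises assumed true to a conclusion that is guaranteed.
premises = {"All humans are mortal", "Socrates is a human"}
conclusion = None
if "All humans are mortal" in premises and "Socrates is a human" in premises:
    conclusion = "Socrates is mortal"  # follows necessarily from the premises

# Induction: from particular observations to a universal claim.
observed_swans = ["white", "white", "white"]
inductive_claim = None
if all(colour == "white" for colour in observed_swans):
    inductive_claim = "All swans are white"  # plausible, but not guaranteed

print(conclusion)       # Socrates is mortal
print(inductive_claim)  # All swans are white
```

The deductive conclusion cannot be false if the premises are true, whereas the inductive claim can be overturned by a single black swan; abduction would instead pick the best available explanation for the observations.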
Someone whose understanding of something is more sophisticated, more predictively accurate, or better able to produce explanations that others commonly judge to be good is said to understand that thing "deeply". Conversely, someone who has a more limited understanding of a thing is said to have a "shallow" understanding. However, the depth of understanding required to usefully participate in an occupation or activity may vary greatly.
For example, consider multiplication of integers. Starting from the shallowest level of understanding, there are (at least) the following possibilities:
Multiplication is one of the four elementary mathematical operations of arithmetic, the others being addition, subtraction and division.
An integer is a number that can be written without a fractional component. For example, 21, 4, 0, and −2048 are integers, while 9.75, 5 1/2, and √2 are not.
For the purpose of operating a cash register at McDonald's, a person does not need a very deep understanding of the multiplication involved in calculating the total price of two Big Macs. However, for the purpose of contributing to number theory research, a person would need to have a relatively deep understanding of multiplication — along with other relevant arithmetical concepts such as division and prime numbers.
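The contrast between shallow and deep use of multiplication can be sketched in code; the price below is a made-up figure, and the primality test stands in for the deeper number-theoretic use of multiplication:

```python
# Shallow use: apply multiplication as a black-box rule to total a purchase.
BIG_MAC_PRICE = 5.99  # hypothetical price, for illustration only
total = 2 * BIG_MAC_PRICE

# Deeper use: reason about structural properties of integer multiplication,
# e.g. divisibility and primality, as number theory does.
def is_prime(n: int) -> bool:
    """Trial division: n is prime iff no integer 2 <= d with d*d <= n divides it."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True
```

Both rest on the same operation, but the second requires understanding what multiplication implies about the structure of the integers, not merely how to carry it out.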
A cash register, also referred to as a till in the United Kingdom and other Commonwealth countries, is a mechanical or electronic device for registering and calculating transactions at a point of sale. It is usually attached to a drawer for storing cash and other valuables. The cash register is also usually attached to a printer that can print out receipts for record-keeping purposes.
McDonald's is an American fast food company, founded in 1940 as a restaurant operated by Richard and Maurice McDonald, in San Bernardino, California, United States. They rechristened their business as a hamburger stand, and later turned the company into a franchise, with the Golden Arches logo being introduced in 1953 at a location in Phoenix, Arizona. In 1955, Ray Kroc, a businessman, joined the company as a franchise agent and proceeded to purchase the chain from the McDonald brothers. McDonald's had its original headquarters in Oak Brook, Illinois, but moved its global headquarters to Chicago in early 2018.
The Big Mac is a hamburger sold by international fast food restaurant chain McDonald's. It was introduced in the Greater Pittsburgh area, United States, in 1967 and nationwide in 1968. It is one of the company's flagship products.
It is possible for a person, or a piece of "intelligent" software, that in reality has only a shallow understanding of a topic to appear to have a deeper understanding when the right questions are asked. The most obvious way this can happen is by memorizing correct answers to known questions, but there are other, more subtle ways in which a person or computer can (intentionally or otherwise) deceive somebody about their level of understanding. This is a particular risk with artificial intelligence: the ability of AI software to very quickly try out millions of possibilities (attempted solutions, theories, etc.) could create a misleading impression of the real depth of its understanding. Such software could in fact come up with impressive answers to questions that are difficult for unaided humans to answer, without really understanding the concepts at all, simply by applying rules very quickly. (However, see the Chinese room argument for a controversial philosophical extension of this line of reasoning.)
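The memorization route can be made concrete with a minimal sketch: a lookup table of canned answers mimics understanding on known questions and fails on everything else. The questions and answers here are invented examples:

```python
# A lookup table of memorized answers can mimic understanding for known
# questions while failing on anything outside the table.
memorized = {
    "What is 7 x 8?": "56",
    "Why is the sky blue?": "Because sunlight is scattered by the atmosphere",
}

def answer(question: str) -> str:
    # No reasoning happens here: either the exact question was memorized,
    # or the system has nothing to say.
    return memorized.get(question, "I don't know")

print(answer("What is 7 x 8?"))  # 56
print(answer("What is 7 x 9?"))  # I don't know
```

A questioner who only asks questions from the table would see flawless performance; a single novel question exposes the shallowness.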
Memorization is the process of committing something to memory. It is the mental process undertaken in order to store items in memory for later recall, such as experiences, names, appointments, addresses, telephone numbers, lists, stories, poems, pictures, maps, diagrams, facts, music or other visual, auditory, or tactile information.
In computer science, artificial intelligence (AI), sometimes called machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and other animals. Computer science defines AI research as the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals. More specifically, Kaplan and Haenlein define AI as “a system’s ability to correctly interpret external data, to learn from such data, and to use those learnings to achieve specific goals and tasks through flexible adaptation”. Colloquially, the term "artificial intelligence" is used to describe machines that mimic "cognitive" functions that humans associate with other human minds, such as "learning" and "problem solving".
The Chinese room argument holds that a program cannot give a computer a "mind", "understanding" or "consciousness", regardless of how intelligently or human-like the program may make the computer behave. The argument was first presented by philosopher John Searle in his paper, "Minds, Brains, and Programs", published in Behavioral and Brain Sciences in 1980. It has been widely discussed in the years since. The central point of the argument is a thought experiment known as the Chinese room.
Examinations are designed to assess students' understanding (and sometimes also other things such as knowledge and writing abilities) without falling prey to these risks. They do this partly by asking multiple different questions about a topic to reduce the risk of measurement error, and partly by forbidding access to reference works and the outside world to reduce the risk of someone else's understanding being passed off as one's own. Because of the faster and more accurate computation and memorization abilities of computers, such tests would arguably often have to be modified if they were to be used to accurately assess the understanding of an artificial intelligence.
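The point about multiple questions reducing measurement error can be sketched with a toy simulation; the score model (each question answered correctly with a fixed probability) is invented for illustration:

```python
import random

random.seed(0)
TRUE_ABILITY = 0.7  # probability of answering any one question correctly

def exam_score(n_questions: int) -> float:
    # Fraction of questions answered correctly on one simulated exam.
    correct = sum(random.random() < TRUE_ABILITY for _ in range(n_questions))
    return correct / n_questions

def spread(scores: list) -> float:
    # Standard deviation of the observed scores.
    mean = sum(scores) / len(scores)
    return (sum((s - mean) ** 2 for s in scores) / len(scores)) ** 0.5

# Longer exams estimate the same underlying ability with less scatter.
short_exams = [exam_score(5) for _ in range(1000)]
long_exams = [exam_score(50) for _ in range(1000)]

print(spread(short_exams) > spread(long_exams))  # True
```

Under this model both exam lengths are unbiased estimates of the same ability, but the 50-question exam's scores cluster much more tightly around it, which is the sense in which more questions reduce measurement error.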
A test or examination is an assessment intended to measure a test-taker's knowledge, skill, aptitude, physical fitness, or classification in many other topics. A test may be administered verbally, on paper, on a computer, or in a predetermined area that requires a test taker to demonstrate or perform a set of skills. Tests vary in style, rigor and requirements. For example, in a closed book test, a test taker is usually required to rely upon memory to respond to specific items, whereas in an open book test, a test taker may use one or more supplementary tools such as a reference book or calculator when responding. A test may be administered formally or informally. An example of an informal test would be a reading test administered by a parent to a child. A formal test might be a final examination administered by a teacher in a classroom or an I.Q. test administered by a psychologist in a clinic. Formal testing often results in a grade or a test score. A test score may be interpreted with regard to a norm or criterion, or occasionally both. The norm may be established independently, or by statistical analysis of a large number of participants.
Conversely, it is even easier for a person or artificial intelligence to fake a shallower level of understanding than they actually have; they simply need to give the same kinds of answers that someone with a more limited understanding, or no understanding, would give, such as "I don't know" or obviously wrong answers. This is relevant for judges in Turing tests: simply asking respondents to mentally calculate the answer to a very difficult arithmetical question is unlikely to be effective, because the computer is likely to dumb itself down and pretend not to know the answer.
Gregory Chaitin, a noted computer scientist, propounds the view that comprehension is a kind of data compression. In his essay "The Limits of Reason", he argues that understanding something means being able to figure out a simple set of rules that explains it. For example, we understand why day and night exist because we have a simple model, the rotation of the earth, that explains a tremendous amount of data: changes in brightness, temperature, and atmospheric composition of the earth. We have compressed a large amount of information into a simple model that predicts it. Similarly, we understand the number 0.33333... by thinking of it as one-third. The first way of representing the number requires five concepts ("0", "decimal point", "3", "infinity", "infinity of 3"); the second way can produce all the data of the first representation, but uses only three concepts ("1", "division", "3"). Chaitin argues that comprehension is this ability to compress data.
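The one-third example can be made concrete: the short rule "1 divided by 3" regenerates as many decimal digits as we like, so the compact representation carries all the information of the unbounded digit string. A minimal sketch using exact rational arithmetic:

```python
from fractions import Fraction

def digits_of_one_third(n: int) -> str:
    # The short rule "1/3" regenerates the decimal expansion digit by digit:
    # repeatedly multiply the remainder by 10 and take the integer part.
    x = Fraction(1, 3)
    digits = []
    for _ in range(n):
        x *= 10
        d = x.numerator // x.denominator
        digits.append(str(d))
        x -= d
    return "0." + "".join(digits)

print(digits_of_one_third(5))  # 0.33333
```

The program is a fixed, finite rule, yet it can produce arbitrarily many digits on demand; in Chaitin's terms, it compresses the infinite expansion.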
Cognition is the process by which sensory inputs are transformed, elaborated, stored, recovered, and used. Affect refers to the experience of feelings or emotions. Together, cognition and affect constitute understanding.
In Catholicism and Anglicanism, understanding is one of the Seven gifts of the Holy Spirit.