[Figure: A visual graph representing associative operations.]
In mathematics, the associative property [1] is a property of some binary operations that means that rearranging the parentheses in an expression will not change the result. In propositional logic, associativity is a valid rule of replacement for expressions in logical proofs.
Within an expression containing two or more occurrences in a row of the same associative operator, the order in which the operations are performed does not matter as long as the sequence of the operands is not changed. That is (after rewriting the expression with parentheses and in infix notation if necessary), rearranging the parentheses in such an expression will not change its value. Consider the following equations:

(2 + 3) + 4 = 2 + (3 + 4) = 9

2 × (3 × 4) = (2 × 3) × 4 = 24
Even though the parentheses were rearranged on each line, the values of the expressions were not altered. Since this holds true when performing addition and multiplication on any real numbers, it can be said that "addition and multiplication of real numbers are associative operations".
Associativity is not the same as commutativity, which addresses whether the order of two operands affects the result. For example, the order does not matter in the multiplication of real numbers, that is, a × b = b × a, so we say that the multiplication of real numbers is a commutative operation. However, operations such as function composition and matrix multiplication are associative, but not (generally) commutative.
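For illustration, a small Python sketch (assuming NumPy is available) makes the distinction concrete for matrix multiplication: the product is the same however it is grouped, but swapping the factors changes the result.

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
C = np.array([[2, 0], [0, 2]])

# Associative: (A B) C equals A (B C).
print(np.array_equal((A @ B) @ C, A @ (B @ C)))  # True

# Not commutative: A B and B A generally differ.
print(np.array_equal(A @ B, B @ A))  # False
```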
Associative operations are abundant in mathematics; in fact, many algebraic structures (such as semigroups and categories) explicitly require their binary operations to be associative.
However, many important and interesting operations are non-associative; some examples include subtraction, exponentiation, and the vector cross product. In contrast to the theoretical properties of real numbers, the addition of floating point numbers in computer science is not associative, and the choice of how to associate an expression can have a significant effect on rounding error.
Formally, a binary operation ∗ on a set S is called associative if it satisfies the associative law:

(x ∗ y) ∗ z = x ∗ (y ∗ z) for all x, y, z in S.
Here, ∗ is used as a placeholder for the symbol of the operation, which may be any symbol, or even the absence of a symbol (juxtaposition), as for multiplication.
The associative law can also be expressed in functional notation thus:

f(f(x, y), z) = f(x, f(y, z)).
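The functional form lends itself to a direct, brute-force check over a finite sample of values. The short Python sketch below (the helper name is_associative is illustrative, not a standard library function) can refute associativity on the sample but, of course, cannot prove it for an infinite set.

```python
from itertools import product

def is_associative(f, elements):
    """Check f(f(x, y), z) == f(x, f(y, z)) for all triples drawn from a finite sample."""
    return all(f(f(x, y), z) == f(x, f(y, z))
               for x, y, z in product(elements, repeat=3))

sample = range(-3, 4)
print(is_associative(lambda x, y: x + y, sample))  # True:  addition is associative
print(is_associative(lambda x, y: x - y, sample))  # False: subtraction is not
```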
If a binary operation is associative, repeated application of the operation produces the same result regardless of how valid pairs of parentheses are inserted in the expression. [2] This is called the generalized associative law.
The number of possible bracketings is just the Catalan number, C_n, for n operations on n + 1 values. For instance, a product of 3 operations on 4 elements may be written (ignoring permutations of the arguments) in C_3 = 5 possible ways:

((ab)c)d, (a(bc))d, (ab)(cd), a((bc)d), a(b(cd)).
If the product operation is associative, the generalized associative law says that all these expressions will yield the same result. So unless the expression with omitted parentheses already has a different meaning (see below), the parentheses can be considered unnecessary and "the" product can be written unambiguously as abcd.
As the number of elements increases, the number of possible ways to insert parentheses grows quickly, but they remain unnecessary for disambiguation.
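A short Python sketch (the helper bracketings is illustrative) makes the generalized law concrete by evaluating every full parenthesization of four operands: an associative operation yields a single value, while a non-associative one does not.

```python
def bracketings(xs, op):
    """Evaluate xs under every possible full parenthesization of the binary operation op."""
    if len(xs) == 1:
        return [xs[0]]
    results = []
    for i in range(1, len(xs)):                 # choose the outermost split point
        for left in bracketings(xs[:i], op):
            for right in bracketings(xs[i:], op):
                results.append(op(left, right))
    return results

# Addition: all C_3 = 5 parenthesizations of four operands agree.
sums = bracketings([1, 2, 3, 4], lambda a, b: a + b)
print(sums, len(set(sums)))                     # [10, 10, 10, 10, 10] 1

# Subtraction: the parenthesizations disagree.
print(set(bracketings([1, 2, 3, 4], lambda a, b: a - b)))
```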
An example where this does not work is the logical biconditional ↔. It is associative; thus, A ↔ (B ↔ C) is equivalent to (A ↔ B) ↔ C, but A ↔ B ↔ C most commonly means (A ↔ B) and (B ↔ C), which is not equivalent.
Some examples of associative operations include the following.
"hello"
, " "
, "world"
can be computed by concatenating the first two strings (giving "hello "
) and appending the third string ("world"
), or by joining the second and third string (giving " world"
) and concatenating the first string ("hello"
) with the result. The two methods produce the same result; string concatenation is associative (but not commutative).× | A | B | C |
---|---|---|---|
A | A | A | A |
B | A | B | C |
C | A | A | A |
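Both of these examples can be checked mechanically. A small Python sketch, encoding the table above as a nested dictionary:

```python
from itertools import product

# String concatenation: associative but not commutative.
a, b, c = "hello", " ", "world"
print((a + b) + c == a + (b + c))  # True
print(a + b == b + a)              # False

# The table above, as a nested dictionary: table[x][y] is x × y.
table = {
    "A": {"A": "A", "B": "A", "C": "A"},
    "B": {"A": "A", "B": "B", "C": "C"},
    "C": {"A": "A", "B": "A", "C": "A"},
}
op = lambda x, y: table[x][y]
print(all(op(op(x, y), z) == op(x, op(y, z))
          for x, y, z in product("ABC", repeat=3)))   # True:  associative
print(all(op(x, y) == op(y, x)
          for x, y in product("ABC", repeat=2)))      # False: not commutative
```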
In standard truth-functional propositional logic, association, [4] [5] or associativity, [6] are two valid rules of replacement. The rules allow one to move parentheses in logical expressions in logical proofs. The rules (using logical connectives notation) are:

(P ∨ (Q ∨ R)) ⇔ ((P ∨ Q) ∨ R)

and

(P ∧ (Q ∧ R)) ⇔ ((P ∧ Q) ∧ R),

where "⇔" is a metalogical symbol representing "can be replaced in a proof with".
Associativity is a property of some logical connectives of truth-functional propositional logic. The following logical equivalences demonstrate that associativity is a property of particular connectives. The following (and their converses, since ↔ is commutative) are truth-functional tautologies.[citation needed]

Associativity of disjunction: ((P ∨ Q) ∨ R) ↔ (P ∨ (Q ∨ R))

Associativity of conjunction: ((P ∧ Q) ∧ R) ↔ (P ∧ (Q ∧ R))

Associativity of equivalence: ((P ↔ Q) ↔ R) ↔ (P ↔ (Q ↔ R))
Joint denial is an example of a truth functional connective that is not associative.
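These claims can be confirmed by exhaustive truth tables. A minimal Python sketch (the helper associative_connective is illustrative):

```python
from itertools import product

def associative_connective(op):
    """Truth-table check that op(op(p, q), r) == op(p, op(q, r)) for every valuation."""
    return all(op(op(p, q), r) == op(p, op(q, r))
               for p, q, r in product([False, True], repeat=3))

print(associative_connective(lambda p, q: p or q))         # True:  disjunction
print(associative_connective(lambda p, q: p and q))        # True:  conjunction
print(associative_connective(lambda p, q: p == q))         # True:  biconditional
print(associative_connective(lambda p, q: not (p or q)))   # False: joint denial (NOR)
```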
A binary operation ∗ on a set S that does not satisfy the associative law is called non-associative. Symbolically,

(x ∗ y) ∗ z ≠ x ∗ (y ∗ z) for some x, y, z in S.
For such an operation the order of evaluation does matter. For example, with subtraction, division and exponentiation:

(5 − 3) − 2 ≠ 5 − (3 − 2)

(4 / 2) / 2 ≠ 4 / (2 / 2)

2^(1^2) ≠ (2^1)^2
Also, although addition is associative for finite sums, it is not associative inside infinite sums (series). For example,

(1 + −1) + (1 + −1) + (1 + −1) + ⋯ = 0,

whereas

1 + (−1 + 1) + (−1 + 1) + (−1 + 1) + ⋯ = 1.
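The finite counterexamples above can be checked directly, for instance in Python:

```python
# Grouping changes the result for these non-associative operations.
print((5 - 3) - 2, 5 - (3 - 2))      # 0 4      (subtraction)
print((4 / 2) / 2, 4 / (2 / 2))      # 1.0 4.0  (division)
print((2 ** 1) ** 2, 2 ** (1 ** 2))  # 4 2      (exponentiation)
```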
Some non-associative operations are fundamental in mathematics. They appear often as the multiplication in structures called non-associative algebras, which also have an addition and a scalar multiplication. Examples are the octonions and Lie algebras. In Lie algebras, the multiplication satisfies the Jacobi identity instead of the associative law; this allows abstracting the algebraic nature of infinitesimal transformations.

Other examples are quasigroups, quasifields, non-associative rings, and commutative non-associative magmas.
In mathematics, addition and multiplication of real numbers are associative. By contrast, in computer science, addition and multiplication of floating point numbers are not associative, as different rounding errors may be introduced when dissimilar-sized values are joined in a different order. [7]
To illustrate this, consider a floating point representation with a 4-bit significand, in which 17 is not representable and rounds to 16 (round to nearest, ties to even):

(1 + 1) + 16 = 2 + 16 = 18

1 + (1 + 16) = 1 + 16 = 16
Even though most computers compute with 24 or 53 bits of significand, [8] this is still an important source of rounding error, and approaches such as the Kahan summation algorithm are ways to minimise the errors. It can be especially problematic in parallel computing. [9] [10]
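As an illustration in Python (the exact printed values assume IEEE 754 double precision), plain left-to-right summation and a simple compensated (Kahan) summation of the same data can give different results:

```python
# Grouping matters for floating-point addition.
print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))  # False

def kahan_sum(xs):
    """Kahan (compensated) summation: carry the running rounding error forward."""
    total = 0.0
    compensation = 0.0
    for x in xs:
        y = x - compensation
        t = total + y
        compensation = (t - total) - y
        total = t
    return total

data = [0.1] * 10
print(sum(data))        # 0.9999999999999999 (plain left-to-right summation)
print(kahan_sum(data))  # 1.0
```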
In general, parentheses must be used to indicate the order of evaluation if a non-associative operation appears more than once in an expression (unless the notation specifies the order in another way). However, mathematicians agree on a particular order of evaluation for several common non-associative operations. This is simply a notational convention to avoid parentheses.
A left-associative operation is a non-associative operation that is conventionally evaluated from left to right, i.e.,

x ∗ y ∗ z = (x ∗ y) ∗ z,

while a right-associative operation is conventionally evaluated from right to left:

x ∗ y ∗ z = x ∗ (y ∗ z).
Both left-associative and right-associative operations occur. Left-associative operations include the following:

Subtraction and division of real numbers: x − y − z = (x − y) − z and x / y / z = (x / y) / z (see the sketch after this list).

Function application: (f x y) = ((f x) y).
This notation can be motivated by the currying isomorphism, which enables partial application.
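In a programming setting the left-to-right convention corresponds to a left fold, as in this short Python sketch:

```python
from functools import reduce

# Left-associative evaluation as a left fold: 10 - 3 - 2 is read as (10 - 3) - 2.
print(reduce(lambda x, y: x - y, [10, 3, 2]))  # 5
print(10 - 3 - 2)                              # 5: Python's "-" is left-associative
print(10 - (3 - 2))                            # 9: the right-associative reading
```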
Right-associative operations include the following:

Exponentiation of real numbers in superscript notation, so that x^y^z means x^(y^z).

Function definition via the arrow notation, e.g. Z → Z → Z = Z → (Z → Z).

Material implication in logic, so that P → Q → R means P → (Q → R).
Exponentiation is commonly used with brackets or right-associatively because a repeated left-associative exponentiation operation is of little use. Repeated powers would mostly be rewritten with multiplication:

(a^b)^c = a^(bc).
Formatted correctly, the superscript inherently behaves as a set of parentheses; e.g. in the expression 2^(x+3) the addition is performed before the exponentiation despite there being no explicit parentheses wrapped around it, since the grouping is conveyed by the superscript itself. Thus given an expression such as x^(y^z), the full exponent y^z of the base x is evaluated first. However, in some contexts, especially in handwriting, the difference between x^(y^z), (x^y)^z and x^(yz) can be hard to see. In such a case, right-associativity is usually implied.
Using right-associative notation for these operations can be motivated by the Curry–Howard correspondence and by the currying isomorphism.
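For instance, Python's exponentiation operator ** follows the right-associative convention:

```python
print(2 ** 3 ** 2)    # 512, i.e. 2 ** (3 ** 2)
print((2 ** 3) ** 2)  # 64, the left-associative reading, equal to 2 ** (3 * 2)
```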
Non-associative operations for which no conventional evaluation order is defined include the following.

Taking the cross product of three vectors: a × (b × c) ≠ (a × b) × c in general.

Taking the relative complement of sets: (A ∖ B) ∖ C is not the same as A ∖ (B ∖ C). (Compare material nonimplication in logic.)
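For example, with set difference written as - in Python:

```python
# Relative complement (set difference) is not associative.
A, B, C = {1, 2, 3}, {2, 3}, {3}
print((A - B) - C)  # {1}
print(A - (B - C))  # {1, 3}
```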
William Rowan Hamilton seems to have coined the term "associative property" [17] around 1844, a time when he was contemplating the non-associative algebra of the octonions he had learned about from John T. Graves. [18]