Linear function

In mathematics, the term linear function refers to two distinct but related notions: [1]

  - In calculus and related areas, a linear function is a polynomial function of degree one or less, whose graph is a straight line.
  - In linear algebra and related areas, a linear function is a linear map.

As a polynomial function

Graphs of two linear functions.

In calculus, analytic geometry, and related areas, a linear function is a polynomial of degree one or less, including the zero polynomial (which is not assigned a degree at all).

When the function is of only one variable, it is of the form

f(x) = ax + b,

where a and b are constants, often real numbers. The graph of such a function of one variable is a nonvertical line. The constant a is frequently referred to as the slope of the line, and b as the y-intercept.
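As a quick numerical illustration (a hypothetical sketch, not part of the article), the slope a and intercept b can be recovered from any two points on the graph; the sample points below are arbitrary:

```python
# Recover a and b of f(x) = a*x + b from two points on its graph.
def line_through(p, q):
    (x1, y1), (x2, y2) = p, q
    a = (y2 - y1) / (x2 - x1)  # slope: rise over run (requires x1 != x2)
    b = y1 - a * x1            # intercept: solve y1 = a*x1 + b for b
    return a, b

# Example points (1, 3) and (4, 9) lie on f(x) = 2x + 1.
a, b = line_through((1, 3), (4, 9))
```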

If a > 0, then the slope is positive and the graph slopes upward.

If a < 0, then the slope is negative and the graph slopes downward.

For a function of any finite number of variables, the general formula is

f(x_1, ..., x_k) = b + a_1 x_1 + ... + a_k x_k,

and the graph is a hyperplane of dimension k.
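Such a function of several variables can be sketched as a dot product of coefficients with the inputs, plus the constant term (the coefficients below are arbitrary example values):

```python
# f(x_1, ..., x_k) = b + a_1*x_1 + ... + a_k*x_k as a dot product plus b.
def linear_poly(coeffs, b):
    def f(*xs):
        assert len(xs) == len(coeffs), "expected one argument per coefficient"
        return b + sum(a * x for a, x in zip(coeffs, xs))
    return f

# Example: f(x, y) = 5 + 2x - 3y
f = linear_poly([2, -3], 5)
```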

A constant function is also considered linear in this context, as it is a polynomial of degree zero or is the zero polynomial. Its graph, when there is only one variable, is a horizontal line.

In this context, a function that is also a linear map (the other meaning) may be referred to as a homogeneous linear function or a linear form. In the context of linear algebra, the polynomial functions of degree 0 or 1 are the scalar-valued affine maps.

As a linear map

The integral of a function is a linear map from the vector space of integrable functions to the real numbers.
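The caption's claim, that integration is linear, can be illustrated numerically. The sketch below approximates integrals on [0, 1] with left Riemann sums; the functions and the scalar are arbitrary example choices:

```python
# Approximate the integral of f over [0, 1] by a left Riemann sum.
def riemann(f, n=100_000):
    h = 1.0 / n
    return sum(f(i * h) for i in range(n)) * h

f = lambda x: x * x
g = lambda x: x ** 3
c = 2.5

lhs = riemann(lambda x: c * f(x) + g(x))  # integral of c*f + g
rhs = c * riemann(f) + riemann(g)         # c * integral(f) + integral(g)
# lhs and rhs agree up to floating-point rounding error
```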

In linear algebra, a linear function is a map f between two vector spaces such that

f(x + y) = f(x) + f(y)
f(ax) = a f(x).

Here a denotes a constant belonging to some field K of scalars (for example, the real numbers) and x and y are elements of a vector space, which might be K itself.

In other terms, a linear function preserves vector addition and scalar multiplication.

Some authors use "linear function" only for linear maps that take values in the scalar field; [6] these are more commonly called linear forms.

The "linear functions" of calculus qualify as "linear maps" when (and only when) f(0, ..., 0) = 0, or, equivalently, when the constant term b equals zero in the degree-one polynomial above. Geometrically, the graph of such a function passes through the origin.
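This criterion is easy to check numerically at sample points; the two functions below are illustrative choices, one with b = 0 and one without:

```python
# Check the two linear-map properties at given sample points.
def is_linear_at(f, x, y, a):
    additive = f(x + y) == f(x) + f(y)    # preserves addition?
    homogeneous = f(a * x) == a * f(x)    # preserves scalar multiplication?
    return additive and homogeneous

through_origin = lambda x: 2 * x       # b = 0: a genuine linear map
shifted = lambda x: 2 * x + 3          # b = 3: affine, not a linear map

is_linear_at(through_origin, 1, 2, 5)  # True
is_linear_at(shifted, 1, 2, 5)         # False: f(1+2) = 9 but f(1)+f(2) = 12
```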

Notes

  1. "The term linear function means a linear form in some textbooks and an affine function in others." Vaserstein 2006, pp. 50–51.
  2. Stewart 2012, p. 23.
  3. A. Kurosh (1975). Higher Algebra. Mir Publishers. p. 214.
  4. T. M. Apostol (1981). Mathematical Analysis. Addison-Wesley. p. 345.
  5. Shores 2007, p. 71.
  6. Gelfand 1961.

