In calculus (a branch of mathematics), a differentiable function of one real variable is a function whose derivative exists at each point in its domain. In other words, the graph of a differentiable function has a non-vertical tangent line at each interior point in its domain. A differentiable function is smooth (the function is locally well approximated as a linear function at each interior point) and does not contain any break, angle, or cusp.
More generally, for x0 as an interior point in the domain of a function f, then f is said to be differentiable at x0 if and only if the derivative f ′(x0) exists. In other words, the graph of f has a non-vertical tangent line at the point (x0, f(x0)). The function f is also called locally linear at x0 as it is well approximated by a linear function near this point.
A function f: U → R, defined on an open set U, is differentiable at a ∈ U if the derivative

f ′(a) = lim_{h→0} (f(a + h) − f(a)) / h

exists. This implies that the function is continuous at a.
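The limit defining the derivative can be illustrated numerically by evaluating the difference quotient for shrinking step sizes; a minimal Python sketch (the helper name is illustrative):

```python
def diff_quotient(f, a, h):
    """Forward difference quotient (f(a + h) - f(a)) / h."""
    return (f(a + h) - f(a)) / h

# For f(x) = x**2 at a = 3, the quotient approaches the derivative 2*a = 6
# as the step h shrinks.
f = lambda x: x * x
approximations = [diff_quotient(f, 3.0, 10.0 ** -k) for k in range(1, 6)]
```

As h decreases, the quotients approach 6, the slope of the tangent line at a = 3.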
The function f is differentiable on U if it is differentiable at every point of U. In this case, the derivative of f is thus a function from U into R.
A differentiable function is necessarily continuous (at every point where it is differentiable). It is continuously differentiable if its derivative is also a continuous function.
If f is differentiable at a point x0, then f must also be continuous at x0. In particular, any differentiable function must be continuous at every point in its domain. The converse does not hold: a continuous function need not be differentiable. For example, a function with a bend, cusp, or vertical tangent may be continuous, but fails to be differentiable at the location of the anomaly.
Most functions that occur in practice have derivatives at all points or at almost every point. However, a result of Stefan Banach states that the set of functions that have a derivative at some point is a meagre set in the space of all continuous functions. Informally, this means that differentiable functions are very atypical among continuous functions. The first known example of a function that is continuous everywhere but differentiable nowhere is the Weierstrass function.
A function f is said to be continuously differentiable if the derivative f′(x) exists and is itself a continuous function. Although the derivative of a differentiable function never has a jump discontinuity, it is possible for the derivative to have an essential discontinuity. For example, the function

f(x) = x² sin(1/x) if x ≠ 0, and f(0) = 0

is differentiable at 0, since

f ′(0) = lim_{h→0} (h² sin(1/h) − 0) / h = lim_{h→0} h sin(1/h) = 0

exists. However, for x ≠ 0, differentiation rules imply

f ′(x) = 2x sin(1/x) − cos(1/x),

which has no limit as x → 0. Nevertheless, Darboux's theorem implies that the derivative of any function satisfies the conclusion of the intermediate value theorem.
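The function f(x) = x² sin(1/x) (with f(0) = 0) is the standard instance of this behavior; a short Python sketch (helper names are illustrative) shows the difference quotient at 0 settling toward 0 while the derivative keeps oscillating nearby:

```python
import math

def f(x):
    """x**2 * sin(1/x) away from 0, extended by f(0) = 0."""
    return x * x * math.sin(1.0 / x) if x != 0 else 0.0

def fprime(x):
    """Derivative for x != 0, by the product and chain rules."""
    return 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x)

# The difference quotient at 0 is h*sin(1/h), which tends to 0 ...
quotients = [(f(h) - f(0)) / h for h in (1e-2, 1e-4, 1e-6)]
# ... while f'(x) oscillates between roughly -1 and 1 as x -> 0:
# at x = 1/(k*pi) the cosine term is (-1)**k.
samples = [fprime(1.0 / (k * math.pi)) for k in (100, 101, 102, 103)]
```

The samples alternate in sign near ±1, so f′ has no limit at 0 even though f ′(0) exists.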
Continuously differentiable functions are sometimes said to be of class C1. A function is of class C2 if the first and second derivatives of the function both exist and are continuous. More generally, a function is said to be of class Ck if the first k derivatives f′(x), f′′(x), ..., f (k)(x) all exist and are continuous. If derivatives f (n) exist for all positive integers n, the function is smooth, or equivalently, of class C∞.
A function of several real variables f: Rm → Rn is said to be differentiable at a point x0 if there exists a linear map J: Rm → Rn such that

lim_{h→0} ‖f(x0 + h) − f(x0) − J(h)‖ / ‖h‖ = 0.
If a function is differentiable at x0, then all of the partial derivatives exist at x0, and the linear map J is given by the Jacobian matrix. A similar formulation of the higher-dimensional derivative is provided by the fundamental increment lemma found in single-variable calculus.
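When the partial derivatives exist, the Jacobian matrix can be approximated column by column with finite differences; a minimal Python sketch (the helper name and step size are illustrative, not any particular library API):

```python
def jacobian_fd(f, x, h=1e-6):
    """Approximate the Jacobian of f: R^m -> R^n at x by forward differences."""
    fx = f(x)
    J = []
    for i in range(len(fx)):          # one row per output component
        row = []
        for j in range(len(x)):       # one column per input variable
            xh = list(x)
            xh[j] += h
            row.append((f(xh)[i] - fx[i]) / h)
        J.append(row)
    return J

# f(x, y) = (x*y, x + y) has Jacobian [[y, x], [1, 1]].
f = lambda v: [v[0] * v[1], v[0] + v[1]]
J = jacobian_fd(f, [2.0, 3.0])
```

At (2, 3) the approximation is close to [[3, 2], [1, 1]], the matrix of partial derivatives.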
If all the partial derivatives of a function exist in a neighborhood of a point x0 and are continuous at the point x0, then the function is differentiable at that point x0.
However, the existence of the partial derivatives (or even of all the directional derivatives) does not in general guarantee that a function is differentiable at a point. For example, the function f: R2 → R defined by

f(x, y) = x²y / (x⁴ + y²) if (x, y) ≠ (0, 0), and f(0, 0) = 0

is not differentiable at (0, 0), but all of the partial derivatives and directional derivatives exist at this point. For a continuous example, the function

f(x, y) = y³ / (x² + y²) if (x, y) ≠ (0, 0), and f(0, 0) = 0

is not differentiable at (0, 0), but again all of the partial derivatives and directional derivatives exist.
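A standard example of this phenomenon is f(x, y) = y³/(x² + y²), extended by 0 at the origin: it is continuous, and along the direction (cos t, sin t) its directional derivative at the origin is sin(t)³, which is not a linear function of the direction, so no linear map J can serve as a derivative there. A Python sketch (helper names are illustrative):

```python
import math

def f(x, y):
    """y**3 / (x**2 + y**2), extended by 0 at the origin."""
    return y ** 3 / (x ** 2 + y ** 2) if (x, y) != (0.0, 0.0) else 0.0

def directional_derivative(theta, h=1e-7):
    """Difference quotient at the origin along (cos(theta), sin(theta))."""
    c, s = math.cos(theta), math.sin(theta)
    return (f(h * c, h * s) - f(0.0, 0.0)) / h

# Along (0, 1) the derivative is 1; along (cos 30deg, 1/2) it is (1/2)**3.
d_up = directional_derivative(math.pi / 2)
d_30 = directional_derivative(math.pi / 6)
```

Since sin(t)³ is not linear in the direction vector, the directional derivatives cannot come from a single linear map, confirming non-differentiability at (0, 0).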
In complex analysis, complex-differentiability is defined using the same definition as single-variable real functions. This is allowed by the possibility of dividing complex numbers. So, a function f: C → C is said to be differentiable at z = a when

f ′(a) = lim_{h→0} (f(a + h) − f(a)) / h

exists.
Although this definition looks similar to the differentiability of single-variable real functions, it is a more restrictive condition. A function f: C → C that is complex-differentiable at a point a is automatically differentiable at that point, when viewed as a function from R2 to R2. This is because the complex-differentiability implies that

lim_{h→0} |f(a + h) − f(a) − f ′(a)h| / |h| = 0.
However, a function can be differentiable as a multi-variable function, while not being complex-differentiable. For example, the complex conjugate function f(z) = z̄ is differentiable at every point, viewed as the 2-variable real function f(x, y) = (x, −y), but it is not complex-differentiable at any point.
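The failure of complex-differentiability for the conjugate z ↦ z̄ is visible directly in the difference quotient, which equals h̄/h and so depends on the direction from which h approaches 0; a short Python sketch:

```python
def quotient(z0, h):
    """Difference quotient of z -> conj(z) at z0 with complex step h."""
    f = lambda z: z.conjugate()
    return (f(z0 + h) - f(z0)) / h

# The quotient is conj(h)/h: approaching along the real axis gives 1,
# along the imaginary axis gives -1, so the complex derivative does not exist.
along_real = quotient(1 + 2j, 1e-8 + 0j)
along_imag = quotient(1 + 2j, 1e-8j)
```

Two different limiting values along two directions rule out a complex derivative at the point, even though the underlying map (x, y) ↦ (x, −y) is linear and hence real-differentiable everywhere.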
Any function that is complex-differentiable in a neighborhood of a point is called holomorphic at that point. Such a function is necessarily infinitely differentiable, and in fact analytic.
If M is a differentiable manifold, a real or complex-valued function f on M is said to be differentiable at a point p if it is differentiable with respect to some (or any) coordinate chart defined around p. More generally, if M and N are differentiable manifolds, a function f: M → N is said to be differentiable at a point p if it is differentiable with respect to some (or any) coordinate charts defined around p and f(p).
In calculus, the chain rule is a formula to compute the derivative of a composite function. That is, if f and g are differentiable functions, then the chain rule expresses the derivative of their composite f ∘ g — the function which maps x to f(g(x)) — in terms of the derivatives of f and g and the product of functions as follows:

(f ∘ g)′(x) = f ′(g(x)) · g′(x).
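The chain rule can be checked numerically by comparing a direct difference quotient of the composite against the product of the factors' derivatives; a minimal Python sketch (helper names are illustrative):

```python
import math

def fd(f, x, h=1e-6):
    """Central difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = math.sin
g = lambda x: x ** 2
composite = lambda x: f(g(x))

x = 0.7
lhs = fd(composite, x)            # (f o g)'(x), measured directly
rhs = fd(f, g(x)) * fd(g, x)      # f'(g(x)) * g'(x), per the chain rule
```

Both sides approximate 2x·cos(x²), the derivative of sin(x²).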
In mathematics, a continuous function is a function that does not have any abrupt changes in value, known as discontinuities. More precisely, a function is continuous if arbitrarily small changes in its output can be ensured by restricting to sufficiently small changes in its input. If not continuous, a function is said to be discontinuous. Until the 19th century, mathematicians largely relied on intuitive notions of continuity; during that century, attempts such as the epsilon–delta definition were made to formalize it.
In mathematics, the derivative of a function of a real variable measures the sensitivity to change of the function value with respect to a change in its argument. Derivatives are a fundamental tool of calculus. For example, the derivative of the position of a moving object with respect to time is the object's velocity: this measures how quickly the position of the object changes when time advances.
In mathematics, real analysis is the branch of mathematical analysis that studies the behavior of real numbers, sequences and series of real numbers, and real functions. Some particular properties of real-valued sequences and functions that real analysis studies include convergence, limits, continuity, smoothness, differentiability and integrability.
In mathematics, a function f is uniformly continuous if, roughly speaking, it is possible to guarantee that f(x) and f(y) be as close to each other as we please by requiring only that x and y be sufficiently close to each other; unlike ordinary continuity, where the maximum distance between f(x) and f(y) may depend on x and y themselves.
In mathematics, the Dirac delta function is a generalized function or distribution introduced by physicist Paul Dirac. It is used to model the density of an idealized point mass or point charge as a function equal to zero everywhere except at zero and whose integral over the entire real line is equal to one. As there is no function that has these properties, the computations made by theoretical physicists appeared to mathematicians as nonsense until the introduction of distributions by Laurent Schwartz to formalize and validate the computations. As a distribution, the Dirac delta function is a linear functional that maps every function to its value at zero. The Kronecker delta function, which is usually defined on a discrete domain and takes values 0 and 1, is a discrete analog of the Dirac delta function.
In the calculus of variations, a field of mathematical analysis, the functional derivative relates a change in a functional to a change in a function on which the functional depends.
In mathematics, the Cauchy principal value, named after Augustin Louis Cauchy, is a method for assigning values to certain improper integrals which would otherwise be undefined.
In mathematics, the symmetry of second derivatives refers to the possibility, under certain conditions, of interchanging the order of taking partial derivatives of a function of several variables.
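The symmetry can be observed numerically by estimating the two mixed partials with nested central differences; a Python sketch (helper names and step size are illustrative):

```python
def partial_x(f, x, y, h=1e-4):
    """Central difference estimate of df/dx at (x, y)."""
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def partial_y(f, x, y, h=1e-4):
    """Central difference estimate of df/dy at (x, y)."""
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

# f(x, y) = x**3 * y**2 + x*y has f_xy = f_yx = 6*x**2*y + 1 (= 13 at (1, 2)).
f = lambda x, y: x ** 3 * y ** 2 + x * y
h = 1e-4
f_xy = (partial_x(f, 1.0, 2.0 + h) - partial_x(f, 1.0, 2.0 - h)) / (2 * h)
f_yx = (partial_y(f, 1.0 + h, 2.0) - partial_y(f, 1.0 - h, 2.0)) / (2 * h)
```

For this smooth polynomial both orders of differentiation agree, as the symmetry theorem predicts.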
In mathematics, more specifically in multivariable calculus, the implicit function theorem is a tool that allows relations to be converted to functions of several real variables. It does so by representing the relation as the graph of a function. There may not be a single function whose graph can represent the entire relation, but there may be such a function on a restriction of the domain of the relation. The implicit function theorem gives a sufficient condition to ensure that there is such a function.
The Arzelà–Ascoli theorem is a fundamental result of mathematical analysis giving necessary and sufficient conditions to decide whether every sequence of a given family of real-valued continuous functions defined on a closed and bounded interval has a uniformly convergent subsequence. The main condition is the equicontinuity of the family of functions. The theorem is the basis of many proofs in mathematics, including that of the Peano existence theorem in the theory of ordinary differential equations, Montel's theorem in complex analysis, and the Peter–Weyl theorem in harmonic analysis, as well as various results concerning compactness of integral operators.
In mathematics and computer algebra, automatic differentiation (AD), also called algorithmic differentiation, computational differentiation, auto-differentiation, or simply autodiff, is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations and elementary functions. By applying the chain rule repeatedly to these operations, derivatives of arbitrary order can be computed automatically, accurately to working precision, and using at most a small constant factor more arithmetic operations than the original program.
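Forward-mode automatic differentiation can be sketched with dual numbers, pairs a + b·ε with ε² = 0, where the ε-coefficient propagates the derivative through each elementary operation via the chain rule. A minimal Python sketch (the class and function names are illustrative, not any particular AD library):

```python
class Dual:
    """Dual number value + deriv*eps with eps**2 = 0; deriv tracks d/dx."""
    def __init__(self, value, deriv):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (u*v)' = u*v' + u'*v.
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f on a dual number seeded with derivative 1 at x."""
    return f(Dual(x, 1.0)).deriv

# d/dx (3*x*x + 2*x) at x = 4 is 6*x + 2 = 26, exact to working precision.
d = derivative(lambda x: 3 * x * x + 2 * x, 4.0)
```

Unlike finite differences, the result carries no truncation error: each arithmetic step applies the exact differentiation rule.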
In mathematics, the total derivative of a function f at a point is the best linear approximation near this point of the function with respect to its arguments. Unlike partial derivatives, the total derivative approximates the function with respect to all of its arguments, not just a single one. In many situations, this is the same as considering all partial derivatives simultaneously. The term "total derivative" is primarily used when f is a function of several variables, because when f is a function of a single variable, the total derivative is the same as the derivative of the function.
In mathematical analysis, and applications in geometry, applied mathematics, engineering, and natural sciences, a function of a real variable is a function whose domain is the real numbers ℝ, or a subset of ℝ that contains an interval of positive length. Most real functions that are considered and studied are differentiable in some interval. The most widely considered such functions are the real functions, which are the real-valued functions of a real variable, that is, the functions of a real variable whose codomain is the set of real numbers.
In mathematics, the derivative is a fundamental construction of differential calculus and admits many possible generalizations within the fields of mathematical analysis, combinatorics, algebra, and geometry.
In calculus, Leibniz's rule for differentiation under the integral sign, named after Gottfried Leibniz, states that for an integral of the form

∫_{a(x)}^{b(x)} f(x, t) dt,

the derivative with respect to x is given by

d/dx ∫_{a(x)}^{b(x)} f(x, t) dt = f(x, b(x)) · b′(x) − f(x, a(x)) · a′(x) + ∫_{a(x)}^{b(x)} ∂f/∂x (x, t) dt.
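As an instance of the rule, for F(x) = ∫₀ˣ sin(xt) dt it gives F′(x) = sin(x²) + ∫₀ˣ t·cos(xt) dt; this can be verified numerically in Python (the Simpson-rule helper is an illustrative choice of quadrature):

```python
import math

def simpson(f, a, b, n=200):
    """Composite Simpson rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for k in range(1, n):
        s += (4 if k % 2 else 2) * f(a + k * h)
    return s * h / 3

def F(x):
    """F(x) = integral from 0 to x of sin(x*t) dt."""
    return simpson(lambda t: math.sin(x * t), 0.0, x)

x, h = 1.3, 1e-5
numeric = (F(x + h) - F(x - h)) / (2 * h)               # direct F'(x)
leibniz = math.sin(x * x) + simpson(lambda t: t * math.cos(x * t), 0.0, x)
```

The boundary term sin(x²) comes from the moving upper limit, and the remaining integral from differentiating the integrand in x.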
In mathematics, the Fréchet derivative is a derivative defined on Banach spaces. Named after Maurice Fréchet, it is commonly used to generalize the derivative of a real-valued function of a single real variable to the case of a vector-valued function of multiple real variables, and to define the functional derivative used widely in the calculus of variations.
In calculus, a branch of mathematics, the notions of one-sided differentiability and semi-differentiability of a real-valued function f of a real variable are weaker than differentiability. Specifically, the function f is said to be right differentiable at a point a if, roughly speaking, a derivative can be defined as the function's argument x moves to a from the right, and left differentiable at a if the derivative can be defined as x moves to a from the left.
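The absolute value function at 0 illustrates semi-differentiability: both one-sided derivatives exist but disagree, so the ordinary derivative does not. A short Python sketch (the helper name is illustrative):

```python
def one_sided(f, a, h=1e-8, side="right"):
    """One-sided difference quotient of f at a; side is 'right' or 'left'."""
    step = h if side == "right" else -h
    return (f(a + step) - f(a)) / step

f = abs
right = one_sided(f, 0.0, side="right")   # slope approaching from the right
left = one_sided(f, 0.0, side="left")     # slope approaching from the left
```

The right derivative is +1 and the left derivative is −1; since they differ, |x| is semi-differentiable but not differentiable at 0.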
In mathematics, Fermat's theorem is a method to find local maxima and minima of differentiable functions on open sets by showing that every local extremum of the function is a stationary point. Fermat's theorem is a theorem in real analysis, named after Pierre de Fermat.
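Fermat's theorem can be illustrated numerically: sin attains a local maximum at π/2, an interior point of (0, π), so its derivative there should vanish. A Python sketch (the difference-quotient helper is illustrative):

```python
import math

def fd(f, x, h=1e-6):
    """Central difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# sin has a local maximum at pi/2; Fermat's theorem predicts sin'(pi/2) = 0.
slope = fd(math.sin, math.pi / 2)
```

The estimated slope is numerically indistinguishable from 0, consistent with π/2 being a stationary point.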
In mathematical analysis, and applications in geometry, applied mathematics, engineering, natural sciences, and economics, a function of several real variables or real multivariate function is a function with more than one argument, with all arguments being real variables. This concept extends the idea of a function of a real variable to several variables. The "input" variables take real values, while the "output", also called the "value of the function", may be real or complex. However, the study of complex-valued functions may be easily reduced to the study of real-valued functions, by considering the real and imaginary parts of the complex function; therefore, unless explicitly specified, only real-valued functions will be considered in this article.