# Differentiation rules

This is a summary of differentiation rules, that is, rules for computing the derivative of a function in calculus.

## Elementary rules of differentiation

Unless otherwise stated, all functions are functions of real numbers (R) that return real values; although more generally, the formulae below apply wherever they are well defined [1] [2] — including the case of complex numbers (C). [3]

### Differentiation is linear

For any functions ${\displaystyle f}$ and ${\displaystyle g}$ and any real numbers ${\displaystyle a}$ and ${\displaystyle b}$, the derivative of the function ${\displaystyle h(x)=af(x)+bg(x)}$ with respect to ${\displaystyle x}$ is

${\displaystyle h'(x)=af'(x)+bg'(x).}$

In Leibniz's notation this is written as:

${\displaystyle {\frac {d(af+bg)}{dx}}=a{\frac {df}{dx}}+b{\frac {dg}{dx}}.}$

Special cases include:

• The constant factor rule
${\displaystyle (af)'=af'}$
• The sum rule
${\displaystyle (f+g)'=f'+g'}$
• The subtraction rule
${\displaystyle (f-g)'=f'-g'.}$
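The linearity rule is easy to check numerically with a central-difference approximation (a minimal Python sketch; the choice f = sin, g = exp and the sample point are illustrative):

```python
import math

def deriv(fn, x, h=1e-6):
    # central-difference approximation of fn'(x)
    return (fn(x + h) - fn(x - h)) / (2 * h)

a, b, x0 = 2.0, 3.0, 1.0
h_func = lambda x: a * math.sin(x) + b * math.exp(x)  # h = a f + b g

lhs = deriv(h_func, x0)                    # numerical h'(x0)
rhs = a * math.cos(x0) + b * math.exp(x0)  # a f'(x0) + b g'(x0)
assert abs(lhs - rhs) < 1e-6
```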

### The product rule

For the functions f and g, the derivative of the function h(x) = f(x) g(x) with respect to x is

${\displaystyle h'(x)=(fg)'(x)=f'(x)g(x)+f(x)g'(x).}$

In Leibniz's notation this is written

${\displaystyle {\frac {d(fg)}{dx}}={\frac {df}{dx}}g+f{\frac {dg}{dx}}.}$
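A quick numerical check of the product rule (plain Python; f = sin and g = exp are illustrative choices):

```python
import math

def deriv(fn, x, h=1e-6):
    # central-difference approximation of fn'(x)
    return (fn(x + h) - fn(x - h)) / (2 * h)

x0 = 0.7
numeric = deriv(lambda x: math.sin(x) * math.exp(x), x0)
# product rule: f'(x) g(x) + f(x) g'(x)
formula = math.cos(x0) * math.exp(x0) + math.sin(x0) * math.exp(x0)
assert abs(numeric - formula) < 1e-6
```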

### The chain rule

The derivative of the function ${\displaystyle h(x)=f(g(x))}$ is

${\displaystyle h'(x)=f'(g(x))\cdot g'(x).}$

In Leibniz's notation, this is written as:

${\displaystyle {\frac {d}{dx}}h(x)={\frac {d}{dz}}f(z)|_{z=g(x)}\cdot {\frac {d}{dx}}g(x),}$

often abridged to

${\displaystyle {\frac {dh(x)}{dx}}={\frac {df(g(x))}{dg(x)}}\cdot {\frac {dg(x)}{dx}}.}$

Viewing the functions as maps and the differential as a map ${\displaystyle {\text{D}}}$, this is written more concisely as:

${\displaystyle [{\text{D}}(h\circ g)]_{x}=[{\text{D}}h]_{g(x)}\cdot [{\text{D}}g]_{x}\,.}$
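The chain rule can likewise be checked numerically (an illustrative sketch with f = sin and g(x) = x²):

```python
import math

def deriv(fn, x, h=1e-6):
    # central-difference approximation of fn'(x)
    return (fn(x + h) - fn(x - h)) / (2 * h)

x0 = 1.3
numeric = deriv(lambda x: math.sin(x * x), x0)  # h(x) = f(g(x)) with g(x) = x**2
formula = math.cos(x0 * x0) * 2 * x0            # f'(g(x)) * g'(x)
assert abs(numeric - formula) < 1e-6
```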

### The inverse function rule

If the function f has an inverse function g, meaning that ${\displaystyle g(f(x))=x}$ and ${\displaystyle f(g(y))=y,}$ then

${\displaystyle g'={\frac {1}{f'\circ g}}.}$

In Leibniz notation, this is written as

${\displaystyle {\frac {dx}{dy}}={\frac {1}{\frac {dy}{dx}}}.}$
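As a concrete instance of the inverse function rule, take f = exp with inverse g = ln; the rule predicts g'(y) = 1/f'(g(y)) = 1/y (a minimal Python check at an illustrative point):

```python
import math

def deriv(fn, x, h=1e-6):
    # central-difference approximation of fn'(x)
    return (fn(x + h) - fn(x - h)) / (2 * h)

y0 = 2.5
numeric = deriv(math.log, y0)           # g' computed directly
formula = 1 / math.exp(math.log(y0))    # 1 / f'(g(y)) = 1 / y
assert abs(numeric - formula) < 1e-6
```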

## Power laws, polynomials, quotients, and reciprocals

### The polynomial or elementary power rule

If ${\displaystyle f(x)=x^{r}}$, for any real number ${\displaystyle r\neq 0,}$ then

${\displaystyle f'(x)=rx^{r-1}.}$

When ${\displaystyle r=1,}$ this becomes the special case that if ${\displaystyle f(x)=x,}$ then ${\displaystyle f'(x)=1.}$

Combining the power rule with the sum and constant multiple rules permits the computation of the derivative of any polynomial.
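For example, differentiating a cubic term by term with the power rule and linearity (an illustrative Python sketch):

```python
def deriv(fn, x, h=1e-6):
    # central-difference approximation of fn'(x)
    return (fn(x + h) - fn(x - h)) / (2 * h)

p = lambda x: 4 * x**3 - 2 * x + 7   # polynomial
dp = lambda x: 12 * x**2 - 2         # term-by-term: power rule + sum/constant rules

x0 = 1.5
assert abs(deriv(p, x0) - dp(x0)) < 1e-6
```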

### The reciprocal rule

The derivative of ${\displaystyle h(x)={\frac {1}{f(x)}}}$ for any function f is

${\displaystyle h'(x)=-{\frac {f'(x)}{(f(x))^{2}}}}$ wherever f is nonzero.

In Leibniz's notation, this is written

${\displaystyle {\frac {d(1/f)}{dx}}=-{\frac {1}{f^{2}}}{\frac {df}{dx}}.}$

The reciprocal rule can be derived either from the quotient rule, or from the combination of power rule and chain rule.

### The quotient rule

If f and g are functions, then:

${\displaystyle \left({\frac {f}{g}}\right)'={\frac {f'g-g'f}{g^{2}}}\quad }$ wherever g is nonzero.

This can be derived from the product rule and the reciprocal rule.
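A numerical check of the quotient rule (plain Python; taking f = sin, g = cos, so that f/g = tan, is an illustrative choice):

```python
import math

def deriv(fn, x, h=1e-6):
    # central-difference approximation of fn'(x)
    return (fn(x + h) - fn(x - h)) / (2 * h)

x0 = 0.5
f, g = math.sin, math.cos
numeric = deriv(lambda x: f(x) / g(x), x0)
# quotient rule: (f'g - g'f) / g**2, with f' = cos and g' = -sin
formula = (math.cos(x0) * g(x0) - (-math.sin(x0)) * f(x0)) / g(x0) ** 2
assert abs(numeric - formula) < 1e-6
```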

### Generalized power rule

The elementary power rule generalizes considerably. The most general power rule is the functional power rule: for any functions f and g,

${\displaystyle (f^{g})'=\left(e^{g\ln f}\right)'=f^{g}\left(f'{g \over f}+g'\ln f\right),\quad }$

wherever both sides are well defined. [4]

Special cases include:

• If ${\textstyle f(x)=x^{a}\!}$, then ${\textstyle f'(x)=ax^{a-1}}$ when a is any non-zero real number and x is positive.
• The reciprocal rule may be derived as the special case where ${\textstyle g(x)=-1\!}$.
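A numerical check of the functional power rule (plain Python; f(x) = 1 + x², which is positive everywhere, and g = sin are illustrative choices):

```python
import math

f = lambda x: 1 + x * x       # f(x) > 0, so f**g is well defined
fp = lambda x: 2 * x          # f'
g, gp = math.sin, math.cos    # g and g'

h_func = lambda x: f(x) ** g(x)   # equivalently exp(g * ln f)

x0, step = 0.8, 1e-6
numeric = (h_func(x0 + step) - h_func(x0 - step)) / (2 * step)
# functional power rule: f**g * (f' * g / f + g' * ln f)
formula = h_func(x0) * (fp(x0) * g(x0) / f(x0) + gp(x0) * math.log(f(x0)))
assert abs(numeric - formula) < 1e-5
```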

## Derivatives of exponential and logarithmic functions

${\displaystyle {\frac {d}{dx}}\left(c^{ax}\right)={ac^{ax}\ln c},\qquad c>0}$

The equation above holds for all c, but for ${\textstyle c<0}$ the derivative yields a complex number.

${\displaystyle {\frac {d}{dx}}\left(e^{ax}\right)=ae^{ax}}$
${\displaystyle {\frac {d}{dx}}\left(\log _{c}x\right)={1 \over x\ln c},\qquad c>0,c\neq 1}$

The equation above also holds for all c, but yields a complex number if ${\textstyle c<0\!}$.

${\displaystyle {\frac {d}{dx}}\left(\ln x\right)={1 \over x},\qquad x>0.}$
${\displaystyle {\frac {d}{dx}}\left(\ln |x|\right)={1 \over x}.}$
${\displaystyle {\frac {d}{dx}}\left(x^{x}\right)=x^{x}(1+\ln x).}$
${\displaystyle {\frac {d}{dx}}\left(f(x)^{g(x)}\right)=g(x)f(x)^{g(x)-1}{\frac {df}{dx}}+f(x)^{g(x)}\ln {(f(x))}{\frac {dg}{dx}},\qquad {\text{if }}f(x)>0,{\text{ and if }}{\frac {df}{dx}}{\text{ and }}{\frac {dg}{dx}}{\text{ exist.}}}$
${\displaystyle {\frac {d}{dx}}\left(f_{1}(x)^{f_{2}(x)^{\left(...\right)^{f_{n}(x)}}}\right)=\left[\sum \limits _{k=1}^{n}{\frac {\partial }{\partial x_{k}}}\left(f_{1}(x_{1})^{f_{2}(x_{2})^{\left(...\right)^{f_{n}(x_{n})}}}\right)\right]{\biggr \vert }_{x_{1}=x_{2}=\cdots =x_{n}=x},\qquad {\text{if }}f_{i}>0{\text{ and }}{\frac {df_{i}}{dx}}{\text{ exists.}}}$
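Some of the formulas above can be checked numerically (plain Python; the constants c = 2, a = 3 and the sample points are illustrative):

```python
import math

def deriv(fn, x, h=1e-6):
    # central-difference approximation of fn'(x)
    return (fn(x + h) - fn(x - h)) / (2 * h)

c, a = 2.0, 3.0

# d/dx c**(a*x) = a * c**(a*x) * ln c
assert abs(deriv(lambda x: c ** (a * x), 0.4) - a * c ** (a * 0.4) * math.log(c)) < 1e-5

# d/dx log_c(x) = 1 / (x * ln c)
assert abs(deriv(lambda x: math.log(x, c), 1.7) - 1 / (1.7 * math.log(c))) < 1e-6

# d/dx x**x = x**x * (1 + ln x)
assert abs(deriv(lambda x: x ** x, 1.7) - 1.7 ** 1.7 * (1 + math.log(1.7))) < 1e-5
```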

### Logarithmic derivatives

The logarithmic derivative is another way of stating the rule for differentiating the logarithm of a function (using the chain rule):

${\displaystyle (\ln f)'={\frac {f'}{f}}\quad }$ wherever f is positive.

Logarithmic differentiation is a technique which uses logarithms and its differentiation rules to simplify certain expressions before actually applying the derivative. Logarithms can be used to remove exponents, convert products into sums, and convert division into subtraction — each of which may lead to a simplified expression for taking derivatives.
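For example, for y(x) = x³ sin(x)/eˣ on an interval where each factor is positive, ln y = 3 ln x + ln sin x − x, so y'/y = 3/x + cot x − 1 (an illustrative Python check):

```python
import math

y = lambda x: x**3 * math.sin(x) / math.exp(x)

x0, h = 1.2, 1e-6
numeric = (y(x0 + h) - y(x0 - h)) / (2 * h)   # direct numerical y'(x0)

# logarithmic derivative: y'/y = 3/x + cot x - 1
log_deriv = 3 / x0 + math.cos(x0) / math.sin(x0) - 1
assert abs(numeric - y(x0) * log_deriv) < 1e-6
```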

## Derivatives of trigonometric functions

| Function | Inverse function |
|---|---|
| ${\displaystyle (\sin x)'=\cos x}$ | ${\displaystyle (\arcsin x)'={1 \over {\sqrt {1-x^{2}}}}}$ |
| ${\displaystyle (\cos x)'=-\sin x}$ | ${\displaystyle (\arccos x)'=-{1 \over {\sqrt {1-x^{2}}}}}$ |
| ${\displaystyle (\tan x)'=\sec ^{2}x={1 \over \cos ^{2}x}=1+\tan ^{2}x}$ | ${\displaystyle (\arctan x)'={1 \over 1+x^{2}}}$ |
| ${\displaystyle (\cot x)'=-\csc ^{2}x=-{1 \over \sin ^{2}x}=-(1+\cot ^{2}x)}$ | ${\displaystyle (\operatorname {arccot} x)'=-{1 \over 1+x^{2}}}$ |
| ${\displaystyle (\sec x)'=\tan x\sec x}$ | ${\displaystyle (\operatorname {arcsec} x)'={1 \over \vert x\vert {\sqrt {x^{2}-1}}}}$ |
| ${\displaystyle (\csc x)'=-\cot x\csc x}$ | ${\displaystyle (\operatorname {arccsc} x)'=-{1 \over \vert x\vert {\sqrt {x^{2}-1}}}}$ |

It is common to additionally define an inverse tangent function with two arguments, ${\displaystyle \arctan(y,x)\!}$. Its value lies in the range ${\displaystyle (-\pi ,\pi ]\!}$ and reflects the quadrant of the point ${\displaystyle (x,y)\!}$. For the first and fourth quadrants (i.e. ${\displaystyle x>0\!}$) one has ${\displaystyle \arctan(y,x>0)=\arctan(y/x)\!}$. Its partial derivatives are

 ${\displaystyle {\frac {\partial \arctan(y,x)}{\partial y}}={\frac {x}{x^{2}+y^{2}}}}$, and ${\displaystyle {\frac {\partial \arctan(y,x)}{\partial x}}={\frac {-y}{x^{2}+y^{2}}}.}$
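These partial derivatives can be checked with the standard library's `math.atan2` (which takes its arguments in the order y, x); the sample point (x, y) = (1, 2) is illustrative:

```python
import math

x, y, h = 1.0, 2.0, 1e-6

# central differences in each argument of atan2(y, x)
d_dy = (math.atan2(y + h, x) - math.atan2(y - h, x)) / (2 * h)
d_dx = (math.atan2(y, x + h) - math.atan2(y, x - h)) / (2 * h)

assert abs(d_dy - x / (x * x + y * y)) < 1e-6      # x / (x^2 + y^2) = 0.2
assert abs(d_dx - (-y) / (x * x + y * y)) < 1e-6   # -y / (x^2 + y^2) = -0.4
```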

## Derivatives of hyperbolic functions

| Function | Inverse function |
|---|---|
| ${\displaystyle (\sinh x)'=\cosh x={\frac {e^{x}+e^{-x}}{2}}}$ | ${\displaystyle (\operatorname {arsinh} \,x)'={1 \over {\sqrt {x^{2}+1}}}}$ |
| ${\displaystyle (\cosh x)'=\sinh x={\frac {e^{x}-e^{-x}}{2}}}$ | ${\displaystyle (\operatorname {arcosh} \,x)'={\frac {1}{\sqrt {x^{2}-1}}}}$ |
| ${\displaystyle (\tanh x)'=\operatorname {sech} ^{2}\,x}$ | ${\displaystyle (\operatorname {artanh} \,x)'={1 \over 1-x^{2}}}$ |
| ${\displaystyle (\coth x)'=-\operatorname {csch} ^{2}\,x}$ | ${\displaystyle (\operatorname {arcoth} \,x)'={1 \over 1-x^{2}}}$ |
| ${\displaystyle (\operatorname {sech} \,x)'=-\tanh x\,\operatorname {sech} \,x}$ | ${\displaystyle (\operatorname {arsech} \,x)'=-{1 \over x{\sqrt {1-x^{2}}}}}$ |
| ${\displaystyle (\operatorname {csch} \,x)'=-\coth x\,\operatorname {csch} \,x}$ | ${\displaystyle (\operatorname {arcsch} \,x)'=-{1 \over \vert x\vert {\sqrt {1+x^{2}}}}}$ |

See Hyperbolic functions for restrictions on these derivatives.

## Derivatives of special functions

Gamma function: ${\displaystyle \Gamma (x)=\int _{0}^{\infty }t^{x-1}e^{-t}\,dt}$

${\displaystyle \Gamma '(x)=\int _{0}^{\infty }t^{x-1}e^{-t}\ln t\,dt=\Gamma (x)\left(\sum _{n=1}^{\infty }\left(\ln \left(1+{\dfrac {1}{n}}\right)-{\dfrac {1}{x+n}}\right)-{\dfrac {1}{x}}\right)=\Gamma (x)\psi (x),}$

where ${\displaystyle \psi (x)}$ is the digamma function.

Riemann zeta function: ${\displaystyle \zeta (x)=\sum _{n=1}^{\infty }{\frac {1}{n^{x}}}}$

${\displaystyle \zeta '(x)=-\sum _{n=1}^{\infty }{\frac {\ln n}{n^{x}}}=-{\frac {\ln 2}{2^{x}}}-{\frac {\ln 3}{3^{x}}}-{\frac {\ln 4}{4^{x}}}-\cdots =-\sum _{p{\text{ prime}}}{\frac {p^{-x}\ln p}{(1-p^{-x})^{2}}}\prod _{q{\text{ prime}},q\neq p}{\frac {1}{1-q^{-x}}}}$

## Derivatives of integrals

Suppose that it is required to differentiate with respect to x the function

${\displaystyle F(x)=\int _{a(x)}^{b(x)}f(x,t)\,dt,}$

where the functions ${\displaystyle f(x,t)}$ and ${\displaystyle {\frac {\partial }{\partial x}}\,f(x,t)}$ are both continuous in both ${\displaystyle t}$ and ${\displaystyle x}$ in some region of the ${\displaystyle (t,x)}$ plane, including ${\displaystyle a(x)\leq t\leq b(x),}$${\displaystyle x_{0}\leq x\leq x_{1}}$, and the functions ${\displaystyle a(x)}$ and ${\displaystyle b(x)}$ are both continuous and both have continuous derivatives for ${\displaystyle x_{0}\leq x\leq x_{1}}$. Then for ${\displaystyle \,x_{0}\leq x\leq x_{1}}$:

${\displaystyle F'(x)=f(x,b(x))\,b'(x)-f(x,a(x))\,a'(x)+\int _{a(x)}^{b(x)}{\frac {\partial }{\partial x}}\,f(x,t)\;dt\,.}$

This formula is the general form of the Leibniz integral rule and can be derived using the fundamental theorem of calculus.
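A numerical check of the Leibniz integral rule (plain Python with composite Simpson integration; the choices f(x,t) = x t², a(x) = x, b(x) = x² and the sample point are illustrative):

```python
def simpson(g, a, b, n=2000):
    # composite Simpson's rule on [a, b]
    w = (b - a) / n
    s = g(a) + g(b) + sum((4 if i % 2 else 2) * g(a + i * w) for i in range(1, n))
    return s * w / 3

# F(x) = integral from a(x)=x to b(x)=x**2 of f(x,t) = x*t**2 dt
f = lambda x, t: x * t * t

def F(x):
    return simpson(lambda t: f(x, t), x, x * x)

x0, h = 1.5, 1e-5
numeric = (F(x0 + h) - F(x0 - h)) / (2 * h)   # direct numerical F'(x0)

# Leibniz rule: f(x,b(x)) b'(x) - f(x,a(x)) a'(x) + integral of df/dx = t**2
leibniz = (f(x0, x0 * x0) * 2 * x0
           - f(x0, x0) * 1
           + simpson(lambda t: t * t, x0, x0 * x0))
assert abs(numeric - leibniz) < 1e-4
```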

## Derivatives to nth order

Some rules exist for computing the n-th derivative of functions, where n is a positive integer. These include:

### Faà di Bruno's formula

If f and g are n-times differentiable, then

${\displaystyle {\frac {d^{n}}{dx^{n}}}[f(g(x))]=n!\sum _{\{k_{m}\}}f^{(r)}(g(x))\prod _{m=1}^{n}{\frac {1}{k_{m}!}}\left({\frac {g^{(m)}(x)}{m!}}\right)^{k_{m}}}$

where ${\displaystyle r=\sum _{m=1}^{n}k_{m}}$ and the set ${\displaystyle \{k_{m}\}}$ consists of all non-negative integer solutions of the Diophantine equation ${\displaystyle \sum _{m=1}^{n}mk_{m}=n}$.
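Faà di Bruno's formula can be verified on a concrete case. The sketch below (plain Python; the choices f = exp, g(x) = x², n = 3 are illustrative) enumerates the solutions of Σ m·kₘ = n and uses the standard form of the formula, in which each factor carries 1/(kₘ! (m!)^kₘ); the result is compared with the known third derivative of e^{x²}:

```python
import math
from itertools import product as iproduct

def faa_di_bruno_exp_of_square(x, n=3):
    # n-th derivative of exp(x**2) via Faa di Bruno's formula
    g_derivs = [2 * x, 2.0, 0.0]   # g', g'', g''' for g(x) = x**2
    total = 0.0
    # enumerate all non-negative tuples (k_1, ..., k_n) with sum(m * k_m) == n
    for ks in iproduct(*(range(n + 1) for _ in range(n))):
        if sum((m + 1) * k for m, k in enumerate(ks)) != n:
            continue
        # f = exp, so f^(r) = exp for every r
        term = math.factorial(n) * math.exp(x * x)
        for m, k in enumerate(ks, start=1):
            term *= (g_derivs[m - 1] / math.factorial(m)) ** k / math.factorial(k)
        total += term
    return total

x0 = 0.9
# known closed form: d^3/dx^3 exp(x^2) = exp(x^2) * (8 x^3 + 12 x)
closed_form = math.exp(x0 * x0) * (8 * x0**3 + 12 * x0)
assert abs(faa_di_bruno_exp_of_square(x0) - closed_form) < 1e-9
```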

### General Leibniz rule

If f and g are n-times differentiable, then

${\displaystyle {\frac {d^{n}}{dx^{n}}}[f(x)g(x)]=\sum _{k=0}^{n}{\binom {n}{k}}{\frac {d^{n-k}}{dx^{n-k}}}f(x){\frac {d^{k}}{dx^{k}}}g(x)}$
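The general Leibniz rule can be checked on polynomials, where every derivative is exact (an illustrative Python sketch with f = x³, g = x², n = 3, so that (fg)⁽³⁾ = (x⁵)⁽³⁾ = 60x²):

```python
import math

def poly_deriv(c, k=1):
    # k-th derivative of a polynomial with coefficients c (c[i] is the x**i coefficient)
    for _ in range(k):
        c = [i * c[i] for i in range(1, len(c))]
    return c or [0]

def poly_eval(c, x):
    return sum(coef * x**i for i, coef in enumerate(c))

f = [0, 0, 0, 1]   # x**3
g = [0, 0, 1]      # x**2
n, x0 = 3, 2.0

# general Leibniz rule: (fg)^(n) = sum_k C(n,k) f^(n-k) g^(k)
leibniz = sum(math.comb(n, k)
              * poly_eval(poly_deriv(f, n - k), x0)
              * poly_eval(poly_deriv(g, k), x0)
              for k in range(n + 1))

direct = 60 * x0**2   # third derivative of x**5
assert abs(leibniz - direct) < 1e-9
```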


## References

1. Calculus (5th edition), F. Ayres, E. Mendelson, Schaum's Outline Series, 2009, ISBN 978-0-07-150861-2.
2. Advanced Calculus (3rd edition), R. Wrede, M.R. Spiegel, Schaum's Outline Series, 2010, ISBN 978-0-07-162366-7.
3. Complex Variables, M.R. Spiegel, S. Lipschutz, J.J. Schiller, D. Spellman, Schaum's Outline Series, McGraw Hill (USA), 2009, ISBN 978-0-07-161569-3.
4. "The Exponent Rule for Derivatives". Math Vault. 2016-05-21. Retrieved 2019-07-25.