Markov brothers' inequality

In mathematics, the Markov brothers' inequality is an inequality proved in the 1890s by the brothers Andrey Markov and Vladimir Markov, two Russian mathematicians. It bounds the maximum of the derivatives of a polynomial on an interval in terms of the maximum of the polynomial itself. [1] For k = 1 it was proved by Andrey Markov, [2] and for k = 2, 3, ... by his brother Vladimir Markov. [3]

The statement

Let P be a polynomial of degree ≤ n. Then for all nonnegative integers k,

$$\max_{-1 \le x \le 1} \bigl|P^{(k)}(x)\bigr| \;\le\; \frac{n^2 (n^2 - 1^2)(n^2 - 2^2) \cdots (n^2 - (k-1)^2)}{1 \cdot 3 \cdot 5 \cdots (2k - 1)} \, \max_{-1 \le x \le 1} |P(x)|.$$

This inequality is tight, as equality is attained for Chebyshev polynomials of the first kind.
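
The extremal role of the Chebyshev polynomials can be checked numerically. The following Python sketch (an illustration added for concreteness, assuming only NumPy and its numpy.polynomial.chebyshev module) evaluates both sides of the inequality for T_n, whose k-th derivative attains the bound at the endpoints of [−1, 1].

    # Numerical check of the Markov brothers' inequality for the Chebyshev
    # polynomial T_n; requires NumPy. The k-th derivative attains the bound
    # at x = +/-1, so the two printed values should agree (up to rounding).
    import numpy as np
    from numpy.polynomial.chebyshev import Chebyshev

    def markov_constant(n, k):
        """The constant prod_{j=0}^{k-1} (n^2 - j^2) / (2j + 1) from the inequality."""
        c = 1.0
        for j in range(k):
            c *= (n**2 - j**2) / (2 * j + 1)
        return c

    n, k = 6, 2
    T = Chebyshev.basis(n)                # T_n on the default domain [-1, 1]
    xs = np.linspace(-1.0, 1.0, 10001)
    lhs = np.max(np.abs(T.deriv(k)(xs)))  # max of |T_n^{(k)}| on [-1, 1]
    rhs = markov_constant(n, k) * np.max(np.abs(T(xs)))
    print(lhs, rhs)                       # both are 420 for n = 6, k = 2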

Applications

Markov's inequality is used to obtain lower bounds in computational complexity theory via the so-called "Polynomial Method".
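
To illustrate how the inequality enters such arguments, here is a sketch (not taken from the sources cited here, and with a boundedness assumption simply taken for granted) of the Nisan–Szegedy lower bound on the approximate degree of the OR function. Suppose a real polynomial q of degree d satisfies |q(x)| ≤ 1 for all x in [0, n] and q(1) − q(0) ≥ 1/3, as happens after symmetrizing a polynomial that approximates OR on n bits. By the mean value theorem, |q'(ξ)| ≥ 1/3 for some ξ in (0, 1). Applying the case k = 1 of the inequality to P(t) = q(n(t + 1)/2) on [−1, 1], and using P'(t) = (n/2) q'(n(t + 1)/2), gives

$$\max_{0 \le x \le n} |q'(x)| \;\le\; \frac{2 d^2}{n} \max_{0 \le x \le n} |q(x)| \;\le\; \frac{2 d^2}{n},$$

so 2d²/n ≥ 1/3 and hence d ≥ √(n/6). Controlling |q| on all of [0, n], rather than only at the integer points where the approximation is specified, is the part of the full argument omitted from this sketch.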

Related Research Articles

The fundamental theorem of algebra, also called d'Alembert's theorem or the d'Alembert–Gauss theorem, states that every non-constant single-variable polynomial with complex coefficients has at least one complex root. This includes polynomials with real coefficients, since every real number is a complex number with its imaginary part equal to zero.

In probability theory, Chebyshev's inequality provides an upper bound on the probability of deviation of a random variable from its mean. More specifically, the probability that a random variable deviates from its mean by more than kσ is at most 1/k², where k is any positive constant and σ is the standard deviation.

In the mathematical field of numerical analysis, Runge's phenomenon is a problem of oscillation at the edges of an interval that occurs when using polynomial interpolation with polynomials of high degree over a set of equispaced interpolation points. It was discovered by Carl David Tolmé Runge (1901) when exploring the behavior of errors when using polynomial interpolation to approximate certain functions. The discovery shows that going to higher degrees does not always improve accuracy. The phenomenon is similar to the Gibbs phenomenon in Fourier series approximations.

In probability theory, Markov's inequality gives an upper bound on the probability that a non-negative random variable is greater than or equal to some positive constant. Markov's inequality is tight in the sense that for each chosen positive constant, there exists a random variable such that the inequality is in fact an equality.

In the mathematical field of numerical analysis, a Bernstein polynomial is a polynomial expressed as a linear combination of Bernstein basis polynomials. The idea is named after mathematician Sergei Natanovich Bernstein.

In probability theory, a Chernoff bound is an exponentially decreasing upper bound on the tail of a random variable based on its moment generating function. The minimum of all such exponential bounds forms the Chernoff or Chernoff-Cramér bound, which may decay faster than exponential. It is especially useful for sums of independent random variables, such as sums of Bernoulli random variables.

In probability theory, a pairwise independent collection of random variables is a set of random variables any two of which are independent. Any collection of mutually independent random variables is pairwise independent, but some pairwise independent collections are not mutually independent. Pairwise independent random variables with finite variance are uncorrelated.

In mathematics, the Lebesgue constants (depending on a set of nodes and on its size) give an idea of how good the interpolant of a function (at the given nodes) is in comparison with the best polynomial approximation of the function (the degree of the polynomials being fixed). The Lebesgue constant for polynomials of degree at most n and for the set of n + 1 nodes T is generally denoted by Λn(T). These constants are named after Henri Lebesgue.

In additive combinatorics, a discipline within mathematics, Freiman's theorem is a central result which indicates the approximate structure of sets whose sumset is small. It roughly states that if |A + A|/|A| is small, then A can be contained in a small generalized arithmetic progression.

A height function is a function that quantifies the complexity of mathematical objects. In Diophantine geometry, height functions quantify the size of solutions to Diophantine equations and are typically functions from a set of points on algebraic varieties to the real numbers.

In probability theory, Bernstein inequalities give bounds on the probability that the sum of random variables deviates from its mean. In the simplest case, let X1, ..., Xn be independent Bernoulli random variables taking values +1 and −1 with probability 1/2; then for every positive ε,

$$\mathbb{P}\left( \left| \frac{1}{n} \sum_{i=1}^{n} X_i \right| > \varepsilon \right) \le 2 \exp\left( - \frac{n \varepsilon^2}{2 (1 + \varepsilon / 3)} \right).$$

In mathematics, a univariate polynomial of degree n with real or complex coefficients has n complex roots, if counted with their multiplicities. They form a multiset of n points in the complex plane. This article concerns the geometry of these points, that is, the information about their localization in the complex plane that can be deduced from the degree and the coefficients of the polynomial.

In mathematical analysis, the Chebyshev–Markov–Stieltjes inequalities are inequalities related to the problem of moments that were formulated in the 1880s by Pafnuty Chebyshev and proved independently by Andrey Markov and by Thomas Jan Stieltjes. Informally, they provide sharp bounds on a measure from above and from below in terms of its first moments.

In mathematics, the Grothendieck inequality states that there is a universal constant K_G with the following property. If (M_ij) is an n × n (real or complex) matrix such that

$$\left| \sum_{i,j} M_{ij} s_i t_j \right| \le 1$$

for all numbers s_i, t_j of absolute value at most 1, then

$$\left| \sum_{i,j} M_{ij} \langle S_i, T_j \rangle \right| \le K_G$$

for all vectors S_i, T_j in the unit ball of a Hilbert space, the constant K_G being independent of n.

In mathematics, the Remez inequality, discovered by the Soviet mathematician Evgeny Yakovlevich Remez, gives a bound on the sup norms of certain polynomials, the bound being attained by the Chebyshev polynomials.

Anatoly Alexeyevich Karatsuba (1937–2008) was a Russian mathematician working in analytic number theory, p-adic numbers and Dirichlet series.

In mathematics, Welch bounds are a family of inequalities pertinent to the problem of evenly spreading a set of unit vectors in a vector space. The bounds are important tools in the design and analysis of certain methods in telecommunication engineering, particularly in coding theory. The bounds were originally published in a 1974 paper by L. R. Welch.

In probability theory, concentration inequalities provide mathematical bounds on the probability of a random variable deviating from some value.

In quantum information processing, a quantum refereed game is a class of games in the general theory of quantum games. It is played between two players, Alice and Bob, and arbitrated by a referee. The referee outputs the pay-off for the players after interacting with them for a fixed number of rounds, while exchanging quantum information.

In mathematics, Bernstein's theorem is an inequality relating the maximum modulus of a complex polynomial function on the unit disk with the maximum modulus of its derivative on the unit disk. It was proven by Sergei Bernstein while he was working on approximation theory.

References

  1. Achiezer, N. I. (1992). Theory of Approximation. New York: Dover Publications, Inc.
  2. Markov, A. A. (1890). "On a question by D. I. Mendeleev". Zap. Imp. Akad. Nauk St. Petersburg. 62: 1–24.
  3. Markov, V. A. (1892). О функциях, наименее уклоняющихся от нуля в данном промежутке [On Functions of Least Deviation from Zero in a Given Interval]. Appeared in German, with a foreword by Sergei Bernstein, as Markov, V. A. (1916). "Über Polynome, die in einem gegebenen Intervalle möglichst wenig von Null abweichen". Math. Ann. 77 (2): 213–258. doi:10.1007/bf01456902. S2CID 122406663.