Difference polynomials

In mathematics, in the area of complex analysis, the general difference polynomials are a polynomial sequence, a certain subclass of the Sheffer polynomials, which include the Newton polynomials, Selberg's polynomials, and the Stirling interpolation polynomials as special cases.

Definition

The general difference polynomial sequence is given by

$$p_n(z) = \frac{z}{n} \binom{z - \beta n - 1}{n-1}$$

where $\binom{z}{n}$ is the (generalized) binomial coefficient. For $\beta = 0$, the generated polynomials are the Newton polynomials

$$p_n(z) = \binom{z}{n} = \frac{z(z-1)\cdots(z-n+1)}{n!}.$$

The case $\beta = 1$ generates Selberg's polynomials, and the case $\beta = -\tfrac{1}{2}$ generates Stirling's interpolation polynomials.
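The definition above can be checked numerically. The sketch below (function names are my own) evaluates $p_n(z)$ directly from the falling-factorial form of the generalized binomial coefficient, and confirms that $\beta = 0$ reduces to the ordinary binomial coefficient at integer arguments.

```python
from math import comb, factorial

def falling_factorial(z, n):
    """z (z - 1) ... (z - n + 1); the empty product 1 for n = 0."""
    result = 1.0
    for k in range(n):
        result *= z - k
    return result

def diff_polynomial(z, n, beta):
    """General difference polynomial p_n(z) = (z/n) * C(z - beta*n - 1, n - 1)."""
    if n == 0:
        return 1.0
    return (z / n) * falling_factorial(z - beta * n - 1, n - 1) / factorial(n - 1)

def newton_polynomial(z, n):
    """The beta = 0 case: the generalized binomial coefficient C(z, n)."""
    return falling_factorial(z, n) / factorial(n)

# beta = 0 reduces p_n(z) to C(z, n); for integer z this matches math.comb.
print(diff_polynomial(7, 3, 0), newton_polynomial(7, 3), comb(7, 3))  # 35.0 35.0 35
```

Note that $z$ need not be an integer: the falling factorial is a polynomial in $z$, which is what makes these objects a polynomial sequence rather than a table of integers.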

Moving differences

Given an analytic function $f(z)$, define the moving difference of $f$ as

$$\mathcal{L}_n(f) = \Delta^n f(\beta n)$$

where $\Delta$ is the forward difference operator. Then, provided that $f$ obeys certain summability conditions, it may be represented in terms of these polynomials as

$$f(z) = \sum_{n=0}^\infty p_n(z)\, \mathcal{L}_n(f).$$

The conditions for summability (that is, convergence) of this series form a fairly complex topic; in general, a necessary condition is that the analytic function be of less than exponential type. Summability conditions are discussed in detail in Boas & Buck.
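For the $\beta = 0$ (Newton) case, the moving differences reduce to the iterated forward differences $\Delta^n f(0)$, and the expansion is the classical Newton forward-difference series, which terminates exactly for polynomial $f$. A minimal sketch of that special case (helper names are my own):

```python
def forward_differences(values):
    """Iterated forward differences Delta^n f(0) from samples f(0), f(1), ..."""
    diffs = [values[0]]
    row = list(values)
    for _ in range(len(values) - 1):
        row = [row[i + 1] - row[i] for i in range(len(row) - 1)]
        diffs.append(row[0])
    return diffs

def newton_series(z, diffs):
    """f(z) = sum_n Delta^n f(0) * C(z, n) -- exact when f is a polynomial."""
    total, term = 0.0, 1.0          # term holds C(z, n), starting at C(z, 0) = 1
    for n, d in enumerate(diffs):
        total += d * term
        term *= (z - n) / (n + 1)   # update C(z, n) -> C(z, n + 1)
    return total

f = lambda z: z**3 - 2*z + 1        # a cubic: differences vanish past n = 3
diffs = forward_differences([f(k) for k in range(5)])
print(newton_series(2.5, diffs), f(2.5))  # 11.625 11.625
```

The series recovers $f$ at non-integer points from integer samples alone; for non-polynomial $f$ this is where the exponential-type growth condition mentioned above becomes essential.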

Generating function

The generating function for the general difference polynomials is given by

$$e^{zt} = \sum_{n=0}^\infty p_n(z) \left[\left(e^t - 1\right) e^{\beta t}\right]^n.$$

This generating function can be brought into the form of the generalized Appell representation

$$K(z, w) = A(w) \Psi\!\left(z g(w)\right) = \sum_{n=0}^\infty p_n(z) w^n$$

by setting $A(w) = 1$, $\Psi(x) = e^x$, $g(w) = t$, and $w = \left(e^t - 1\right) e^{\beta t}$.


References

Boas, Ralph P.; Buck, R. Creighton (1964). Polynomial Expansions of Analytic Functions. Berlin: Springer-Verlag.