The triple correlation of an ordinary function f(x) on the real line is the integral of the product of that function with two independently shifted copies of itself:

\[ f_3(s_1, s_2) = \int_{-\infty}^{\infty} f^{*}(x)\, f(x + s_1)\, f(x + s_2)\, dx. \]
The Fourier transform of the triple correlation is the bispectrum. The triple correlation extends the concept of autocorrelation, which correlates a function with a single shifted copy of itself and thereby enhances its latent periodicities.
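This correspondence is easy to check numerically. The following NumPy sketch (illustrative, using the discrete, circular-shift analogue of the definition above) computes the triple correlation of a random signal directly and confirms that its two-dimensional DFT equals the bispectrum B[k1, k2] = X[k1] X[k2] X*[k1 + k2]:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32
x = rng.standard_normal(N)

# Discrete triple correlation with circular shifts:
# t[s1, s2] = sum_n conj(x[n]) * x[n + s1] * x[n + s2]
t = np.empty((N, N), dtype=complex)
for s1 in range(N):
    for s2 in range(N):
        t[s1, s2] = np.sum(np.conj(x) * np.roll(x, -s1) * np.roll(x, -s2))

# Bispectrum built directly from the DFT of the signal:
# B[k1, k2] = X[k1] * X[k2] * conj(X[k1 + k2])
X = np.fft.fft(x)
k = np.arange(N)
B = X[:, None] * X[None, :] * np.conj(X[(k[:, None] + k[None, :]) % N])

# The 2-D Fourier transform of the triple correlation is the bispectrum.
assert np.allclose(np.fft.fft2(t), B)
```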
The theory of the triple correlation was first investigated by statisticians examining the cumulant structure of non-Gaussian random processes. It was also studied independently by physicists as a tool for spectroscopy of laser beams. In 1963, Hideya Gamo described an apparatus for measuring the triple correlation of a laser beam, and also showed how phase information can be recovered from the real part of the bispectrum, up to sign reversal and linear offset. However, Gamo's method implicitly requires the Fourier transform to be nonzero at every frequency. This requirement was relaxed, and the class of functions known to be uniquely identified by their triple (and higher-order) correlations was considerably expanded, by the study of Yellott and Iverson (1992). Yellott and Iverson also pointed out the connection between triple correlations and the visual texture discrimination theory proposed by Béla Julesz.
Triple correlation methods are frequently used in signal processing for treating signals that are corrupted by additive white Gaussian noise; in particular, triple correlation techniques are suitable when multiple observations of the signal are available and the signal may be translating between the observations, e.g., a sequence of images of an object translating on a noisy background. What makes the triple correlation particularly useful for such tasks are three properties: (1) it is invariant under translation of the underlying signal; (2) it is unbiased in additive Gaussian noise; and (3) it retains nearly all of the relevant phase information in the underlying signal. Properties (1)–(3) of the triple correlation extend in many cases to functions on an arbitrary locally compact group, in particular to the groups of rotations and rigid motions of Euclidean space that arise in computer vision and signal processing.
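A minimal sketch of property (1), again in the discrete circular setting (the helper below is illustrative, not a library function): translating the signal leaves its triple correlation unchanged.

```python
import numpy as np

def triple_correlation(x):
    """Circular triple correlation: t[s1, s2] = sum_n conj(x[n]) x[n+s1] x[n+s2]."""
    N = x.size
    t = np.empty((N, N), dtype=complex)
    for s1 in range(N):
        a = np.conj(x) * np.roll(x, -s1)
        for s2 in range(N):
            t[s1, s2] = np.sum(a * np.roll(x, -s2))
    return t

rng = np.random.default_rng(1)
x = rng.standard_normal(24)

# Property (1): a translated copy of the signal has the same triple correlation,
# so the statistic can be pooled across observations that drift between frames.
assert np.allclose(triple_correlation(x), triple_correlation(np.roll(x, 7)))

# Property (2) means that averaging the triple correlations of noisy observations
# x + w, with w zero-mean Gaussian, converges to the triple correlation of x.
```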
The triple correlation may be defined for any locally compact group by using the group's left-invariant Haar measure. It is easily shown that the resulting object is invariant under left translation of the underlying function and unbiased in additive Gaussian noise. What is more interesting is the question of uniqueness: when two functions have the same triple correlation, how are the functions related? For many cases of practical interest, the triple correlation of a function on an abstract group uniquely identifies that function up to a single unknown group action. This uniqueness is a mathematical result that relies on the Pontryagin duality theorem, the Tannaka–Krein duality theorem, and related results of Iwahori–Sugiura and Tatsuuma. Algorithms exist for recovering bandlimited functions from their triple correlation on Euclidean space, as well as on the rotation groups in two and three dimensions. There is also an interesting link with Wiener's tauberian theorem: any function whose translates are dense in L1(G), where G is a locally compact Abelian group, is also uniquely identified by its triple correlation.
Autocorrelation, sometimes known as serial correlation in the discrete time case, is the correlation of a signal with a delayed copy of itself as a function of delay. Informally, it is the similarity between observations of a random variable as a function of the time lag between them. The analysis of autocorrelation is a mathematical tool for finding repeating patterns, such as the presence of a periodic signal obscured by noise, or identifying the missing fundamental frequency in a signal implied by its harmonic frequencies. It is often used in signal processing for analyzing functions or series of values, such as time domain signals.
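For instance, the following NumPy sketch (illustrative values) recovers the period of a sinusoid buried in noise from the location of a secondary autocorrelation peak:

```python
import numpy as np

rng = np.random.default_rng(2)
n = np.arange(500)
x = np.sin(2 * np.pi * n / 25) + rng.standard_normal(n.size)  # period 25, noisy

# Autocorrelation for non-negative lags; np.correlate's zero lag is at index x.size - 1.
acf = np.correlate(x, x, mode="full")[x.size - 1:]
acf /= acf[0]  # normalize so the zero-lag value is 1

# The hidden period shows up as a peak away from lag 0.
print(np.argmax(acf[10:100]) + 10)  # expect a lag near 25
```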
In mathematics, convolution is a mathematical operation on two functions that produces a third function. The term convolution refers both to the result function and to the process of computing it. It is defined as the integral of the product of the two functions after one is reflected about the y-axis and shifted. The integral is evaluated for all values of shift, producing the convolution function. The choice of which function is reflected and shifted does not change the result: convolution is commutative. Graphically, it expresses how the 'shape' of one function is modified by the other.
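A short NumPy sketch of both points (illustrative values): the result is the same whichever function is reflected and shifted, and a box-shaped kernel "modifies" a signal by smoothing it.

```python
import numpy as np

rng = np.random.default_rng(3)
f = rng.standard_normal(8)
g = rng.standard_normal(5)

# Commutativity: swapping which function is reflected and shifted changes nothing.
assert np.allclose(np.convolve(f, g), np.convolve(g, f))

# A moving-average filter as a convolution: the box kernel smooths the signal.
x = np.cumsum(rng.standard_normal(100))          # a rough random walk
smoothed = np.convolve(x, np.ones(5) / 5, mode="same")
```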
In mathematics, many sets of transformations form a group under function composition; for example, the rotations around a point in the plane. It is often useful to consider the group as an abstract group and to say that one has a group action of the abstract group, which consists of performing the transformations of the group of transformations. The reason for distinguishing the group from the transformations is that, generally, a group of transformations of a structure also acts on various related structures; for example, the above rotation group also acts on triangles by transforming triangles into triangles.
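The distinction can be made concrete with the plane rotation group (an illustrative NumPy sketch; the helper is ours): the same abstract group element acts on points and, by acting on each vertex, on triangles.

```python
import numpy as np

def rotation(theta):
    """A plane rotation, i.e. an element of SO(2), as a 2x2 matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

g = rotation(np.pi / 3)

# Action on points of the plane:
point = np.array([1.0, 0.0])
rotated_point = g @ point

# The induced action on a related structure: a triangle, given by its vertices.
triangle = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
rotated_triangle = triangle @ g.T

# Composition of transformations matches the abstract group operation:
assert np.allclose(rotation(0.3) @ rotation(0.4), rotation(0.7))
```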
In mathematical analysis, the Haar measure assigns an "invariant volume" to subsets of locally compact topological groups, consequently defining an integral for functions on those groups.
In mathematics, a topological group is a group that is also a topological space, with the two structures tied together by the requirement that the group operations be continuous; the group structure and the topology are therefore not independent of each other.
In signal processing, white noise is a random signal having equal intensity at different frequencies, giving it a constant power spectral density. The term is used with this or similar meanings in many scientific and technical disciplines, including physics, acoustical engineering, telecommunications, and statistical forecasting. White noise refers to a statistical model for signals and signal sources, rather than to any specific signal. White noise draws its name from white light, although light that appears white generally does not have a flat power spectral density over the visible band.
In signal processing, the power spectrum of a continuous time signal describes the distribution of power into frequency components composing that signal. According to Fourier analysis, any physical signal can be decomposed into a number of discrete frequencies, or a spectrum of frequencies over a continuous range. The statistical average of a signal, analyzed in terms of its frequency content, is called its spectrum.
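The flat spectral density of white noise can be checked numerically; the sketch below (illustrative parameters, including an assumed sampling rate) averages the periodograms of independent noise segments, which converge toward a constant level across frequency.

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 1000                      # assumed sampling rate, in Hz
n_seg, seg_len = 200, 1024

# Average the periodograms of many white-noise segments.
psd = np.zeros(seg_len // 2 + 1)
for _ in range(n_seg):
    x = rng.standard_normal(seg_len)
    psd += np.abs(np.fft.rfft(x)) ** 2 / (fs * seg_len)
psd /= n_seg

# A small relative spread means the estimated density is approximately flat.
print(psd.std() / psd.mean())
```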
In functional analysis and related areas of mathematics, Fréchet spaces, named after Maurice Fréchet, are special topological vector spaces. They are generalizations of Banach spaces. All Banach and Hilbert spaces are Fréchet spaces. Spaces of infinitely differentiable functions are typical examples of Fréchet spaces, many of which are not Banach spaces.
In mathematics, an amenable group is a locally compact topological group G carrying a kind of averaging operation on bounded functions that is invariant under translation by group elements. The original definition, in terms of a finitely additive measure on subsets of G, was introduced by John von Neumann in 1929 under the German name "messbar" in response to the Banach–Tarski paradox. In 1949 Mahlon M. Day introduced the English translation "amenable", apparently as a pun on "mean".
In physics, mathematics and statistics, scale invariance is a feature of objects or laws that do not change if scales of length, energy, or other variables are multiplied by a common factor, and thus represent a universality.
In signal processing, cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other. This is also known as a sliding dot product or sliding inner-product. It is commonly used for searching a long signal for a shorter, known feature. It has applications in pattern recognition, single particle analysis, electron tomography, averaging, cryptanalysis, and neurophysiology. The cross-correlation is similar in nature to the convolution of two functions. In an autocorrelation, which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero, and its size will be the signal energy.
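The "sliding dot product" view makes the feature-search use direct to implement; this NumPy sketch (illustrative values) locates a short known feature inside a long noisy signal.

```python
import numpy as np

rng = np.random.default_rng(5)
feature = np.array([1.0, 2.0, 3.0, 2.0, 1.0])

# Bury the known feature at position 120 of a long noisy signal.
signal = 0.3 * rng.standard_normal(400)
signal[120:120 + feature.size] += feature

# Slide the feature along the signal and pick the displacement
# where the dot product (the match score) is largest.
scores = np.correlate(signal, feature, mode="valid")
print(np.argmax(scores))  # expect 120
```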
In mathematics, the spectrum of a C*-algebra A, also called the dual of A and denoted Â, is the set of unitary equivalence classes of irreducible *-representations of A. A *-representation π of A on a Hilbert space H is irreducible if, and only if, there is no closed subspace K different from H and {0} which is invariant under all operators π(x) with x ∈ A. Here, irreducible representation implicitly means non-null irreducible representation, thus excluding trivial (i.e. identically 0) representations on one-dimensional spaces. The spectrum Â is also naturally a topological space; this is similar to the notion of the spectrum of a ring.
Phase correlation is an approach to estimate the relative translational offset between two similar images or other data sets. It is commonly used in image registration and relies on a frequency-domain representation of the data, usually calculated by fast Fourier transforms. The term is applied particularly to a subset of cross-correlation techniques that isolate the phase information from the Fourier-space representation of the cross-correlogram.
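A minimal NumPy sketch of the idea (illustrative values): normalizing the cross-power spectrum discards magnitude and keeps only the phase difference, whose inverse transform peaks at the relative shift.

```python
import numpy as np

rng = np.random.default_rng(6)
a = rng.standard_normal((64, 64))
b = np.roll(a, shift=(5, 12), axis=(0, 1))   # b is a translated copy of a

# Normalized cross-power spectrum: keep only the phase difference.
A, B = np.fft.fft2(a), np.fft.fft2(b)
R = np.conj(A) * B
R /= np.abs(R)

# Its inverse transform peaks at the relative translation.
corr = np.fft.ifft2(R).real
print(np.unravel_index(np.argmax(corr), corr.shape))  # expect (5, 12)
```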
In mathematics, in the area of statistical analysis, the bispectrum is a statistic used to search for nonlinear interactions.
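The classic example is quadratic phase coupling. In the sketch below (illustrative, with a segment-averaged estimator of our own naming), three sinusoids at bins k1, k2 and k1 + k2 produce a large bispectrum value at (k1, k2) only when the third component's phase is the sum of the other two; with independent random phases the average cancels.

```python
import numpy as np

rng = np.random.default_rng(7)
N, n_seg = 256, 100
k1, k2 = 20, 45   # a third component sits at bin k1 + k2

def segment_fft(coupled):
    n = np.arange(N)
    p1, p2 = rng.uniform(0, 2 * np.pi, 2)
    p3 = p1 + p2 if coupled else rng.uniform(0, 2 * np.pi)
    x = (np.cos(2 * np.pi * k1 * n / N + p1)
         + np.cos(2 * np.pi * k2 * n / N + p2)
         + np.cos(2 * np.pi * (k1 + k2) * n / N + p3)
         + 0.5 * rng.standard_normal(N))
    return np.fft.fft(x)

def bispectrum_at(coupled):
    # Average X[k1] X[k2] conj(X[k1 + k2]) over segments: random phases
    # cancel, while a genuine nonlinear (phase-coupled) interaction survives.
    acc = 0j
    for _ in range(n_seg):
        X = segment_fft(coupled)
        acc += X[k1] * X[k2] * np.conj(X[k1 + k2])
    return abs(acc / n_seg)

print(bispectrum_at(coupled=True))    # large: quadratic phase coupling
print(bispectrum_at(coupled=False))   # much smaller: random phases cancel
```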
A cyclostationary process is a signal having statistical properties that vary cyclically with time. A cyclostationary process can be viewed as multiple interleaved stationary processes. For example, the maximum daily temperature in New York City can be modeled as a cyclostationary process: the maximum temperature on July 21 is statistically different from the temperature on December 20; however, it is a reasonable approximation that the temperature on December 20 of different years has identical statistics. Thus, we can view the random process composed of daily maximum temperatures as 365 interleaved stationary processes, each of which takes on a new value once per year.
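The interleaving view translates directly into a simulation; the sketch below (illustrative numbers) generates years of "daily maximum temperatures" with a seasonal mean, then inspects fixed calendar days across years.

```python
import numpy as np

rng = np.random.default_rng(8)
years, days = 50, 365

# Seasonal mean plus noise: one cyclostationary process, or equivalently
# 365 interleaved stationary processes (one per calendar day).
day = np.arange(days)
seasonal_mean = 15 + 10 * np.sin(2 * np.pi * (day - 80) / days)
temps = seasonal_mean + 3 * rng.standard_normal((years, days))

# Statistics differ between days of the year (late July vs. late December)...
print(temps[:, 201].mean(), temps[:, 353].mean())

# ...but a fixed day, viewed across years, has stable (stationary) statistics.
print(temps[:25, 353].mean(), temps[25:, 353].mean())
```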
In applied mathematics, the Wiener–Khinchin theorem or Wiener–Khintchine theorem, also known as the Wiener–Khinchin–Einstein theorem or the Khinchin–Kolmogorov theorem, states that the autocorrelation function of a wide-sense-stationary random process has a spectral decomposition given by the power spectral density of that process.
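In the discrete, deterministic setting the relation reduces to an exact DFT identity, easy to verify (a NumPy sketch with illustrative values):

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.standard_normal(128)
N = x.size

# Circular autocorrelation r[s] = sum_n x[n] x[n + s] ...
r = np.array([np.dot(x, np.roll(x, -s)) for s in range(N)])

# ... whose DFT is the power spectrum |X[k]|^2: the discrete,
# deterministic analogue of the Wiener-Khinchin relation.
X = np.fft.fft(x)
assert np.allclose(np.fft.fft(r), np.abs(X) ** 2)
```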
In mathematics, the Cameron–Martin theorem or Cameron–Martin formula is a theorem of measure theory that describes how abstract Wiener measure changes under translation by certain elements of the Cameron–Martin Hilbert space.
An infinite-dimensional Lebesgue measure is a measure defined on an infinite-dimensional Banach space, which shares certain properties with the Lebesgue measure defined on finite-dimensional spaces.
In probability and statistics, the Tweedie distributions are a family of probability distributions which include the purely continuous normal, gamma and inverse Gaussian distributions, the purely discrete scaled Poisson distribution, and the class of compound Poisson–gamma distributions which have positive mass at zero, but are otherwise continuous. Tweedie distributions are a special case of exponential dispersion models and are often used as distributions for generalized linear models.
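The compound Poisson–gamma case can be sampled directly, which makes the point mass at zero visible (an illustrative NumPy sketch; the function and parameter names are ours):

```python
import numpy as np

rng = np.random.default_rng(10)

def compound_poisson_gamma(lam, shape, scale, size):
    # A Poisson number of gamma summands; zero summands give an exact zero,
    # so the distribution mixes a point mass at zero with a continuous part.
    counts = rng.poisson(lam, size)
    return np.array([rng.gamma(shape, scale, k).sum() if k else 0.0
                     for k in counts])

y = compound_poisson_gamma(lam=1.2, shape=2.0, scale=0.5, size=100_000)
print((y == 0).mean())   # ~ exp(-1.2), the positive mass at zero
print(y.mean())          # ~ lam * shape * scale = 1.2
```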