In mathematics and statistical analysis, bicoherence (also known as bispectral coherency) is a squared normalized version of the bispectrum. The bicoherence takes values between 0 and 1, which makes it a convenient measure for quantifying the extent of phase coupling in a signal. The prefix bi- in bispectrum and bicoherence refers not to two time series x_t, y_t but rather to two frequencies of a single signal.
The bispectrum is a statistic used to search for nonlinear interactions. The Fourier transform of the second-order cumulant, i.e. the autocorrelation function, is the traditional power spectrum. The Fourier transform of the third-order cumulant C3(t1, t2) is called the bispectrum or bispectral density. Bispectra fall in the category of higher-order spectra, or polyspectra, and provide supplementary information to the power spectrum. The third-order polyspectrum (the bispectrum) is the easiest to compute, and hence the most popular.
The difference from measuring coherence (coherence analysis is a widely used method for studying correlations in the frequency domain between two simultaneously measured signals) is that coherence requires both input and output measurements, estimating two auto-spectra and one cross-spectrum. Bicoherence, by contrast, is an auto-quantity, i.e. it can be computed from a single signal. The coherence function provides a quantification of deviations from linearity in the system which lies between the input and output measurement sensors. The bicoherence measures the proportion of the signal energy at any bifrequency that is quadratically phase coupled. It is usually normalized to a range similar to the correlation coefficient and classical (second-order) coherence. It has also been used for assessing depth of anaesthesia, widely in plasma physics (nonlinear energy transfer), and for the detection of gravitational waves.
Bispectrum and bicoherence may be applied to the case of non-linear interactions of a continuous spectrum of propagating waves in one dimension.[1]
Bicoherence measurements have been carried out for EEG signal monitoring in sleep, wakefulness and seizures.[citation needed]
The bispectrum is defined as the triple product

B(f1, f2) = X(f1) X(f2) X*(f1 + f2),

where B(f1, f2) is the bispectrum evaluated at frequencies f1 and f2, X(f) is the Fourier transform of the signal, and * denotes the complex conjugate. The Fourier transform is a complex quantity, and so is the bispectrum. From complex multiplication, the magnitude of the bispectrum is equal to the product of the magnitudes of each of the frequency components, and the phase of the bispectrum is the sum of the phases of each of the frequency components.
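As a minimal sketch of this definition (using NumPy, with FFT bin indices standing in for continuous frequencies; the signal and bin numbers are illustrative), the triple product can be computed as:

```python
import numpy as np

def bispectrum(x, f1, f2):
    # Triple product B(f1, f2) = X(f1) X(f2) X*(f1 + f2),
    # with f1 and f2 given as FFT bin indices of one segment.
    X = np.fft.fft(x)
    return X[f1] * X[f2] * np.conj(X[f1 + f2])

# Signal with components at bins 10, 20 and 30 = 10 + 20,
# all with zero phase (i.e. perfectly phase coupled):
n = 256
t = np.arange(n)
x = (np.cos(2 * np.pi * 10 * t / n)
     + np.cos(2 * np.pi * 20 * t / n)
     + np.cos(2 * np.pi * 30 * t / n))
B = bispectrum(x, 10, 20)
# |B| is the product of the three component magnitudes, and the
# phase of B is the sum of the component phases (here zero, so
# B is real and positive).
```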
Suppose that the three Fourier components X(f1), X(f2) and X(f1 + f2) were perfectly phase locked. Then if the Fourier transform were calculated several times from different parts of the time series, the bispectrum would always have the same value, and adding all of the bispectra together, they would sum without cancelling. On the other hand, suppose that the phases of each of these frequencies were random. The bispectrum would then have the same magnitude (assuming that the magnitudes of the frequency components are the same), but its phase would be randomly oriented. Adding the bispectra together would then largely cancel, because of the random phase orientation, and so the sum of the bispectra would have a small magnitude. Detecting phase coupling therefore requires summation over a number of independent samples; this is the first motivation for defining the bicoherence. Secondly, the bispectrum is not normalized, because it still depends on the magnitudes of each of the frequency components. The bicoherence includes a normalization factor that removes this magnitude dependence.
There is some inconsistency in the definition of the bicoherence normalization constant. Some of the definitions that have been used are

b(f1, f2) = |B(f1, f2)| / (|X(f1) X(f2)| |X(f1 + f2)|),

which was provided in Sigl and Chamoun 1994 but does not appear to be correctly normalized. Alternatively, plasma physics typically uses

b^2(f1, f2) = |⟨X(f1) X(f2) X*(f1 + f2)⟩|^2 / (⟨|X(f1) X(f2)|^2⟩ ⟨|X(f1 + f2)|^2⟩),

where the angle brackets denote averaging over segments. Note that this is the same as using a sum, because the number of averaged segments is the same in the numerator and the denominator. This definition is taken directly from Nagashima 2006, and is also referred to in He 2009 and Maccarone 2005.
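A short sketch of this averaged estimator (assuming NumPy, FFT bin indices, and synthetic test signals whose frequencies and phases are chosen purely for illustration):

```python
import numpy as np

def bicoherence_sq(segments, f1, f2):
    # Squared bicoherence with the plasma-physics normalization:
    # |<X1 X2 X3*>|^2 / (<|X1 X2|^2> <|X3|^2>), where X3 is the
    # component at bin f1 + f2 and <.> averages over segments.
    X = np.fft.fft(segments, axis=-1)
    X1, X2, X3 = X[:, f1], X[:, f2], X[:, f1 + f2]
    num = np.abs(np.mean(X1 * X2 * np.conj(X3))) ** 2
    den = np.mean(np.abs(X1 * X2) ** 2) * np.mean(np.abs(X3) ** 2)
    return num / den

rng = np.random.default_rng(0)
n, nseg = 256, 200
t = np.arange(n)
p1 = rng.uniform(0, 2 * np.pi, (nseg, 1))
p2 = rng.uniform(0, 2 * np.pi, (nseg, 1))
p3 = rng.uniform(0, 2 * np.pi, (nseg, 1))

# Coupled: in every segment the phase at bin 30 is the sum of
# the (random) phases at bins 10 and 20, giving b^2 near 1.
coupled = (np.cos(2 * np.pi * 10 * t / n + p1)
           + np.cos(2 * np.pi * 20 * t / n + p2)
           + np.cos(2 * np.pi * 30 * t / n + p1 + p2))

# Uncoupled: the phase at bin 30 is independent, so the
# averaged triple products cancel and b^2 is near 0.
uncoupled = (np.cos(2 * np.pi * 10 * t / n + p1)
             + np.cos(2 * np.pi * 20 * t / n + p2)
             + np.cos(2 * np.pi * 30 * t / n + p3))
```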
Finally, one of the most intuitive definitions comes from Hagihira 2001 and Hayashi 2007, which is

b(f1, f2) = |Σ B(f1, f2)| / Σ |B(f1, f2)|,

where the sums run over all of the time series segments. The numerator contains the magnitude of the bispectrum summed over all of the segments; this quantity is large if there is phase coupling, and approaches 0 in the limit of random phases. The denominator, which normalizes the bispectrum, is given by calculating the bispectrum after setting all of the phases to 0; this corresponds to the case of perfect phase coupling, because all of the samples then have zero phase. Therefore, the bicoherence takes a value between 0 (random phases) and 1 (total phase coupling).
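This normalization can be sketched directly on per-segment bispectrum values (the complex numbers below are synthetic, standing in for bispectra computed from real segments):

```python
import numpy as np

def bicoherence_sum(bispectra):
    # Hagihira-style normalization: |sum of B_i| / sum of |B_i|
    # over segments i. By the triangle inequality this ratio
    # always lies between 0 and 1.
    bispectra = np.asarray(bispectra)
    return np.abs(bispectra.sum()) / np.abs(bispectra).sum()

rng = np.random.default_rng(1)

# Perfect phase coupling: every segment's bispectrum has the
# same phase, so the values add without cancelling (ratio = 1).
locked = 5.0 * np.exp(1j * 0.7) * np.ones(100)

# Random phases: the segment bispectra point in random
# directions and largely cancel in the sum (ratio near 0).
random_phase = 5.0 * np.exp(1j * rng.uniform(0, 2 * np.pi, 100))
```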
Among the three normalizations above, the second one can be interpreted as a correlation coefficient defined between the energy-supplying and energy-receiving parties in a second-order nonlinear interaction, whereas the bispectrum has been proven to be the corresponding covariance.[2] Therefore, just as correlation cannot sufficiently demonstrate the presence of causality, a significant bicoherence peak alone cannot substantiate the existence of a nonlinear interaction.