# White noise


In signal processing, white noise is a random signal having equal intensity at different frequencies, giving it a constant power spectral density. The term is used, with this or similar meanings, in many scientific and technical disciplines, including physics, acoustical engineering, telecommunications, and statistical forecasting. White noise refers to a statistical model for signals and signal sources, rather than to any specific signal. White noise draws its name from white light, although light that appears white generally does not have a flat power spectral density over the visible band.


In discrete time, white noise is a discrete signal whose samples are regarded as a sequence of serially uncorrelated random variables with zero mean and finite variance; a single realization of white noise is a random shock. Depending on the context, one may also require that the samples be independent and have identical probability distribution (in other words, independent and identically distributed random variables are the simplest representation of white noise). In particular, if each sample has a normal distribution with zero mean, the signal is said to be additive white Gaussian noise.
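This discrete-time definition can be checked numerically. The sketch below (plain Python standard library; the function names are illustrative) draws i.i.d. zero-mean Gaussian samples and estimates the mean and the autocorrelation, which for white noise should be (approximately) zero everywhere except at lag 0, where it equals the variance:

```python
import random
import statistics

def white_noise(n, sigma=1.0, seed=0):
    """Draw n i.i.d. zero-mean Gaussian samples (additive white Gaussian noise)."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, sigma) for _ in range(n)]

def autocorrelation(x, lag):
    """Biased sample autocorrelation R[lag] = (1/n) * sum_k x[k + lag] * x[k]."""
    n = len(x)
    return sum(x[k + lag] * x[k] for k in range(n - lag)) / n

w = white_noise(100_000)
# statistics.mean(w) is near 0, autocorrelation(w, 0) is near sigma^2 = 1,
# and autocorrelation(w, lag) is near 0 for any lag != 0.
```

With 100,000 samples the sample mean has a standard error of about 0.003, so the empirical estimates sit very close to the theoretical values.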


The samples of a white noise signal may be sequential in time, or arranged along one or more spatial dimensions. In digital image processing, the pixels of a white noise image are typically arranged in a rectangular grid, and are assumed to be independent random variables with uniform probability distribution over some interval. The concept can be defined also for signals spread over more complicated domains, such as a sphere or a torus.


An infinite-bandwidth white noise signal is a purely theoretical construction. The bandwidth of white noise is limited in practice by the mechanism of noise generation, by the transmission medium and by finite observation capabilities. Thus, random signals are considered "white noise" if they are observed to have a flat spectrum over the range of frequencies that are relevant to the context. For an audio signal, the relevant range is the band of audible sound frequencies (between 20 and 20,000 Hz). Such a signal is heard by the human ear as a hissing sound, resembling the /sh/ sound in "ash". In music and acoustics, the term "white noise" may be used for any signal that has a similar hissing sound.


The term white noise is sometimes used in the context of phylogenetically based statistical methods to refer to a lack of phylogenetic pattern in comparative data.  It is sometimes used analogously in nontechnical contexts to mean "random talk without meaningful contents".  


## Statistical properties

*Figure: Spectrogram of pink noise (left) and white noise (right), shown with linear frequency axis (vertical) versus time axis (horizontal).*

Any distribution of values is possible (although it must have zero DC component). Even a binary signal which can only take on the values 1 or −1 will be white if the sequence is statistically uncorrelated. Noise having a continuous distribution, such as a normal distribution, can of course be white.
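The binary case is easy to demonstrate: a minimal sketch (standard-library Python; names are illustrative) builds an uncorrelated ±1 sequence and checks that its sample autocorrelation is a delta at lag 0, exactly as for Gaussian white noise:

```python
import random

def binary_white_noise(n, seed=1):
    """An uncorrelated ±1 (Rademacher) sequence: a binary white noise signal."""
    rng = random.Random(seed)
    return [rng.choice((-1, 1)) for _ in range(n)]

def sample_autocorr(x, lag):
    """Biased sample autocorrelation at the given lag."""
    n = len(x)
    return sum(x[k + lag] * x[k] for k in range(n - lag)) / n

b = binary_white_noise(50_000)
# sample_autocorr(b, 0) is exactly 1.0 (every sample squared is 1);
# nonzero lags give values near 0.
```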

It is often incorrectly assumed that Gaussian noise (i.e., noise with a Gaussian amplitude distribution; see normal distribution) necessarily refers to white noise, yet neither property implies the other. Gaussianity refers to the probability distribution with respect to the value, in this context the probability of the signal falling within any particular range of amplitudes, while the term 'white' refers to the way the signal power is distributed (i.e., independently) over time or among frequencies.

White noise is the generalized mean-square derivative of the Wiener process or Brownian motion.

A generalization to random elements on infinite dimensional spaces, such as random fields, is the white noise measure.

## Practical applications

### Music

White noise is commonly used in the production of electronic music, usually either directly or as an input for a filter to create other types of noise signal. It is used extensively in audio synthesis, typically to recreate percussive instruments such as cymbals or snare drums which have high noise content in their frequency domain. A simple example of white noise is a nonexistent radio station (static).

### Electronics engineering

White noise is also used to obtain the impulse response of an electrical circuit, in particular of amplifiers and other audio equipment. It is not used for testing loudspeakers as its spectrum contains too great an amount of high frequency content. Pink noise, which differs from white noise in that it has equal energy in each octave, is used for testing transducers such as loudspeakers and microphones.

### Acoustics

To set up the equalization for a concert or other performance in a venue, a short burst of white or pink noise is sent through the PA system and monitored from various points in the venue so that the engineer can tell if the acoustics of the building naturally boost or cut any frequencies. The engineer can then adjust the overall equalization to ensure a balanced mix.

### Computing

White noise is used as the basis of some random number generators. For example, Random.org uses a system of atmospheric antennae to generate random digit patterns from white noise.

### Tinnitus treatment

White noise is a common synthetic noise source used for sound masking by a tinnitus masker.  White noise machines and other white noise sources are sold as privacy enhancers and sleep aids and to mask tinnitus.  Alternatively, the use of an FM radio tuned to unused frequencies ("static") is a simpler and more cost-effective source of white noise.  However, white noise generated from a common commercial radio receiver tuned to an unused frequency is extremely vulnerable to being contaminated with spurious signals, such as adjacent radio stations, harmonics from non-adjacent radio stations, electrical equipment in the vicinity of the receiving antenna causing interference, or even atmospheric events such as solar flares and especially lightning.

### Work environment

The effects of white noise upon cognitive function are mixed. Recently, a small study found that white noise background stimulation improves cognitive functioning among secondary students with attention deficit hyperactivity disorder (ADHD), while decreasing performance of non-ADHD students.   Other work indicates it is effective in improving the mood and performance of workers by masking background office noise,  but decreases cognitive performance in complex card sorting tasks. 

Similarly, an experiment was carried out on sixty-six healthy participants to observe the benefits of using white noise in a learning environment. The experiment involved the participants identifying different images while hearing different sounds in the background. Overall, the experiment showed that white noise does have benefits in relation to learning: it slightly improved the participants' learning abilities and their recognition memory.

## Mathematical definitions

### White noise vector

A random vector (that is, a partially indeterminate process that produces vectors of real numbers) is said to be a white noise vector or white random vector if its components each have a probability distribution with zero mean and finite variance, and are statistically independent: that is, their joint probability distribution must be the product of the distributions of the individual components. 

A necessary (but, in general, not sufficient) condition for statistical independence of two variables is that they be statistically uncorrelated; that is, their covariance is zero. Therefore, the covariance matrix R of the components of a white noise vector w with n elements must be an n by n diagonal matrix, where each diagonal element Rᵢᵢ is the variance of component wᵢ; and the correlation matrix must be the n by n identity matrix.
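These matrix properties can be verified empirically. The sketch below (standard-library Python; names are illustrative) draws many realizations of a 3-component Gaussian white noise vector and computes a sample covariance matrix, which should be close to σ² times the identity:

```python
import random

def sample_covariance(draws):
    """Sample covariance matrix (components assumed zero-mean) of a list of vectors."""
    m = len(draws)
    d = len(draws[0])
    return [[sum(v[i] * v[j] for v in draws) / m for j in range(d)]
            for i in range(d)]

rng = random.Random(2)
draws = [[rng.gauss(0.0, 1.0) for _ in range(3)] for _ in range(20_000)]
R = sample_covariance(draws)
# R is close to the 3x3 identity: diagonal entries near sigma^2 = 1,
# off-diagonal entries near 0.
```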

In particular, if in addition to being independent every variable in w also has a normal distribution with zero mean and the same variance $\sigma ^{2}$ , w is said to be a Gaussian white noise vector. In that case, the joint distribution of w is a multivariate normal distribution; the independence between the variables then implies that the distribution has spherical symmetry in n-dimensional space. Therefore, any orthogonal transformation of the vector will result in a Gaussian white random vector. In particular, under most types of discrete Fourier transform, such as FFT and Hartley, the transform W of w will be a Gaussian white noise vector, too; that is, the n Fourier coefficients of w will be independent Gaussian variables with zero mean and the same variance $\sigma ^{2}$ .

The power spectrum P of a random vector w can be defined as the expected value of the squared modulus of each coefficient of its Fourier transform W, that is, Pᵢ = E(|Wᵢ|²). Under that definition, a Gaussian white noise vector will have a perfectly flat power spectrum, with Pᵢ = σ² for all i.
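The flatness of the power spectrum can be checked directly with a naive discrete Fourier transform. A sketch (standard-library Python only; the O(n²) DFT is deliberately simple, not efficient) averages |Wᵢ|² over many white noise realizations:

```python
import cmath
import random

def dft(x):
    """Unnormalized discrete Fourier transform (O(n^2); fine for small n)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * i * k / n) for k in range(n))
            for i in range(n)]

def avg_power_spectrum(n=64, trials=200, seed=3):
    """Average |W_i|^2 / n over many white noise realizations.

    For unit-variance white noise the expected value is sigma^2 = 1 in every bin.
    """
    rng = random.Random(seed)
    p = [0.0] * n
    for _ in range(trials):
        W = dft([rng.gauss(0.0, 1.0) for _ in range(n)])
        for i in range(n):
            p[i] += abs(W[i]) ** 2 / n
    return [v / trials for v in p]

p = avg_power_spectrum()
# every bin of p is close to 1: the spectrum is flat.
```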

If w is a white random vector, but not a Gaussian one, its Fourier coefficients Wᵢ will not be completely independent of each other; although for large n and common probability distributions the dependencies are very subtle, and their pairwise correlations can be assumed to be zero.

Often the weaker condition "statistically uncorrelated" is used in the definition of white noise, instead of "statistically independent". However, some of the commonly expected properties of white noise (such as a flat power spectrum) may not hold for this weaker version. Under this assumption, the stricter version can be referred to explicitly as an independent white noise vector. Other authors use strongly white and weakly white instead.

An example of a random vector that is "Gaussian white noise" in the weak but not in the strong sense is x=[x₁,x₂] where x₁ is a normal random variable with zero mean, and x₂ is equal to +x₁ or to −x₁, with equal probability. These two variables are uncorrelated and individually normally distributed, but they are not jointly normally distributed and are not independent. If x is rotated by 45 degrees, its two components will still be uncorrelated, but their distribution will no longer be normal.
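This example is easy to reproduce numerically. A sketch (standard-library Python; the sample size is arbitrary) draws many such pairs and confirms that the sample correlation is near zero even though the two components are obviously dependent:

```python
import random

rng = random.Random(4)
pairs = []
for _ in range(20_000):
    x1 = rng.gauss(0.0, 1.0)
    x2 = x1 if rng.random() < 0.5 else -x1   # x2 = +x1 or -x1, with equal probability
    pairs.append((x1, x2))

m = len(pairs)
corr = sum(a * b for a, b in pairs) / m
# corr is near 0 (the pair is uncorrelated), yet |x2| == |x1| always,
# so x1 and x2 are clearly not independent.
```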

In some situations one may relax the definition by allowing each component of a white random vector w to have non-zero expected value $\mu$ . In image processing especially, where samples are typically restricted to positive values, one often takes $\mu$ to be one half of the maximum sample value. In that case, the Fourier coefficient W₀ corresponding to the zero-frequency component (essentially, the average of the w_i) will also have a non-zero expected value $\mu {\sqrt {n}}$ ; and the power spectrum P will be flat only over the non-zero frequencies.

### Discrete-time white noise

A discrete-time stochastic process $W[n]$ is a generalization of random vectors with a finite number of components to infinitely many components. A discrete-time stochastic process $W[n]$ is called white noise if its mean does not depend on the time $n$ and is equal to zero, i.e. $\operatorname {E} [W[n]]=0$ and if the autocorrelation function $R_{W}[n]=\operatorname {E} [W[k+n]W[k]]$ only depends on $n$ but not on $k$ and has a nonzero value only for $n=0$ , i.e. $R_{W}[n]=\sigma ^{2}\delta [n]$ .

### Continuous-time white noise

In order to define the notion of "white noise" in the theory of continuous-time signals, one must replace the concept of a "random vector" by a continuous-time random signal; that is, a random process that generates a function $w$ of a real-valued parameter $t$ .

Such a process is said to be white noise in the strongest sense if the value $w(t)$ for any time $t$ is a random variable that is statistically independent of its entire history before $t$ . A weaker definition requires independence only between the values $w(t_{1})$ and $w(t_{2})$ at every pair of distinct times $t_{1}$ and $t_{2}$ . An even weaker definition requires only that such pairs $w(t_{1})$ and $w(t_{2})$ be uncorrelated.  As in the discrete case, some authors adopt the weaker definition for "white noise", and use the qualifier independent to refer to either of the stronger definitions. Others use weakly white and strongly white to distinguish between them.

However, a precise definition of these concepts is not trivial, because some quantities that are finite sums in the finite discrete case must be replaced by integrals that may not converge. Indeed, the set of all possible instances of a signal $w$ is no longer a finite-dimensional space $\mathbb {R} ^{n}$ , but an infinite-dimensional function space. Moreover, by any definition a white noise signal $w$ would have to be essentially discontinuous at every point; therefore even the simplest operations on $w$ , like integration over a finite interval, require advanced mathematical machinery.

Some authors require each value $w(t)$ to be a real-valued random variable with expectation $\mu$ and some finite variance $\sigma ^{2}$ . Then the covariance $\mathrm {E} (w(t_{1})\cdot w(t_{2}))$ between the values at two times $t_{1}$ and $t_{2}$ is well-defined: it is zero if the times are distinct, and $\sigma ^{2}$ if they are equal. However, by this definition, the integral

$W_{[a,a+r]}=\int _{a}^{a+r}w(t)\,dt$ over any interval with positive width $r$ would be simply the width times the expectation: $r\mu$ . This property would render the concept inadequate as a model of physical "white noise" signals.

Therefore, most authors define the signal $w$ indirectly by specifying non-zero values for the integrals of $w(t)$ and $|w(t)|^{2}$ over any interval $[a,a+r]$, as a function of its width $r$. In this approach, however, the value of $w(t)$ at an isolated time cannot be defined as a real-valued random variable[citation needed]. Also the covariance $\mathrm {E} (w(t_{1})\cdot w(t_{2}))$ becomes infinite when $t_{1}=t_{2}$; and the autocorrelation function $\mathrm {R} (t_{1},t_{2})$ must be defined as $N\delta (t_{1}-t_{2})$, where $N$ is some real constant and $\delta$ is the Dirac delta function. In this approach, one usually specifies that the integral $W_{I}$ of $w(t)$ over an interval $I=[a,b]$ is a real random variable with normal distribution, zero mean, and variance $(b-a)\sigma ^{2}$; and also that the covariance $\mathrm {E} (W_{I}\cdot W_{J})$ of the integrals $W_{I}$, $W_{J}$ is $r\sigma ^{2}$, where $r$ is the width of the intersection $I\cap J$ of the two intervals $I,J$. This model is called a Gaussian white noise signal (or process).
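The interval-based definition can be approximated on a grid: replace the continuous signal by independent Gaussian increments with variance σ²·dt per slot, so that the "integral" over an interval is just the sum of its slots. A sketch (standard-library Python; the grid size and interval endpoints are illustrative) checks that the variance of an integral matches the interval width and that the covariance of two overlapping integrals matches the width of their intersection:

```python
import random

def white_increments(rng, steps, dt, sigma=1.0):
    """Discrete stand-in for white noise: integral over each dt-slot ~ N(0, sigma^2 * dt)."""
    return [rng.gauss(0.0, sigma * dt ** 0.5) for _ in range(steps)]

rng = random.Random(5)
trials = 5000
var_I = 0.0
cov_IJ = 0.0
for _ in range(trials):
    inc = white_increments(rng, 100, 0.01)   # covers [0, 1] in slots of width 0.01
    W_I = sum(inc[:60])                      # integral over I = [0.0, 0.6]
    W_J = sum(inc[40:])                      # integral over J = [0.4, 1.0]
    var_I += W_I * W_I
    cov_IJ += W_I * W_J
var_I /= trials    # expect |I| * sigma^2 = 0.6
cov_IJ /= trials   # expect |I ∩ J| * sigma^2 = 0.2
```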

## Mathematical applications

### Time series analysis and regression

In statistics and econometrics one often assumes that an observed series of data values is the sum of a series of values generated by a deterministic linear process, depending on certain independent (explanatory) variables, and on a series of random noise values. Then regression analysis is used to infer the parameters of the model process from the observed data, e.g. by ordinary least squares, and to test the null hypothesis that each of the parameters is zero against the alternative hypothesis that it is non-zero. Hypothesis testing typically assumes that the noise values are mutually uncorrelated with zero mean and have the same Gaussian probability distribution; in other words, that the noise is white. If there is non-zero correlation between the noise values underlying different observations then the estimated model parameters are still unbiased, but estimates of their uncertainties (such as confidence intervals) will be biased (not accurate on average). This is also true if the noise is heteroskedastic, that is, if it has different variances for different data points.
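A minimal illustration of this setup (standard-library Python; the slope, intercept, and noise level are illustrative): generate data from a linear process plus white Gaussian noise, then recover the parameters by ordinary least squares.

```python
import random

rng = random.Random(6)
n = 2000
x = [i / n for i in range(n)]
# Observed data: deterministic linear part (true slope 2, intercept 0)
# plus white Gaussian noise with sigma = 0.1.
y = [2.0 * xi + rng.gauss(0.0, 0.1) for xi in x]

# Ordinary least squares estimates of slope and intercept
mx = sum(x) / n
my = sum(y) / n
beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        / sum((xi - mx) ** 2 for xi in x))
alpha = my - beta * mx
# beta is close to 2.0 and alpha is close to 0.0.
```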

Alternatively, in the subset of regression analysis known as time series analysis there are often no explanatory variables other than the past values of the variable being modeled (the dependent variable). In this case the noise process is often modeled as a moving average process, in which the current value of the dependent variable depends on current and past values of a sequential white noise process.
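A first-order moving average process, MA(1), is the simplest case: each output mixes the current and previous white noise samples, which colors the spectrum and introduces correlation at lag 1 only. A sketch (standard-library Python; θ = 0.5 is illustrative):

```python
import random

def ma1(n, theta=0.5, seed=7):
    """MA(1) process: v[k] = w[k] + theta * w[k-1], driven by Gaussian white noise w."""
    rng = random.Random(seed)
    w = [rng.gauss(0.0, 1.0) for _ in range(n + 1)]
    return [w[k + 1] + theta * w[k] for k in range(n)]

def acorr(x, lag):
    """Biased sample autocorrelation at the given lag."""
    m = len(x)
    return sum(x[k + lag] * x[k] for k in range(m - lag)) / m

v = ma1(50_000)
# theory: R[0] = 1 + theta^2 = 1.25, R[1] = theta = 0.5, R[lag] = 0 for lag >= 2
```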

### Random vector transformations

In particular, by a suitable linear transformation (a coloring transformation), a white random vector can be used to produce a "non-white" random vector (that is, a list of random variables) whose elements have a prescribed covariance matrix. Conversely, a random vector with known covariance matrix can be transformed into a white random vector by a suitable whitening transformation.

These two transformations are crucial in applications such as channel estimation and channel equalization in communications and audio, and they are also used in data compression.
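A standard way to build a coloring transformation is the Cholesky factor of the target covariance matrix. A minimal sketch (standard-library Python; the 2×2 covariance values are illustrative) colors pairs of white Gaussian samples and checks the resulting sample covariance:

```python
import random

# Target covariance matrix for the colored vector (values are illustrative)
C = [[2.0, 0.8],
     [0.8, 1.0]]

# Cholesky factor L of C, computed by hand for the 2x2 case, so that L @ L^T = C
l11 = C[0][0] ** 0.5
l21 = C[1][0] / l11
l22 = (C[1][1] - l21 ** 2) ** 0.5

rng = random.Random(8)

def colored_sample():
    """Coloring transform: map a white pair (w1, w2) to a vector with covariance C."""
    w1, w2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    return (l11 * w1, l21 * w1 + l22 * w2)

draws = [colored_sample() for _ in range(30_000)]
m = len(draws)
c00 = sum(a * a for a, _ in draws) / m   # expect C[0][0] = 2.0
c01 = sum(a * b for a, b in draws) / m   # expect C[0][1] = 0.8
c11 = sum(b * b for _, b in draws) / m   # expect C[1][1] = 1.0
```

Running the map in reverse (solving the triangular system) is the corresponding whitening transformation.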

## Generation

White noise may be generated digitally with a digital signal processor, microprocessor, or microcontroller. Generating white noise typically entails feeding an appropriate stream of random numbers to a digital-to-analog converter. The quality of the white noise will depend on the quality of the algorithm used. 
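As a concrete sketch of the "random numbers to a converter" step (standard-library Python; the function name, amplitude, and sample count are illustrative), the following packs uniform white noise into little-endian 16-bit PCM bytes of the kind a DAC or a WAV file payload would consume:

```python
import random
import struct

def white_noise_pcm(n_samples, amplitude=0.3, seed=9):
    """Uniform white noise as little-endian 16-bit PCM bytes."""
    rng = random.Random(seed)
    frames = bytearray()
    for _ in range(n_samples):
        # Scale a uniform value in (-1, 1) to the signed 16-bit range.
        sample = int(amplitude * rng.uniform(-1.0, 1.0) * 32767)
        frames += struct.pack("<h", sample)
    return bytes(frames)

pcm = white_noise_pcm(1000)   # 1000 samples -> 2000 bytes
```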

## Related Research Articles

In probability theory, the central limit theorem (CLT) establishes that, in some situations, when independent random variables are added, their properly normalized sum tends toward a normal distribution even if the original variables themselves are not normally distributed. The theorem is a key concept in probability theory because it implies that probabilistic and statistical methods that work for normal distributions can be applicable to many problems involving other types of distributions. In probability theory and statistics, the multivariate normal distribution, multivariate Gaussian distribution, or joint normal distribution is a generalization of the one-dimensional (univariate) normal distribution to higher dimensions. One definition is that a random vector is said to be k-variate normally distributed if every linear combination of its k components has a univariate normal distribution. Its importance derives mainly from the multivariate central limit theorem. The multivariate normal distribution is often used to describe, at least approximately, any set of (possibly) correlated real-valued random variables each of which clusters around a mean value. Principal component analysis (PCA) is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. If there are observations with variables, then the number of distinct principal components is . This transformation is defined in such a way that the first principal component has the largest possible variance, and each succeeding component in turn has the highest variance possible under the constraint that it is orthogonal to the preceding components. The resulting vectors are an uncorrelated orthogonal basis set. PCA is sensitive to the relative scaling of the original variables.

In probability theory and statistics, covariance is a measure of the joint variability of two random variables. If the greater values of one variable mainly correspond with the greater values of the other variable, and the same holds for the lesser values,, the covariance is positive. In the opposite case, when the greater values of one variable mainly correspond to the lesser values of the other,, the covariance is negative. The sign of the covariance therefore shows the tendency in the linear relationship between the variables. The magnitude of the covariance is not easy to interpret because it is not normalized and hence depends on the magnitudes of the variables. The normalized version of the covariance, the correlation coefficient, however, shows by its magnitude the strength of the linear relation.

In probability theory and statistics, two real-valued random variables, , , are said to be uncorrelated if their covariance, , is zero. If two variables are uncorrelated, there is no linear relationship between them. In probability theory and statistics, a covariance matrix, also known as auto-covariance matrix, dispersion matrix, variance matrix, or variance–covariance matrix, is a matrix whose element in the i, j position is the covariance between the i-th and j-th elements of a random vector. A random vector is a random variable with multiple dimensions. Each element of the vector is a scalar random variable. Each element has either a finite number of observed empirical values or a finite or infinite number of potential values. The potential values are specified by a theoretical joint probability distribution.

Additive white Gaussian noise (AWGN) is a basic noise model used in Information theory to mimic the effect of many random processes that occur in nature. The modifiers denote specific characteristics: In probability theory and statistics, the Rayleigh distribution is a continuous probability distribution for nonnegative-valued random variables. It is essentially a chi distribution with two degrees of freedom.

In probability theory and statistics, a Gaussian process is a stochastic process, such that every finite collection of those random variables has a multivariate normal distribution, i.e. every finite linear combination of them is normally distributed. The distribution of a Gaussian process is the joint distribution of all those random variables, and as such, it is a distribution over functions with a continuous domain, e.g. time or space.

In mathematics, a moment is a specific quantitative measure of the shape of a function. It is used in both mechanics and statistics. If the function represents physical density, then the zeroth moment is the total mass, the first moment divided by the total mass is the center of mass, and the second moment is the rotational inertia. If the function is a probability distribution, then the zeroth moment is the total probability, the first moment is the mean, the second central moment is the variance, the third standardized moment is the skewness, and the fourth standardized moment is the kurtosis. The mathematical concept is closely related to the concept of moment in physics. In signal processing, independent component analysis (ICA) is a computational method for separating a multivariate signal into additive subcomponents. This is done by assuming that the subcomponents are non-Gaussian signals and that they are statistically independent from each other. ICA is a special case of blind source separation. A common example application is the "cocktail party problem" of listening in on one person's speech in a noisy room.

In the theory of stochastic processes, the Karhunen–Loève theorem, also known as the Kosambi–Karhunen–Loève theorem is a representation of a stochastic process as an infinite linear combination of orthogonal functions, analogous to a Fourier series representation of a function on a bounded interval. The transformation is also known as Hotelling transform and eigenvector transform, and is closely related to principal component analysis (PCA) technique widely used in image processing and in data analysis in many fields.

In probability theory, fractional Brownian motion (fBm), also called a fractal Brownian motion, is a generalization of Brownian motion. Unlike classical Brownian motion, the increments of fBm need not be independent. fBm is a continuous-time Gaussian process BH(t) on [0, T], which starts at zero, has expectation zero for all t in [0, T], and has the following covariance function:

In statistics and signal processing, a minimum mean square error (MMSE) estimator is an estimation method which minimizes the mean square error (MSE), which is a common measure of estimator quality, of the fitted values of a dependent variable. In the Bayesian setting, the term MMSE more specifically refers to estimation with quadratic loss function. In such case, the MMSE estimator is given by the posterior mean of the parameter to be estimated. Since the posterior mean is cumbersome to calculate, the form of the MMSE estimator is usually constrained to be within a certain class of functions. Linear MMSE estimators are a popular choice since they are easy to use, easy to calculate, and very versatile. It has given rise to many popular estimators such as the Wiener–Kolmogorov filter and Kalman filter.

A whitening transformation or sphering transformation is a linear transformation that transforms a vector of random variables with a known covariance matrix into a set of new variables whose covariance is the identity matrix, meaning that they are uncorrelated and each have variance 1. The transformation is called "whitening" because it changes the input vector into a white noise vector.

In statistical signal processing, the goal of spectral density estimation (SDE) is to estimate the spectral density of a random signal from a sequence of time samples of the signal. Intuitively speaking, the spectral density characterizes the frequency content of the signal. One purpose of estimating the spectral density is to detect any periodicities in the data, by observing peaks at the frequencies corresponding to these periodicities.

In probability theory and statistics, the specific name generalized chi-squared distribution arises in relation to one particular family of variants of the chi-squared distribution. There are several other such variants for which the same term is sometimes used, or which are clearly generalizations of the chi-squared distribution, and which are treated elsewhere: some are special cases of the family discussed here, for example the noncentral chi-squared distribution and the gamma distribution, while the generalized gamma distribution is outside this family. The type of generalization discussed here is important because it arises in the context of the distribution of statistical estimates in cases where the usual statistical theory does not hold. For example, if a predictive model is fitted by least squares but the model errors have either autocorrelation or heteroscedasticity, then a statistical analysis of alternative model structures can be undertaken by relating changes in the sum of squares to an asymptotically valid generalized chi-squared distribution. More specifically, the distribution can be defined in terms of a quadratic form derived from a multivariate normal distribution.
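A Monte Carlo sketch of the quadratic-form characterization (the matrices and seed are arbitrary example values): q = x^T A x with x ~ N(0, Sigma) follows a generalized chi-squared distribution, a weighted sum of independent chi-squared(1) variables whose weights are the eigenvalues of A Sigma, so E[q] = tr(A Sigma).

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])               # covariance of x (example values)
a = np.array([[2.0, 0.0],
              [0.0, 1.0]])                   # matrix of the quadratic form

x = rng.multivariate_normal([0.0, 0.0], sigma, size=200_000)
q = np.einsum('ij,jk,ik->i', x, a, x)        # q_i = x_i^T A x_i

weights = np.linalg.eigvals(a @ sigma).real  # chi-squared(1) mixture weights
mean_q = q.mean()                            # ≈ weights.sum() = tr(A Sigma) = 4.0
```

The eigenvalue weights also determine the higher moments and the characteristic function of q, which is how the distribution is evaluated in practice.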

In probability and statistics, an elliptical distribution is any member of a broad family of probability distributions that generalize the multivariate normal distribution. Intuitively, in the simplified two and three dimensional case, the joint distribution forms an ellipse and an ellipsoid, respectively, in iso-density plots.
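A sketch of the standard stochastic representation of an elliptical distribution (the location, shape matrix, and seed are arbitrary example values): x = mu + r * A u, with u uniform on the unit sphere and r an independent nonnegative radial variable. Taking r as a chi-distributed radius recovers the multivariate normal; other radial laws give other members of the family, all sharing the same elliptical iso-density contours.

```python
import numpy as np

rng = np.random.default_rng(5)
n, mu = 100_000, np.array([1.0, -1.0])
A = np.array([[1.0, 0.0],
              [0.5, 0.8]])                        # shape matrix (example values)

u = rng.standard_normal((n, 2))
u /= np.linalg.norm(u, axis=1, keepdims=True)     # uniform direction on the unit circle
r = np.sqrt(rng.chisquare(2, size=n))             # chi radial part -> Gaussian member
x = mu + r[:, None] * (u @ A.T)

# For this radial law E[r^2]/2 = 1, so the sample covariance ≈ A A^T
sample_cov = np.cov(x, rowvar=False)
```

Replacing r with, say, a heavy-tailed radius would give a multivariate t-like member of the same elliptical family with the same contours but different tail behavior.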

In probability theory and statistics, complex random variables are a generalization of real-valued random variables to complex numbers, i.e. the possible values a complex random variable may take are complex numbers. Complex random variables can always be considered as pairs of real random variables: their real and imaginary parts. Therefore, the distribution of one complex random variable may be interpreted as the joint distribution of two real random variables.
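A minimal sketch of the pair-of-real-variables view (the variance and seed are arbitrary example values): a circularly symmetric complex Gaussian with total variance sigma2 splits that variance equally between independent real and imaginary parts, and its pseudo-variance E[z^2] is (approximately) zero.

```python
import numpy as np

rng = np.random.default_rng(6)
sigma2, n = 2.0, 100_000
re = rng.normal(0.0, np.sqrt(sigma2 / 2), n)   # real part
im = rng.normal(0.0, np.sqrt(sigma2 / 2), n)   # imaginary part
z = re + 1j * im                               # complex random samples

var_z = np.mean(np.abs(z) ** 2)                # E|z|^2 = Var(re) + Var(im) ≈ sigma2
pseudo = np.mean(z ** 2)                       # pseudo-variance ≈ 0 under circular symmetry
```

Non-circular complex variables have a nonzero pseudo-variance, which is why a complex distribution is generally characterized by both E|z|^2 and E[z^2], i.e. by the full joint distribution of (re, im).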
