Anscombe transform

[Figure: Standard deviation of the Anscombe-transformed Poisson random variable as a function of the mean $m$.]

In statistics, the Anscombe transform, named after Francis Anscombe, is a variance-stabilizing transformation that transforms a random variable with a Poisson distribution into one with an approximately standard Gaussian distribution. It is widely used in photon-limited imaging (astronomy, X-ray), where images naturally follow the Poisson law. There, the transform is typically used to pre-process the data so that the standard deviation becomes approximately constant; denoising algorithms designed for additive white Gaussian noise are then applied, and the final estimate is obtained by applying an inverse Anscombe transformation to the denoised data.
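As a sketch of this pipeline (assuming NumPy and SciPy are available; the Gaussian filter below is only a stand-in for a real Gaussian-noise denoiser, and all names are illustrative):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def anscombe(x):
        # Forward transform: Poisson data -> approximately unit-variance Gaussian.
        return 2.0 * np.sqrt(x + 3.0 / 8.0)

    def inverse_anscombe(y):
        # Asymptotically unbiased inverse; see the Inversion section below.
        return (y / 2.0) ** 2 - 1.0 / 8.0

    # Simulated photon-limited image: Poisson counts around a smooth signal.
    rng = np.random.default_rng(0)
    clean = 20.0 * (1.0 + np.sin(np.linspace(0.0, 3.0 * np.pi, 256)) ** 2)
    noisy = rng.poisson(np.broadcast_to(clean, (256, 256))).astype(float)

    # Stabilize, denoise as if the noise were additive white Gaussian, invert.
    denoised = inverse_anscombe(gaussian_filter(anscombe(noisy), sigma=2.0))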


[Figure: Anscombe transform, animated. Here $\mu$ is the mean of the Anscombe-transformed Poisson distribution, normalized by subtracting $2\sqrt{m + \tfrac{3}{8}} - \tfrac{1}{4\,m^{1/2}}$, and $\sigma$ is its standard deviation (estimated empirically). Note that $m^{3/2}\mu$ and $m^{2}(\sigma - 1)$ remain roughly in the range $[0, 10]$ throughout the animation, giving empirical support for $\mu = O(m^{-3/2})$ and $\sigma = 1 + O(m^{-2})$.]

Definition

For the Poisson distribution the mean $m$ and variance are not independent: $\operatorname{E}[x] = \operatorname{Var}(x) = m$. The Anscombe transform [1]

$$A : x \mapsto 2\sqrt{x + \tfrac{3}{8}}$$

aims at transforming the data so that the variance is set approximately to 1 for large enough mean; for mean zero, the variance is still zero.

It transforms Poissonian data $x$ (with mean $m$) to approximately Gaussian data of mean $2\sqrt{m + \tfrac{3}{8}} - \tfrac{1}{4\,m^{1/2}} + O\!\left(m^{-3/2}\right)$ and standard deviation $1 + O\!\left(m^{-2}\right)$. This approximation gets more accurate for larger $m$, [2] as can also be seen in the figure.

For a transformed variable of the form $2\sqrt{x + c}$, the expression for the variance has an additional term $\tfrac{3 - 8c}{8m}$; it is reduced to zero at $c = \tfrac{3}{8}$, which is exactly the reason why this value was picked.
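This stabilization is easy to verify empirically; a minimal sketch with NumPy (sample sizes arbitrary):

    import numpy as np

    rng = np.random.default_rng(1)
    for m in [2.0, 5.0, 20.0, 100.0]:
        x = rng.poisson(m, size=1_000_000)
        y = 2.0 * np.sqrt(x + 3.0 / 8.0)  # Anscombe transform
        # Raw Poisson std grows like sqrt(m); the transformed std stays near 1.
        print(f"m={m:6.1f}  raw std={x.std():7.3f}  transformed std={y.std():6.4f}")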

Inversion

When the Anscombe transform is used in denoising (i.e. when the goal is to obtain from $A(x)$ an estimate of $m$), its inverse transform is also needed in order to return the variance-stabilized and denoised data to the original range. Applying the algebraic inverse

$$A^{-1} : y \mapsto \left(\frac{y}{2}\right)^{2} - \frac{3}{8}$$

usually introduces undesired bias to the estimate of the mean $m$, because the forward square-root transform is not linear. Sometimes using the asymptotically unbiased inverse [1]

$$y \mapsto \left(\frac{y}{2}\right)^{2} - \frac{1}{8}$$

mitigates the issue of bias, but this is not the case in photon-limited imaging, for which the exact unbiased inverse given by the implicit mapping [3]

$$\operatorname{E}\!\left[2\sqrt{x + \tfrac{3}{8}} \,\middle|\, m\right] = 2\sum_{x=0}^{+\infty} \left( \sqrt{x + \tfrac{3}{8}} \cdot \frac{m^{x} e^{-m}}{x!} \right) \;\mapsto\; m$$

should be used. A closed-form approximation of this exact unbiased inverse is [4]

$$\hat{m} \approx \frac{1}{4} y^{2} + \frac{1}{4}\sqrt{\frac{3}{2}}\, y^{-1} - \frac{11}{8}\, y^{-2} + \frac{5}{8}\sqrt{\frac{3}{2}}\, y^{-3} - \frac{1}{8}.$$
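The behavior of these inverses at low counts can be illustrated numerically. In the sketch below (names illustrative), each inverse is applied to a Monte Carlo estimate of $\operatorname{E}[2\sqrt{x + 3/8} \mid m]$, which is what an ideal denoiser would output:

    import numpy as np

    def inv_algebraic(y):
        return (y / 2.0) ** 2 - 3.0 / 8.0

    def inv_asymptotic(y):
        return (y / 2.0) ** 2 - 1.0 / 8.0

    def inv_closed_form(y):
        # Closed-form approximation of the exact unbiased inverse [4].
        return (0.25 * y**2 + 0.25 * np.sqrt(1.5) / y
                - 11.0 / 8.0 / y**2 + 5.0 / 8.0 * np.sqrt(1.5) / y**3 - 1.0 / 8.0)

    rng = np.random.default_rng(2)
    m = 2.0  # low-count regime, where the bias of the simpler inverses shows
    Ey = (2.0 * np.sqrt(rng.poisson(m, size=1_000_000) + 3.0 / 8.0)).mean()
    for name, inv in [("algebraic", inv_algebraic),
                      ("asymptotic", inv_asymptotic),
                      ("closed-form", inv_closed_form)]:
        print(f"{name:11s}  recovered m = {inv(Ey):.3f}  (true m = {m})")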

Alternatives

There are many other possible variance-stabilizing transformations for the Poisson distribution. Bar-Lev and Enis report [2] a family of such transformations which includes the Anscombe transform. Another member of the family is the Freeman-Tukey transformation [5]

$$A : x \mapsto \sqrt{x + 1} + \sqrt{x}.$$

A simplified transformation, obtained as the primitive of the reciprocal of the standard deviation of the data, is

$$A : x \mapsto 2\sqrt{x},$$

which, while not quite as good at stabilizing the variance, has the advantage of being more easily understood. Indeed, from the delta method,

$$\operatorname{Var}\!\left[2\sqrt{x}\right] \approx \left(\frac{\mathrm{d}\,(2\sqrt{m})}{\mathrm{d}m}\right)^{2} \operatorname{Var}[x] = \left(\frac{1}{\sqrt{m}}\right)^{2} m = 1.$$
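A sketch comparing how well the three transforms stabilize the variance (NumPy; sample sizes arbitrary):

    import numpy as np

    rng = np.random.default_rng(3)
    transforms = {
        "Anscombe":      lambda x: 2.0 * np.sqrt(x + 3.0 / 8.0),
        "Freeman-Tukey": lambda x: np.sqrt(x + 1.0) + np.sqrt(x),
        "2 sqrt(x)":     lambda x: 2.0 * np.sqrt(x),
    }
    for m in [2.0, 10.0, 50.0]:
        x = rng.poisson(m, size=1_000_000)
        stds = "  ".join(f"{name}: {f(x).std():.4f}" for name, f in transforms.items())
        print(f"m={m:5.1f}  {stds}")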

Generalization

While the Anscombe transform is appropriate for pure Poisson data, in many applications the data also contain an additive Gaussian component. These cases are treated by a generalized Anscombe transform [6] and its asymptotically unbiased or exact unbiased inverses. [7]
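As a sketch of that setting: a commonly used form of the generalized Anscombe transform for data $x = \alpha p + n$, where $p$ is Poisson, $\alpha$ is a gain, and $n$ is Gaussian with mean $g$ and standard deviation $\sigma$, is $\frac{2}{\alpha}\sqrt{\alpha x + \tfrac{3}{8}\alpha^{2} + \sigma^{2} - \alpha g}$, set to zero where the argument under the root is negative; the exact parametrization used below is an assumption that should be checked against [6] and [7]:

    import numpy as np

    def generalized_anscombe(x, gain=1.0, sigma=0.0, g=0.0):
        # One common form of the generalized Anscombe transform for
        # Poisson-Gaussian data x = gain * p + n with n ~ N(g, sigma^2);
        # with gain=1 and sigma=g=0 it reduces to the plain Anscombe transform.
        arg = gain * x + 0.375 * gain**2 + sigma**2 - gain * g
        return (2.0 / gain) * np.sqrt(np.maximum(arg, 0.0))

    rng = np.random.default_rng(4)
    m, sigma = 20.0, 3.0
    x = rng.poisson(m, size=1_000_000) + rng.normal(0.0, sigma, size=1_000_000)
    print(f"stabilized std = {generalized_anscombe(x, sigma=sigma).std():.4f}")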


References

  1. Anscombe, F. J. (1948), "The transformation of Poisson, binomial and negative-binomial data", Biometrika, vol. 35, no. 3–4, pp. 246–254, doi:10.1093/biomet/35.3-4.246, JSTOR 2332343
  2. Bar-Lev, S. K.; Enis, P. (1988), "On the classical choice of variance stabilizing transformations and an application for a Poisson variate", Biometrika, vol. 75, no. 4, pp. 803–804, doi:10.1093/biomet/75.4.803
  3. Mäkitalo, M.; Foi, A. (2011), "Optimal inversion of the Anscombe transformation in low-count Poisson image denoising", IEEE Transactions on Image Processing, vol. 20, no. 1, pp. 99–109, doi:10.1109/TIP.2010.2056693, PMID 20615809
  4. Mäkitalo, M.; Foi, A. (2011), "A closed-form approximation of the exact unbiased inverse of the Anscombe variance-stabilizing transformation", IEEE Transactions on Image Processing, vol. 20, no. 9, pp. 2697–2698, doi:10.1109/TIP.2011.2121085, PMID 21356615
  5. Freeman, M. F.; Tukey, J. W. (1950), "Transformations related to the angular and the square root", The Annals of Mathematical Statistics, vol. 21, no. 4, pp. 607–611, doi:10.1214/aoms/1177729756, JSTOR 2236611
  6. Starck, J.-L.; Murtagh, F.; Bijaoui, A. (1998), Image Processing and Data Analysis, Cambridge University Press, ISBN 9780521599146
  7. Mäkitalo, M.; Foi, A. (2013), "Optimal inversion of the generalized Anscombe transformation for Poisson-Gaussian noise", IEEE Transactions on Image Processing, vol. 22, no. 1, pp. 91–103, doi:10.1109/TIP.2012.2202675, PMID 22692910
