Fano factor

In statistics, the Fano factor,[1] like the coefficient of variation, is a measure of the dispersion of a counting process. It was originally used to measure the Fano noise in ion detectors. It is named after Ugo Fano, an Italian-American physicist.

The Fano factor after a time $t$ is defined as

$$F(t) = \frac{\sigma_{N(t)}^2}{\mu_{N(t)}},$$

where $\sigma_{N(t)}$ is the standard deviation and $\mu_{N(t)}$ is the mean number of events of a counting process $N(t)$ after some time $t$. The Fano factor can be viewed as a kind of noise-to-signal ratio; it is a measure of the reliability with which the waiting time random variable can be estimated after several random events.

For a Poisson counting process, the variance in the count equals the mean count, so $F = 1$.

Definition

For a counting process $N(t)$, the Fano factor after a time $t$ is defined as

$$F(t) = \frac{\operatorname{Var}(N(t))}{\operatorname{E}(N(t))}.$$

Sometimes, the long-term limit is also termed the Fano factor,

$$F = \lim_{t \to \infty} F(t).$$

For a renewal process with holding times distributed like a random variable $W$, we have that[2]

$$\lim_{t \to \infty} F(t) = \frac{\operatorname{Var}(W)}{\operatorname{E}(W)^2}.$$

Since the right-hand side is equal to the square of the coefficient of variation, $c_v^2$, it is sometimes referred to as the Fano factor as well.[3]
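As a quick illustration of these definitions, the following sketch (not from the source; all parameter choices are illustrative) estimates $F(t)$ by Monte Carlo for a renewal process with gamma-distributed holding times and compares it with the predicted limit $\operatorname{Var}(W)/\operatorname{E}(W)^2$:

```python
# Minimal sketch: empirical Fano factor of a gamma renewal process vs. the
# renewal-theory limit Var(W)/E[W]^2. Shape, scale, horizon are illustrative.
import numpy as np

rng = np.random.default_rng(0)
shape, scale = 4.0, 0.25          # holding times W ~ Gamma; E[W] = 1, Var(W) = 0.25
t, n_paths = 200.0, 5000          # observation window and number of sample paths

counts = np.empty(n_paths)
for i in range(n_paths):
    # Draw comfortably more holding times than needed to cover [0, t];
    # N(t) is the number of arrival times that fall before t.
    arrivals = np.cumsum(rng.gamma(shape, scale, size=600))
    counts[i] = np.searchsorted(arrivals, t)

fano_hat = counts.var() / counts.mean()          # empirical F(t)
cv2 = (shape * scale**2) / (shape * scale)**2    # Var(W)/E[W]^2 = 1/shape
print(f"empirical F(t) ~ {fano_hat:.3f}, predicted CV^2 = {cv2:.3f}")
```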

Interpretation

When considered as the dispersion of the number of events, the Fano factor roughly corresponds to the width of the peak of the distribution of $N(t)$. As such, the Fano factor is often interpreted as a measure of the unpredictability of the underlying process.

Example: Constant Random Variable

When the holding times are constant (so that $\operatorname{Var}(W) = 0$), then $F = 0$. As such, if $F \approx 0$, then we interpret the renewal process as being very predictable.

Example: Poisson Counting Process

When the likelihood of an event occurring in any time interval is equal for all time, the holding times must be exponentially distributed, giving a Poisson counting process, for which $F(t) = 1$ for all $t$.
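A minimal numerical check of this fact (a sketch, not from the source): the count of a homogeneous Poisson process is Poisson-distributed, so its sample variance-to-mean ratio should be close to 1.

```python
# For a Poisson counting process, Var N(t) = E[N(t)], hence F(t) = 1 exactly.
import numpy as np

rng = np.random.default_rng(1)
counts = rng.poisson(lam=50.0, size=100_000)   # samples of N(t) with rate*t = 50
print(counts.var() / counts.mean())            # ~ 1.0
```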

Use in particle detection

In particle detectors, the Fano factor results from the energy loss in a collision not being purely statistical. The processes giving rise to each individual charge carrier are not independent, since the number of ways an atom may be ionized is limited by the discrete electron shells. The net result is a better energy resolution than predicted by purely statistical considerations. For example, if $w$ is the average energy for a particle to produce a charge carrier in a detector, then the relative FWHM resolution for measuring the particle energy $E$ is[4]

$$R = \frac{\text{FWHM}}{E} = 2.35\sqrt{\frac{Fw}{E}},$$

where the factor of 2.35 relates the standard deviation to the FWHM.
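As a worked example (a sketch, not from the source): taking the theoretical silicon value $F = 0.115$ quoted below, and assuming $w \approx 3.6$ eV per electron-hole pair for silicon (a commonly quoted figure that is an assumption here, not given in this article), the Fano-limited resolution at 5.9 keV works out to roughly 2% FWHM:

```python
# Worked example of R = 2.35 * sqrt(F * w / E) for a silicon detector.
# F = 0.115 is the theoretical value quoted in this article; w = 3.6 eV per
# electron-hole pair is an assumed, commonly cited figure for Si.
import math

F, w, E = 0.115, 3.6, 5900.0          # Fano factor, eV per pair, photon energy (eV)
R = 2.35 * math.sqrt(F * w / E)       # relative FWHM resolution
print(f"R = {R:.4f}  ->  FWHM = {R * E:.0f} eV at {E / 1000:.1f} keV")
```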

The Fano factor is material-specific. Some theoretical values are:[5]

Si: 0.115 (note the discrepancy with the experimental value below)
Ge: 0.13
GaAs: 0.12 [6]
Diamond: 0.08

Measuring the Fano factor is difficult because many factors contribute to the measured resolution, but some experimental values are:

Si: 0.128 ± 0.001 (at 5.9 keV) / 0.159 ± 0.002 (at 122 keV) [7]
Ar (gas): 0.20 ± 0.01/0.02 [8]
Xe (gas): 0.13 to 0.29 [9]
CZT: 0.089 ± 0.005 [10]

Use in neuroscience

The Fano factor is used in neuroscience to describe the variability of neural spiking.[11] In this context, the events are the neural spikes and the holding times are the inter-spike intervals (ISI). Often, the limit definition of the Fano factor is used, for which

$$F = \lim_{t \to \infty} \frac{\operatorname{Var}(N(t))}{\operatorname{E}(N(t))} = c_v^2,$$

where $c_v$ is the coefficient of variation of the ISI.
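As a sketch (not from the source), this quantity can be estimated directly from a recorded spike train's ISIs; here gamma-distributed ISIs stand in for real data and all parameters are illustrative:

```python
# Estimating the large-window Fano factor of a spike train from its
# inter-spike intervals via F = CV_ISI^2. Simulated ISIs only.
import numpy as np

rng = np.random.default_rng(2)
isi = rng.gamma(shape=2.0, scale=0.05, size=10_000)  # simulated ISIs (seconds)
cv = isi.std() / isi.mean()
print(f"CV of ISI = {cv:.3f}, implied Fano factor = {cv**2:.3f}")  # ~ 0.5
```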

Some neurons are found to have varying ISI distributions, meaning that the counting process is no longer a renewal process; in such cases a Markov renewal process is used instead. In the case of only two Markov states with equal transition probabilities $p$, the limit above again converges to a closed-form expression[12] in which $\mu_i$ represents the mean ISI in the corresponding state.
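The following Monte Carlo sketch (not from the source; all parameters are illustrative) estimates the limiting Fano factor for such a two-state process, using exponential ISIs whose mean depends on the current state and a state flip with probability $p$ after each event:

```python
# Monte Carlo estimate of the limiting Fano factor for a two-state Markov
# renewal process. State-dependent mean ISIs and p are illustrative choices.
import numpy as np

rng = np.random.default_rng(3)
mu = (0.02, 0.10)              # mean ISI in each state (seconds)
p, t, n_paths = 0.1, 20.0, 1000

counts = np.empty(n_paths)
for i in range(n_paths):
    time, n, state = 0.0, 0, 0
    while time < t:
        time += rng.exponential(mu[state])  # draw the next ISI
        if time < t:
            n += 1                          # count spikes inside [0, t]
        if rng.random() < p:                # flip state with probability p
            state = 1 - state
    counts[i] = n

print("empirical limiting Fano factor ~", counts.var() / counts.mean())
```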

While most work assumes a constant Fano factor, recent work has considered neurons with non-constant Fano factors.[13] In this case, it is found that non-constant Fano factors can be achieved by introducing both noise and non-linearity into the rate of the underlying Poisson process.


References

  1. Fano, U. (1947). "Ionization Yield of Radiations. II. The Fluctuations of the Number of Ions". Physical Review. 72 (1): 26–29. Bibcode:1947PhRv...72...26F. doi:10.1103/PhysRev.72.26.
  2. Cox, D.R. (1962). Renewal Theory.
  3. Shuai, J. W.; Zeng, S.; Jung, P. (2002). "Coherence resonance: on the use and abuse of the Fano factor". Fluct. Noise Lett. 2 (3): L139–L146. doi:10.1142/S0219477502000749.
  4. Leo, W.R. (1987). Techniques for Nuclear and Particle Physics Experiments: A How-to Approach. Springer-Verlag. pp. 109–125. ISBN 978-3-540-17386-1.
  5. Alig, R.; Bloom, S.; Struck, C. (1980). "Scattering by ionization and phonon emission in semiconductors". Physical Review B. 22 (12): 5565. Bibcode:1980PhRvB..22.5565A. doi:10.1103/PhysRevB.22.5565.
  6. Bertuccio, G.; Maiocchi, D. (2002). J. Appl. Phys. 92: 1248.
  7. Kotov, I. V.; Neal, H.; O’Connor, P. (2018-09-01). "Pair creation energy and Fano factor of silicon measured at 185 K using 55Fe X-rays". Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment. 901: 126–132. doi:10.1016/j.nima.2018.06.022. ISSN 0168-9002.
  8. Kase, M.; Akioka, T.; Mamyoda, H.; Kikuchi, J.; Doke, T. (1984). "Fano factor in pure argon". Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment. 227 (2): 311. Bibcode:1984NIMPA.227..311K. doi:10.1016/0168-9002(84)90139-6.
  9. Do Carmo, S. J. C.; Borges, F. I. G. M.; Vinagre, F. L. R.; Conde, C. A. N. (2008). "Experimental Study of the w-Values and Fano Factors of Gaseous Xenon and Ar-Xe Mixtures for X-Rays". IEEE Transactions on Nuclear Science. 55 (5): 2637. Bibcode:2008ITNS...55.2637D. doi:10.1109/TNS.2008.2003075. S2CID 43581597.
  10. Redus, R. H.; Pantazis, J. A.; Huber, A. C.; Jordanov, V. T.; Butler, J. F.; Apotovsky, B. (2011). "Fano Factor Determination For CZT". MRS Proceedings. 487. doi:10.1557/PROC-487-101.
  11. Dayan, Peter; Abbott, L. F. (2001). Theoretical Neuroscience.
  12. Ball, F.; Milne, R. K. (2005). "Simple derivations of properties of counting processes associated with Markov renewal processes".
  13. Charles, Adam S.; Park; Weller; Horwitz; Pillow (2018). "Dethroning the Fano Factor: A Flexible, Model-Based Approach to Partitioning Neural Variability". Neural Computation. 30 (4): 1012–1045. doi:10.1162/neco_a_01062. PMC   6558056 . PMID   29381442.