In econometrics, the information matrix test is used to determine whether a regression model is misspecified. The test was developed by Halbert White, [1] who observed that in a correctly specified model and under standard regularity assumptions, the Fisher information matrix can be expressed in either of two ways: as the outer product of the gradient of the log-likelihood function, or as a function of its Hessian matrix.
Consider a linear model $y = X\beta + u$, where the errors $u$ are assumed to be distributed $N(0, \sigma^2 I)$. If the parameters $\beta$ and $\sigma^2$ are stacked in the vector $\theta^{\mathsf{T}} = \begin{bmatrix} \beta & \sigma^2 \end{bmatrix}$, the resulting log-likelihood function is (up to an additive constant)

$$\ell(\theta) = -\frac{n}{2} \log \sigma^2 - \frac{1}{2\sigma^2} \left( y - X\beta \right)^{\mathsf{T}} \left( y - X\beta \right)$$
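For concreteness, differentiating this log-likelihood yields the score components below; this intermediate step is not spelled out in the original, but it is the standard computation for the Gaussian linear model and makes the "outer product of the gradient" form explicit:

$$\frac{\partial \ell(\theta)}{\partial \beta} = \frac{1}{\sigma^2} X^{\mathsf{T}} (y - X\beta), \qquad \frac{\partial \ell(\theta)}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4} (y - X\beta)^{\mathsf{T}} (y - X\beta)$$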
The information matrix can then be expressed as

$$I(\theta) = \operatorname{E} \left[ \frac{\partial \ell(\theta)}{\partial \theta} \frac{\partial \ell(\theta)}{\partial \theta^{\mathsf{T}}} \right]$$

that is, the expected value of the outer product of the gradient, or score. Second, it can be written as the negative of the Hessian matrix of the log-likelihood function:

$$I(\theta) = -\operatorname{E} \left[ \frac{\partial^2 \ell(\theta)}{\partial \theta \, \partial \theta^{\mathsf{T}}} \right]$$
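In this Gaussian case, both routes lead to the same block-diagonal matrix, which can serve as a sanity check on the two definitions (a standard result, worked out here for illustration):

$$I(\theta) = \begin{bmatrix} \dfrac{1}{\sigma^2} X^{\mathsf{T}} X & 0 \\ 0 & \dfrac{n}{2\sigma^4} \end{bmatrix}$$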
If the model is correctly specified, both expressions should be equal. Combining the equivalent forms yields

$$\Delta(\theta) = \sum_{i=1}^{n} \left[ \frac{\partial^2 \ell_i(\theta)}{\partial \theta \, \partial \theta^{\mathsf{T}}} + \frac{\partial \ell_i(\theta)}{\partial \theta} \frac{\partial \ell_i(\theta)}{\partial \theta^{\mathsf{T}}} \right]$$

where $\ell_i(\theta)$ is the log-likelihood contribution of observation $i$, $\Delta(\theta)$ is an $r \times r$ random matrix, and $r$ is the number of parameters. White showed that the elements of $n^{-1/2} \Delta(\hat{\theta})$, where $\hat{\theta}$ is the MLE, are asymptotically normally distributed with zero means when the model is correctly specified. [2] In small samples, however, the test generally performs poorly. [3]
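As a rough numerical illustration of this construction, the following Python sketch simulates a correctly specified Gaussian linear model, evaluates the per-observation scores and Hessians at the MLE, and accumulates $\Delta(\hat{\theta})$. It is a minimal sketch of the indicator matrix only, not White's full $\chi^2$ statistic, which additionally requires a consistent estimate of the asymptotic covariance of the indicators; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a correctly specified linear model y = X beta + u, u ~ N(0, sigma2 I)
n, k = 500, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, k - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
sigma2_true = 4.0
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2_true), size=n)

# Gaussian MLE: OLS coefficients and sigma2_hat = RSS / n
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta_hat
sigma2_hat = u @ u / n

r = k + 1  # number of parameters: beta (k entries) plus sigma^2

# Accumulate Delta(theta_hat) = sum_i [ H_i + s_i s_i' ] from the
# per-observation score s_i and Hessian H_i of the log-likelihood
Delta = np.zeros((r, r))
for i in range(n):
    xi, ui = X[i], u[i]
    s = np.empty(r)
    s[:k] = ui * xi / sigma2_hat                             # d l_i / d beta
    s[k] = -0.5 / sigma2_hat + ui**2 / (2 * sigma2_hat**2)   # d l_i / d sigma2
    H = np.empty((r, r))
    H[:k, :k] = -np.outer(xi, xi) / sigma2_hat               # d2 l_i / d beta d beta'
    H[:k, k] = H[k, :k] = -ui * xi / sigma2_hat**2           # d2 l_i / d beta d sigma2
    H[k, k] = 0.5 / sigma2_hat**2 - ui**2 / sigma2_hat**3    # d2 l_i / d (sigma2)^2
    Delta += H + np.outer(s, s)

# Under correct specification, the elements of n**-0.5 * Delta(theta_hat)
# are asymptotically normal with mean zero, so they should stay moderate
# in size rather than grow with n
print(Delta / np.sqrt(n))
```

In this sketch, the entries of $n^{-1/2} \Delta(\hat{\theta})$ stay of moderate size as $n$ grows when the model is correct; under misspecification such as heteroskedastic errors, the $\beta\beta$ block drifts away from zero, which is the kind of departure the test is designed to detect.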