In financial mathematics, a risk measure is used to determine the amount of an asset or set of assets (traditionally currency) to be kept in reserve. The purpose of this reserve is to make the risks taken by financial institutions, such as banks and insurance companies, acceptable to the regulator. In recent years attention has turned towards convex and coherent risk measurement.
A risk measure is defined as a mapping from a set of random variables to the real numbers. This set of random variables represents portfolio returns. The common notation for a risk measure associated with a random variable X is ρ(X). A risk measure ρ should have the following properties: normalized, ρ(0) = 0; translative, ρ(X + a) = ρ(X) − a for any constant a; and monotone, ρ(X2) ≤ ρ(X1) whenever X1 ≤ X2. [1]
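As a minimal sketch (not from the source), the negated expected return ρ(X) = −E[X] is one of the simplest mappings satisfying all three properties, which can be checked numerically on a simulated sample; the sample parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho(x):
    """A minimal risk measure: the negated expected return."""
    return -np.mean(x)

X = rng.normal(0.05, 0.2, 100_000)  # hypothetical portfolio returns

# Normalized: rho(0) = 0
assert rho(np.zeros_like(X)) == 0.0

# Translative: adding a sure amount a reduces risk by exactly a
a = 0.1
assert np.isclose(rho(X + a), rho(X) - a)

# Monotone: a pointwise-larger return is never riskier
assert rho(X + 0.02) <= rho(X)
```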
In a situation with R^d-valued portfolios such that risk can be measured in m ≤ d of the assets, a set of portfolios is the proper way to depict risk. Set-valued risk measures are useful for markets with transaction costs. [2]
A set-valued risk measure is a function R : L^d_p → F_M, where L^d_p is a d-dimensional Lp space, F_M = {D ⊆ M : D = cl(D + K_M)}, and K_M = K ∩ M, where K is a constant solvency cone and M is the set of portfolios of the m reference assets. R must have the following properties: normalized, K_M ⊆ R(0) and R(0) ∩ −int K_M = ∅; translative in M, R(X + u·1) = R(X) − u for every u ∈ M; and monotone, R(X2) ⊇ R(X1) whenever X2 − X1 ∈ L^d_p(K). [3]
Variance (or standard deviation) is not a risk measure in the above sense: it has neither the translation property nor monotonicity. That is, Var(X + a) = Var(X) ≠ Var(X) − a for all a > 0, and a simple counterexample for monotonicity can be found. The standard deviation is a deviation risk measure. To avoid confusion, note that deviation risk measures, such as variance and standard deviation, are sometimes called risk measures in other fields.
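Both failures can be demonstrated numerically; the construction Y = X + |X| below is an illustrative assumption giving a pointwise-larger variable with strictly larger variance.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, 100_000)
a = 5.0

# Translation fails: Var(X + a) = Var(X), not Var(X) - a
assert np.isclose(np.var(X + a), np.var(X))
assert not np.isclose(np.var(X + a), np.var(X) - a)

# Monotonicity fails: Y >= X pointwise, yet Y has larger variance
Y = X + np.abs(X)  # doubles positive outcomes, zeroes out negative ones
assert np.all(Y >= X)
assert np.var(Y) > np.var(X)
```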
There is a one-to-one correspondence between an acceptance set and a corresponding risk measure: writing ρ_A for the risk measure induced by the acceptance set A, and A_ρ for the acceptance set induced by the risk measure ρ, it can be shown that ρ_{A_ρ} = ρ and A_{ρ_A} = A. [5]
There is a one-to-one relationship between a deviation risk measure D and an expectation-bounded risk measure ρ, where for any X, D(X) = ρ(X − E[X]) and ρ(X) = D(X) − E[X].
ρ is called expectation bounded if it satisfies ρ(X) > E[−X] for any nonconstant X and ρ(X) = E[−X] for any constant X. [6]
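As a sketch of this correspondence (sample parameters are illustrative assumptions), take D to be the standard deviation; the induced ρ(X) = D(X) − E[X] is then expectation bounded, and the two relations can be checked directly.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(0.03, 0.15, 100_000)

D = lambda x: np.std(x)            # deviation measure: standard deviation
rho = lambda x: D(x) - np.mean(x)  # induced expectation-bounded risk measure

# The pair of relations: D(X) = rho(X - E[X]) and rho(X) = D(X) - E[X]
assert np.isclose(D(X), rho(X - np.mean(X)))

# Expectation bounded: rho(X) > E[-X] for nonconstant X, equality for constants
assert rho(X) > np.mean(-X)
c = np.full_like(X, 0.07)
assert np.isclose(rho(c), np.mean(-c))
```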
In probability theory and statistics, the multivariate normal distribution, multivariate Gaussian distribution, or joint normal distribution is a generalization of the one-dimensional (univariate) normal distribution to higher dimensions. One definition is that a random vector is said to be k-variate normally distributed if every linear combination of its k components has a univariate normal distribution. Its importance derives mainly from the multivariate central limit theorem. The multivariate normal distribution is often used to describe, at least approximately, any set of (possibly) correlated real-valued random variables, each of which clusters around a mean value.
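The defining property, that every linear combination of the components is univariate normal, can be illustrated by sampling; the mean vector, covariance matrix, and combination weights below are arbitrary assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
Z = rng.multivariate_normal(mu, Sigma, size=200_000)

a = np.array([0.5, -1.5])  # an arbitrary linear combination
Y = Z @ a

# Y is (approximately) univariate normal with mean a @ mu
# and variance a @ Sigma @ a
assert abs(Y.mean() - a @ mu) < 0.02
assert abs(Y.var() - a @ Sigma @ a) < 0.05
```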
In mathematical analysis, Hölder's inequality, named after Otto Hölder, is a fundamental inequality between integrals and an indispensable tool for the study of Lp spaces.
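In its discrete form the inequality bounds the l1 norm of a product by the product of lp and lq norms for conjugate exponents 1/p + 1/q = 1; a quick numerical check (with arbitrary sample vectors) looks like this:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=1000)
y = rng.normal(size=1000)

p, q = 3.0, 1.5  # conjugate exponents: 1/p + 1/q = 1
assert np.isclose(1/p + 1/q, 1.0)

# Hoelder: sum |x_i y_i| <= ||x||_p * ||y||_q
lhs = np.sum(np.abs(x * y))
rhs = np.sum(np.abs(x)**p)**(1/p) * np.sum(np.abs(y)**q)**(1/q)
assert lhs <= rhs
```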
In mathematics, a Lie algebroid is a vector bundle together with a Lie bracket on its space of sections and a vector bundle morphism , satisfying a Leibniz rule. A Lie algebroid can thus be thought of as a "many-object generalisation" of a Lie algebra.
In the field of representation theory in mathematics, a projective representation of a group G on a vector space V over a field F is a group homomorphism from G to the projective linear group PGL(V) = GL(V)/F*, where GL(V) is the general linear group of invertible linear transformations of V and F* is the multiplicative group of nonzero elements of F.
The dyadic transformation (also known as the bit-shift map or doubling map) is the mapping T : [0, 1) → [0, 1) given by T(x) = 2x mod 1.
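One step of the map shifts the binary expansion of x one digit to the left and drops the integer part, which a short sketch makes concrete:

```python
def dyadic(x):
    """One step of the dyadic (bit-shift) map on [0, 1)."""
    return (2 * x) % 1.0

# Iterating the map shifts the binary expansion left one digit per step:
x = 0.8125  # 0.1101 in binary
orbit = []
for _ in range(4):
    orbit.append(x)
    x = dyadic(x)
# orbit == [0.8125, 0.625, 0.25, 0.5]  (0.1101, 0.101, 0.01, 0.1 in binary)
```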
Time consistency in the context of finance is the property of not having mutually contradictory evaluations of risk at different points in time. This property implies that if investment A is considered riskier than B at some future time, then A will also be considered riskier than B at every prior time.
In mathematics, a local system on a topological space X is a tool from algebraic topology which interpolates between cohomology with coefficients in a fixed abelian group A, and general sheaf cohomology in which coefficients vary from point to point. Local coefficient systems were introduced by Norman Steenrod in 1943.
In information theory, information dimension is an information measure for random vectors in Euclidean space, based on the normalized entropy of finely quantized versions of the random vectors. This concept was first introduced by Alfréd Rényi in 1959.
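As a sketch of the definition (sample size and grid resolution are illustrative assumptions), the entropy of a finely quantized continuous variable grows like d · log n at grid resolution 1/n, where d is the information dimension; for a uniform variable on [0, 1] the ratio is close to 1.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.uniform(size=100_000)

def quantized_entropy(x, n):
    """Shannon entropy (nats) of x quantized to a grid of spacing 1/n."""
    _, counts = np.unique(np.floor(n * x), return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

n = 64
ratio = quantized_entropy(X, n) / np.log(n)  # estimates the information dimension
assert abs(ratio - 1.0) < 0.01               # uniform on [0,1] has dimension 1
```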
In the fields of actuarial science and financial economics there are a number of ways that risk can be defined; to clarify the concept theoreticians have described a number of properties that a risk measure might or might not have. A coherent risk measure is a function that satisfies properties of monotonicity, sub-additivity, homogeneity, and translational invariance.
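These axioms can be illustrated with empirical expected shortfall, a standard coherent risk measure; the sketch below uses the loss convention (larger values are worse, so translation adds the constant), with sample distributions chosen as illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def es(loss, alpha=0.05):
    """Empirical expected shortfall: mean of the worst alpha-fraction of losses."""
    k = max(1, int(np.ceil(alpha * len(loss))))
    return np.sort(loss)[-k:].mean()

X = rng.standard_t(df=4, size=50_000)
Y = 0.5 * X + rng.normal(size=50_000)  # a correlated second position

assert es(X + Y) <= es(X) + es(Y) + 1e-12  # sub-additivity (diversification)
assert np.isclose(es(2 * X), 2 * es(X))    # positive homogeneity
assert np.isclose(es(X + 1.0), es(X) + 1.0)  # translation, loss convention
```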
Robust optimization is a field of mathematical optimization theory that deals with optimization problems in which a certain measure of robustness is sought against uncertainty that can be represented as deterministic variability in the value of the parameters of the problem itself and/or its solution. It is related to, but often distinguished from, probabilistic optimization methods such as chance-constrained optimization.
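A toy minimax sketch (the objective and scenario set are illustrative assumptions, solved by grid search rather than a real solver): choose a decision x minimizing the worst case of (x − a)² over a finite uncertainty set for the parameter a.

```python
import numpy as np

A = np.array([-1.0, 0.0, 2.0])   # uncertainty set of parameter scenarios
xs = np.linspace(-3, 4, 70_001)  # candidate decisions

# Worst-case objective over the scenarios, then the minimax decision
worst = np.max((xs[:, None] - A[None, :])**2, axis=1)
x_star = xs[np.argmin(worst)]

# The minimax solution sits at the midpoint of the extreme scenarios, 0.5
assert abs(x_star - 0.5) < 1e-3
```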
Algebraic signal processing (ASP) is an emerging area of theoretical signal processing (SP). In the algebraic theory of signal processing, a set of filters is treated as an (abstract) algebra, a set of signals is treated as a module or vector space, and convolution is treated as an algebra representation. The advantage of algebraic signal processing is its generality and portability.
In mathematics, especially measure theory, a set function is a function whose domain is a family of subsets of some given set and that (usually) takes its values in the extended real number line, which consists of the real numbers together with +∞ and −∞.
The superhedging price is a coherent risk measure. The superhedging price of a portfolio (A) is equivalent to the smallest amount necessary to be paid for an admissible portfolio (B) at the current time so that at some specified future time the value of B is at least as great as A. In a complete market the superhedging price is equivalent to the price for hedging the initial portfolio.
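In a complete one-period binomial market this equivalence is concrete: the superhedging price of a claim is just its replication cost. The parameters below (spot, up/down factors, strike, zero interest) are illustrative assumptions.

```python
# One-period binomial market (complete): the superhedging price of a claim
# equals the cost of the replicating portfolio.
S0, u, d, r = 100.0, 1.2, 0.8, 0.0      # spot, up/down factors, interest rate
payoff = lambda S: max(S - 100.0, 0.0)  # a call struck at 100

Su, Sd = S0 * u, S0 * d
# Replicating portfolio: delta shares plus b in the bank account,
# chosen to match the payoff in both states
delta = (payoff(Su) - payoff(Sd)) / (Su - Sd)
b = (payoff(Su) - delta * Su) / (1 + r)

price = delta * S0 + b  # superhedging price = replication cost ~ 10.0 here
assert abs(price - 10.0) < 1e-9
```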
In financial mathematics, a conditional risk measure is a random variable of the financial risk as if measured at some point in the future. A risk measure can be thought of as a conditional risk measure on the trivial sigma algebra.
In financial mathematics, the entropic risk measure is a risk measure which depends on the risk aversion of the user through the exponential utility function. It is a possible alternative to other risk measures such as value at risk or expected shortfall.
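For risk aversion θ > 0 the entropic risk measure of a return X is ρ(X) = (1/θ) ln E[exp(−θX)]; for Gaussian X this has the closed form −μ + θσ²/2, which the Monte Carlo sketch below (with illustrative parameters) reproduces.

```python
import numpy as np

rng = np.random.default_rng(7)

def entropic_risk(x, theta):
    """Entropic risk measure with risk aversion theta > 0 (return convention)."""
    return np.log(np.mean(np.exp(-theta * x))) / theta

mu, sigma, theta = 0.05, 0.2, 1.0
X = rng.normal(mu, sigma, 1_000_000)

# Gaussian closed form: -mu + theta * sigma**2 / 2
assert abs(entropic_risk(X, theta) - (-mu + theta * sigma**2 / 2)) < 0.002
```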
In financial mathematics, an acceptance set is the set of future net worths that are acceptable to the regulator. It is related to risk measures.
Good–deal bounds are price bounds for a financial portfolio which depend on an individual trader's preferences. Mathematically, if A is a set of portfolios with future outcomes which are "acceptable" to the trader, then the upper good-deal bound is the infimum of the prices of portfolios whose outcome dominates the claim up to an acceptable difference, and the lower bound is defined symmetrically.
In financial mathematics and economics, a distortion risk measure is a type of risk measure which is related to the cumulative distribution function of the return of a financial portfolio.
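A distortion risk measure evaluates a loss via a Choquet integral, reweighting tail probabilities through an increasing function g : [0,1] → [0,1] with g(0) = 0 and g(1) = 1. The sketch below (the exponential loss sample and the square-root distortion are illustrative assumptions) shows that the identity distortion recovers the expected loss, while a concave distortion overweights large losses.

```python
import numpy as np

rng = np.random.default_rng(8)

def distortion_risk(loss, g):
    """Choquet integral of an empirical loss sample under distortion g."""
    x = np.sort(loss)[::-1]             # losses, largest first
    n = len(x)
    levels = np.arange(n + 1) / n
    w = g(levels[1:]) - g(levels[:-1])  # weight on the i-th largest loss
    return np.sum(w * x)

L = rng.exponential(1.0, 10_000)

# g(u) = u recovers the expected loss
assert np.isclose(distortion_risk(L, lambda u: u), L.mean())

# A concave distortion, here g(u) = sqrt(u), overweights large losses
assert distortion_risk(L, np.sqrt) > L.mean()
```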
In quantum information theory, the Wehrl entropy, named after Alfred Wehrl, is a classical entropy of a quantum-mechanical density matrix. It is a type of quasi-entropy defined for the Husimi Q representation of the phase-space quasiprobability distribution. Basic properties of classical, quantum and Wehrl entropies, and their implications in statistical mechanics, are reviewed comprehensively in the literature.
In financial mathematics and stochastic optimization, the concept of risk measure is used to quantify the risk involved in a random outcome or risk position. Many risk measures have been proposed, each with certain characteristics. The entropic value at risk (EVaR) is a coherent risk measure introduced by Ahmadi-Javid, which is an upper bound for the value at risk (VaR) and the conditional value at risk (CVaR), obtained from the Chernoff inequality. The EVaR can also be represented using the concept of relative entropy. Because of its connection with the VaR and the relative entropy, this risk measure is called "entropic value at risk". The EVaR was developed to tackle some computational inefficiencies of the CVaR. Drawing inspiration from the dual representation of the EVaR, Ahmadi-Javid developed a wide class of coherent risk measures, called g-entropic risk measures. Both the CVaR and the EVaR are members of this class.
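The Chernoff-type construction gives EVaR at confidence level 1 − α for a loss X as inf over z > 0 of (ln M_X(z) − ln α)/z, where M_X is the moment-generating function. For a Gaussian loss this infimum has the closed form μ + σ√(−2 ln α), which the grid-search sketch below (with illustrative parameters) matches, and which dominates the Gaussian VaR.

```python
import numpy as np

# EVaR at confidence 1 - alpha for a loss with known log-MGF:
#   EVaR_{1-alpha}(X) = inf_{z > 0} (ln M_X(z) - ln alpha) / z
mu, sigma, alpha = 0.0, 1.0, 0.05
log_mgf = lambda z: mu * z + 0.5 * sigma**2 * z**2  # Gaussian log-MGF

z = np.linspace(1e-3, 10.0, 100_001)
evar = np.min((log_mgf(z) - np.log(alpha)) / z)

# Gaussian closed form: mu + sigma * sqrt(-2 ln alpha)
closed_form = mu + sigma * np.sqrt(-2 * np.log(alpha))
assert abs(evar - closed_form) < 1e-4

# EVaR dominates VaR: at alpha = 5%, Gaussian VaR is mu + 1.645 * sigma
assert evar >= mu + 1.645 * sigma
```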