In the fields of actuarial science and financial economics there are a number of ways that risk can be defined; to clarify the concept, theoreticians have described a number of properties that a risk measure might or might not have. A coherent risk measure is a function $\varrho$ that satisfies the properties of monotonicity, sub-additivity, positive homogeneity, and translation invariance.
Consider a random outcome $X$ viewed as an element of a linear space $\mathcal{L}$ of measurable functions, defined on an appropriate probability space. A functional $\varrho : \mathcal{L} \to \mathbb{R} \cup \{+\infty\}$ is said to be a coherent risk measure for $\mathcal{L}$ if it satisfies the following properties: [1]
Normalized: $\varrho(0) = 0$. That is, the risk when holding no assets is zero.
Monotonicity: If $Z_1, Z_2 \in \mathcal{L}$ and $Z_1 \leq Z_2$ almost surely, then $\varrho(Z_1) \geq \varrho(Z_2)$. That is, if portfolio $Z_2$ always has better values than portfolio $Z_1$ under almost all scenarios, then the risk of $Z_2$ should be less than the risk of $Z_1$. [2] E.g., if $Z_1$ is an in-the-money call option (or otherwise) on a stock, and $Z_2$ is also an in-the-money call option on the same stock with a lower strike price, then $Z_1 \leq Z_2$ and hence $\varrho(Z_1) \geq \varrho(Z_2)$. In financial risk management, monotonicity implies that a portfolio with greater future returns has less risk.
Sub-additivity: If $Z_1, Z_2 \in \mathcal{L}$, then $\varrho(Z_1 + Z_2) \leq \varrho(Z_1) + \varrho(Z_2)$. Indeed, the risk of two portfolios together cannot get any worse than adding the two risks separately: this is the diversification principle. In financial risk management, sub-additivity implies that diversification is beneficial. The sub-additivity principle is sometimes also seen as problematic. [3] [4]
Positive homogeneity: If $\alpha \geq 0$ and $Z \in \mathcal{L}$, then $\varrho(\alpha Z) = \alpha \varrho(Z)$. Loosely speaking, if you double your portfolio then you double your risk. In financial risk management, positive homogeneity implies the risk of a position is proportional to its size.
Translation invariance: If $A$ is a deterministic portfolio with guaranteed return $a$ and $Z \in \mathcal{L}$, then $\varrho(Z + A) = \varrho(Z) - a$. The portfolio $A$ just adds cash $a$ to the portfolio $Z$. In particular, if $a = \varrho(Z)$ then $\varrho(Z + A) = 0$. In financial risk management, translation invariance implies that the addition of a sure amount of capital reduces the risk by the same amount.
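The following Python sketch checks these axioms numerically for expected shortfall, a coherent measure discussed below; the 5% level, the equally weighted scenarios, and the normal returns are illustrative assumptions, not part of the definition.

```python
import numpy as np

def expected_shortfall(outcomes, alpha=0.05):
    """Risk of a position: the negated average of the worst
    alpha-fraction of equally likely future portfolio values."""
    worst_k = max(1, int(np.ceil(alpha * len(outcomes))))
    return -np.sort(outcomes)[:worst_k].mean()

rng = np.random.default_rng(0)
Z1 = rng.normal(0.05, 0.20, 100_000)   # scenario values of portfolio 1
Z2 = rng.normal(0.02, 0.10, 100_000)   # scenario values of portfolio 2

# Sub-additivity: rho(Z1 + Z2) <= rho(Z1) + rho(Z2)
assert expected_shortfall(Z1 + Z2) <= expected_shortfall(Z1) + expected_shortfall(Z2)

# Positive homogeneity: rho(2 Z1) = 2 rho(Z1)
assert np.isclose(expected_shortfall(2 * Z1), 2 * expected_shortfall(Z1))

# Translation invariance: adding sure cash a reduces the risk by exactly a
a = 0.1
assert np.isclose(expected_shortfall(Z1 + a), expected_shortfall(Z1) - a)
```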
The notion of coherence has been subsequently relaxed. Indeed, the notions of sub-additivity and positive homogeneity can be replaced by the notion of convexity: [5] Convexity: If $Z_1, Z_2 \in \mathcal{L}$ and $\lambda \in [0,1]$, then $\varrho(\lambda Z_1 + (1-\lambda) Z_2) \leq \lambda \varrho(Z_1) + (1-\lambda) \varrho(Z_2)$.
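To see that convexity is indeed the weaker requirement, note that sub-additivity and positive homogeneity together imply it:

```latex
\varrho(\lambda Z_1 + (1-\lambda) Z_2)
  \;\le\; \varrho(\lambda Z_1) + \varrho((1-\lambda) Z_2)      % sub-additivity
  \;=\;   \lambda\,\varrho(Z_1) + (1-\lambda)\,\varrho(Z_2),   % positive homogeneity
\qquad \lambda \in [0,1].
```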
It is well known that value at risk is not a coherent risk measure, as it does not respect the sub-additivity property. An immediate consequence is that value at risk might discourage diversification. [1] Value at risk is, however, coherent under the assumption of elliptically distributed losses (e.g. normally distributed) when the portfolio value is a linear function of the asset prices. In this case, however, value at risk becomes equivalent to a mean-variance approach where the risk of a portfolio is measured by the variance of the portfolio's return.
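A short calculation makes the Gaussian case concrete (a standard result, stated here under the sign convention that larger $X$ means a better outcome): if $X \sim N(\mu, \sigma^2)$, then at confidence level $\alpha \geq 1/2$

```latex
\operatorname{VaR}_{\alpha}(X) = -\mu + \sigma\,\Phi^{-1}(\alpha),
\qquad
\sigma_{X_1 + X_2} = \sqrt{\sigma_1^2 + 2\rho\,\sigma_1\sigma_2 + \sigma_2^2}
\;\le\; \sigma_1 + \sigma_2,
```

so for jointly normal $X_1, X_2$ the standard deviation is sub-additive and hence so is $\operatorname{VaR}_\alpha$.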
The Wang transform function (distortion function) for the value at risk is $g(x) = \mathbf{1}_{x \geq 1-\alpha}$. The non-concavity of $g$ proves the non-coherence of this risk measure.
As a simple example demonstrating the non-coherence of value at risk, consider the VaR of a portfolio at 95% confidence over the next year, for a portfolio of two defaultable zero-coupon bonds maturing in one year's time, denominated in our numeraire currency.
Assume the following:
- each bond has a 4% probability of defaulting over the next year;
- the default events of the two bonds are independent of one another;
- upon default a bond loses 70% of its value;
- a bond that does not default repays in full.
Under these conditions the 95% VaR for holding either of the bonds alone is 0, since the probability of default (4%) is less than 5%. However, if we hold a portfolio consisting of 50% of each bond by value, then the 95% VaR is 35% (= 0.5 × 0.7 + 0.5 × 0), since the probability of at least one of the bonds defaulting is 7.84% (= 1 − 0.96 × 0.96), which exceeds 5%. This violates the sub-additivity property, showing that VaR is not a coherent risk measure.
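A minimal Python enumeration of the default scenarios reproduces these numbers; the 4% default probability and 70% loss given default are the assumptions listed above.

```python
P_DEFAULT = 0.04  # one-year default probability of each bond (assumed above)
LOSS = 0.70       # fractional loss on a defaulted bond (assumed above)

def var(dist, alpha=0.05):
    """VaR at the 1 - alpha confidence level of a discrete loss
    distribution {loss: probability}: the smallest level L such that
    P(loss > L) <= alpha."""
    for level in sorted(dist):
        if sum(p for loss, p in dist.items() if loss > level) <= alpha:
            return level

one_bond = {LOSS: P_DEFAULT, 0.0: 1 - P_DEFAULT}
portfolio = {                                    # 50% of each bond, independent defaults
    LOSS:     P_DEFAULT ** 2,                    # both default (0.16%)
    LOSS / 2: 2 * P_DEFAULT * (1 - P_DEFAULT),   # exactly one defaults (7.68%)
    0.0:      (1 - P_DEFAULT) ** 2,              # neither defaults (92.16%)
}

print(var(one_bond))   # 0.0  -> each bond alone looks riskless at 95%
print(var(portfolio))  # 0.35 -> exceeds 0.5*0 + 0.5*0, so VaR is not sub-additive
```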
The average value at risk (sometimes called expected shortfall or conditional value at risk, CVaR) is a coherent risk measure, even though it is derived from value at risk, which is not. The domain can be extended from the more typical $L^p$ spaces to the more general Orlicz hearts. [6]
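Written out (under the convention that $X$ denotes the portfolio value), the average value at risk at level $\alpha$ averages VaR over the tail, which is what restores sub-additivity:

```latex
\operatorname{AVaR}_{\alpha}(X) \;=\; \frac{1}{\alpha}\int_0^{\alpha} \operatorname{VaR}_{\gamma}(X)\,d\gamma .
```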
The entropic value at risk is a coherent risk measure. [7]
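For reference (sketched here under the convention that $X$ denotes a loss with moment-generating function $M_X(z) = \mathbb{E}[e^{zX}]$), the entropic value at risk is obtained from the Chernoff bound as

```latex
\operatorname{EVaR}_{1-\alpha}(X) \;=\; \inf_{z > 0}\left\{ \frac{1}{z}\,\ln\!\left(\frac{M_X(z)}{\alpha}\right) \right\},
```

and it upper-bounds both VaR and CVaR at the same confidence level.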
The tail value at risk (or tail conditional expectation) is a coherent risk measure only when the underlying distribution is continuous.
The Wang transform function (distortion function) for the tail value at risk is $g(x) = \min\left(\frac{x}{1-\alpha}, 1\right)$. The concavity of $g$ proves the coherence of this risk measure in the case of a continuous distribution.
The PH risk measure (or proportional hazard risk measure) transforms the hazard rates $\left(\lambda(t) = \frac{f(t)}{\bar{F}(t)}\right)$ using a coefficient $\xi$. The Wang transform function (distortion function) for the PH risk measure is $g(x) = x^{\xi}$. The concavity of $g$ when $\xi < 1$ proves the coherence of this risk measure.
g-entropic risk measures are a class of information-theoretic coherent risk measures that include some important special cases such as CVaR and EVaR. [7]
The Wang risk measure is defined by the following Wang transform function (distortion function): $g_{\alpha}(x) = \Phi\left(\Phi^{-1}(x) + \Phi^{-1}(\alpha)\right)$. The coherence of this risk measure is a consequence of the concavity of $g$.
The entropic risk measure is a convex risk measure which is not coherent. It is related to the exponential utility.
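Concretely (with the risk-aversion parameter $\theta > 0$ of the exponential utility $u(x) = -e^{-\theta x}$), the entropic risk measure is

```latex
\varrho^{\mathrm{ent}}(X) \;=\; \frac{1}{\theta}\,\ln \mathbb{E}\!\left[e^{-\theta X}\right],
```

which satisfies convexity and translation invariance but not positive homogeneity: scaling $X$ by $\lambda > 0$ effectively changes $\theta$ to $\lambda\theta$, so $\varrho^{\mathrm{ent}}(\lambda X) \neq \lambda\,\varrho^{\mathrm{ent}}(X)$ in general.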
The superhedging price is a coherent risk measure: it is the smallest amount that must be paid for an admissible portfolio at the current time so that at a specified future time the value of that portfolio is at least as great as that of the original portfolio.
In a situation with $\mathbb{R}^d$-valued portfolios such that risk can be measured in $n \leq d$ of the assets, a set of portfolios is the proper way to depict risk. Set-valued risk measures are useful for markets with transaction costs. [8]
A set-valued coherent risk measure is a function $R: L_d^p \rightarrow \mathbb{F}_M$, where $\mathbb{F}_M = \left\{ D \subseteq M : D = \operatorname{cl}(D + K_M) \right\}$, $K_M$ is a constant solvency cone, and $M$ is the set of portfolios of the reference assets. $R$ must have the following properties: [9]
- Normalized: $K_M \subseteq R(0)$ and $R(0) \cap -\operatorname{int} K_M = \emptyset$;
- Translative in $M$: for every $X \in L_d^p$ and $u \in M$, $R(X + u1) = R(X) - u$;
- Monotone: if $X_2 - X_1 \in L_d^p(K)$ then $R(X_2) \supseteq R(X_1)$;
- Sublinear: $R(\lambda X) = \lambda R(X)$ for $\lambda > 0$, and $R(X_1 + X_2) \supseteq R(X_1) + R(X_2)$.
A Wang transform of the cumulative distribution function is an increasing function $g: [0,1] \rightarrow [0,1]$ where $g(0) = 0$ and $g(1) = 1$. [10] This function is called a distortion function or Wang transform function.
The dual distortion function is $\tilde{g}(x) = 1 - g(1-x)$. [11] [12] Given a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, then for any random variable $X$ and any distortion function $g$ we can define a new probability measure $\mathbb{Q}$ such that for any $A \in \mathcal{F}$ it follows that $\mathbb{Q}(A) = g\left(\mathbb{P}(X \in A)\right)$. [11]
For any increasing concave Wang transform function $g$, we can define a corresponding premium principle: [10]

$\varrho(X) = \int_0^{+\infty} g\left(\bar{F}_X(x)\right) dx$
A coherent risk measure can be defined by a Wang transform of the cumulative distribution function $g$ if and only if $g$ is concave. [10]
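A numerical sketch of this premium principle, assuming a nonnegative loss variable, an empirical (sampled) distribution, and SciPy for the normal quantile function; the lognormal sample, the grid size, and the parameter choices are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

def distortion_premium(losses, g, grid_points=2_000):
    """Approximate rho(X) = integral_0^inf g(P(X > x)) dx for a
    nonnegative loss sample `losses` and a distortion function g."""
    losses = np.asarray(losses)
    xs = np.linspace(0.0, losses.max(), grid_points)
    survival = np.array([(losses > x).mean() for x in xs])
    return float(np.sum(g(survival)) * (xs[1] - xs[0]))

alpha = 0.95
g_var  = lambda s: (s >= 1 - alpha).astype(float)           # VaR: not concave
g_tvar = lambda s: np.minimum(s / (1 - alpha), 1.0)         # TVaR: concave
g_ph   = lambda s: s ** 0.5                                 # PH with xi = 0.5 < 1: concave
g_wang = lambda s: norm.cdf(norm.ppf(s) + norm.ppf(alpha))  # Wang transform: concave

losses = np.random.default_rng(1).lognormal(0.0, 1.0, 20_000)
for name, g in [("VaR", g_var), ("TVaR", g_tvar), ("PH", g_ph), ("Wang", g_wang)]:
    # only the concave distortions (the last three) yield coherent measures
    print(name, round(distortion_premium(losses, g), 3))
```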
If, instead of the sublinear property, $R$ is convex, then $R$ is a set-valued convex risk measure.
A lower semi-continuous convex risk measure $\varrho$ can be represented as

$\varrho(X) = \sup_{Q \in \mathcal{M}(P)} \left\{ E^Q[-X] - \alpha(Q) \right\}$

such that $\alpha$ is a penalty function and $\mathcal{M}(P)$ is the set of probability measures absolutely continuous with respect to $P$ (the "real world" probability measure), i.e. $\mathcal{M}(P) = \{Q \ll P\}$. The dual characterization is tied to $L^p$ spaces, Orlicz hearts, and their dual spaces. [6]
A lower semi-continuous risk measure $\varrho$ is coherent if and only if it can be represented as

$\varrho(X) = \sup_{Q \in \mathcal{Q}} E^Q[-X]$

such that $\mathcal{Q} = \{Q \ll P : \alpha(Q) = 0\}$. [13]
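As an illustration of this dual form (a standard example, stated under the convention that $X$ is the portfolio value), the expected shortfall at level $\alpha$ arises from the set of measures whose density is capped at $1/\alpha$:

```latex
\operatorname{ES}_{\alpha}(X) \;=\; \sup\left\{ E^Q[-X] \;:\; Q \ll P,\ \frac{dQ}{dP} \le \frac{1}{\alpha} \right\}.
```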