Time consistency in the context of finance is the property of not having mutually contradictory evaluations of risk at different points in time. This property implies that if investment A is considered riskier than B at some future time, then A will also be considered riskier than B at every prior time.
Finance is a field that is concerned with the allocation (investment) of assets and liabilities over space and time, often under conditions of risk or uncertainty. Finance can also be defined as the art of money management. Participants in the market aim to price assets based on their risk level, fundamental value, and their expected rate of return. Finance can be split into three sub-categories: public finance, corporate finance and personal finance.
Risk is the potential for uncontrolled loss of something of value. Values can be gained or lost when taking risk resulting from a given action or inaction, foreseen or unforeseen. Risk can also be defined as the intentional interaction with uncertainty. Uncertainty is a potential, unpredictable, and uncontrollable outcome; risk is an aspect of action taken in spite of uncertainty.
Time consistency is a property of dynamic risk measures in financial risk. The purpose of the time consistency property is to characterize the risk measures satisfying the condition that if portfolio A is riskier than portfolio B at some time in the future, then it is guaranteed to be riskier at every prior time. This is an important property: if it failed to hold, there would be an event with positive probability such that B is riskier than A at some time t, even though it is certain that A is riskier than B at a later time s > t. As the name suggests, a time inconsistent risk measure can lead to inconsistent behavior in financial risk management.
Financial risk is any of various types of risk associated with financing, including financial transactions that include company loans in risk of default. Often it is understood to include only downside risk, meaning the potential for financial loss and uncertainty about its extent.
In financial mathematics, a conditional risk measure is a random variable of the financial risk as if measured at some point in the future. A risk measure can be thought of as a conditional risk measure on the trivial sigma algebra.
In financial mathematics, a risk measure is used to determine the amount of an asset or set of assets to be kept in reserve. The purpose of this reserve is to make the risks taken by financial institutions, such as banks and insurance companies, acceptable to the regulator. In recent years attention has turned towards convex and coherent risk measurement.
A dynamic risk measure (ρ_t)_{t=0}^{T} on L⁰(𝓕_T) is time consistent if, for every pair of payoffs X and Y and every time t, ρ_{t+1}(X) ≥ ρ_{t+1}(Y) implies ρ_t(X) ≥ ρ_t(Y).
Due to this recursive property it is simple to construct a time consistent risk measure by composing one-period measures backward in time. This would mean that ρ̃_{T−1} := ρ_{T−1} and, for earlier times, ρ̃_t := ρ_t(−ρ̃_{t+1}).
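As a sketch of this backward composition, the following uses a one-period conditional entropic risk measure on a two-period binomial tree; the payoffs, probabilities, and risk aversion parameter theta are illustrative assumptions, not taken from the text. For the entropic measure the composed value coincides with the one-shot value by the tower property, illustrating time consistency:

```python
import math

def one_period_entropic(values, probs, theta=1.0):
    """One-period entropic risk: (1/theta) * log E[exp(-theta * X)]."""
    return (1.0 / theta) * math.log(
        sum(q * math.exp(-theta * v) for q, v in zip(probs, values)))

theta = 1.0
p = 0.5  # up-move probability (assumed)

# terminal payoffs at the four two-step paths: uu, ud, du, dd (assumed values)
X = {('u', 'u'): 4.0, ('u', 'd'): 1.0, ('d', 'u'): 1.0, ('d', 'd'): -2.0}

# step 1: conditional risk at time 1 in each time-1 node
rho1 = {a: one_period_entropic([X[(a, 'u')], X[(a, 'd')]], [p, 1 - p], theta)
        for a in ('u', 'd')}

# step 2: compose backward: rho0 = rho(-rho1)
rho0 = one_period_entropic([-rho1['u'], -rho1['d']], [p, 1 - p], theta)

# for the entropic measure the composition agrees with the one-shot measure
direct = one_period_entropic(
    list(X.values()),
    [p * p, p * (1 - p), (1 - p) * p, (1 - p) * (1 - p)], theta)
assert abs(rho0 - direct) < 1e-9
```

The final assertion holds because the entropic measure is itself time consistent; composing a time inconsistent one-period measure (for example, value at risk) this way is precisely how a time consistent alternative is manufactured.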
Neither dynamic value at risk nor dynamic average value at risk is a time consistent risk measure.
Value at risk (VaR) is a measure of the risk of loss for investments. It estimates how much a set of investments might lose, given normal market conditions, in a set time period such as a day. VaR is typically used by firms and regulators in the financial industry to gauge the amount of assets needed to cover possible losses.
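A minimal sketch of historical VaR from an empirical return series; the P&L numbers below are made up for illustration:

```python
import math

def historical_var(returns, alpha=0.95):
    """Historical value at risk: the loss threshold that is not exceeded
    with probability at least alpha in the empirical distribution."""
    losses = sorted(-r for r in returns)       # losses are negated returns
    k = math.ceil(alpha * len(losses)) - 1     # order-statistic index
    return losses[k]

# toy daily P&L series (illustrative numbers, not real data)
pnl = [0.01, -0.02, 0.005, -0.015, 0.03, -0.04, 0.0, 0.012, -0.01, 0.02]
var_95 = historical_var(pnl, alpha=0.95)       # worst loss in this tiny sample
```

With only ten observations the 95% quantile lands on the single worst loss; real calibrations use far longer windows.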
The time consistent alternative to the dynamic average value at risk with parameter α at time t is defined by

ρ_t(X) = ess sup_{Q ∈ 𝒬} E^Q[−X | 𝓕_t]

such that 𝒬 = {Q : E[dQ/dP | 𝓕_{j+1}] ≤ α E[dQ/dP | 𝓕_j] for all j = t, …, T−1}.
The dynamic superhedging price is a time consistent risk measure.
The dynamic entropic risk measure is a time consistent risk measure if the risk aversion parameter is constant.
In continuous time, a time consistent coherent risk measure can be given by

ρ_t(X) = 𝔼^g[−X | 𝓕_t]

for a sublinear choice of driver function g, where 𝔼^g denotes a g-expectation. If the function g is convex, then the corresponding risk measure is convex.
In the fields of actuarial science and financial economics there are a number of ways that risk can be defined; to clarify the concept theoreticians have described a number of properties that a risk measure might or might not have. A coherent risk measure is a function that satisfies properties of monotonicity, sub-additivity, homogeneity, and translational invariance.
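A classic numerical illustration of why VaR is not coherent: two independent bonds, each defaulting with probability 0.04 and losing 100, individually have zero 95% VaR, yet the combined portfolio does not. The default probabilities and loss sizes below are assumed for illustration:

```python
def var_discrete(outcomes, alpha):
    """VaR of a discrete loss distribution given as (loss, prob) pairs:
    the smallest loss level l with P(loss <= l) >= alpha."""
    cum = 0.0
    for loss, prob in sorted(outcomes):
        cum += prob
        if cum >= alpha:
            return loss
    return max(l for l, _ in outcomes)

# one bond: default prob 0.04, loss 100 (illustrative)
single = [(0.0, 0.96), (100.0, 0.04)]
# two independent such bonds combined
joint = [(0.0, 0.96**2), (100.0, 2 * 0.96 * 0.04), (200.0, 0.04**2)]

var_a = var_discrete(single, 0.95)    # each bond alone: 0.0
var_sum = var_discrete(joint, 0.95)   # portfolio: 100.0
assert var_sum > var_a + var_a        # sub-additivity fails for VaR
```

Diversification here increases measured risk under VaR, which is exactly the behavior that the sub-additivity axiom of coherent risk measures rules out.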
A spectral risk measure is a risk measure given as a weighted average of outcomes where bad outcomes are, typically, included with larger weights. A spectral risk measure is a function of portfolio returns and outputs the amount of the numeraire to be kept in reserve. A spectral risk measure is always a coherent risk measure, but the converse does not always hold. An advantage of spectral measures is the way in which they can be related to risk aversion, and particularly to a utility function, through the weights given to the possible portfolio returns.
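A minimal sketch of a discrete spectral risk measure: losses are sorted worst-first and averaged under nonnegative, nonincreasing weights summing to 1 (the usual admissibility conditions). The returns and weights below are illustrative assumptions:

```python
def spectral_risk(returns, weights):
    """Spectral risk measure: a weighted average of losses, sorted so the
    worst outcome comes first; weights must be nonnegative, nonincreasing,
    and sum to 1."""
    losses = sorted((-r for r in returns), reverse=True)  # worst loss first
    assert all(w >= 0 for w in weights)
    assert all(a >= b for a, b in zip(weights, weights[1:]))  # nonincreasing
    assert abs(sum(weights) - 1.0) < 1e-12
    return sum(w * l for w, l in zip(weights, losses))

# illustrative portfolio returns and risk-aversion weights (assumed)
r = [0.02, -0.05, 0.01, -0.02]
rho = spectral_risk(r, [0.4, 0.3, 0.2, 0.1])
```

The weight profile encodes risk aversion: concentrating more weight on the first (worst) entries corresponds to a more risk-averse utility function.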
The superhedging price is a coherent risk measure. The superhedging price of a portfolio (A) is equivalent to the smallest amount necessary to be paid for an admissible portfolio (B) at the current time so that at some specified future time the value of B is at least as great as A. In a complete market the superhedging price is equivalent to the price for hedging the initial portfolio.
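In a complete one-period binomial market the superhedging price collapses to the exact replication (risk-neutral) price, as the text notes. A sketch under assumed market parameters (S0, u, d, r, and the strike are illustrative):

```python
# One-period binomial market: stock S0, up factor u, down factor d, rate r.
# Completeness means the superhedging price of a claim equals its exact
# replication cost, i.e. the risk-neutral price.
S0, u, d, r = 100.0, 1.2, 0.8, 0.05   # assumed parameters

def superhedge_price(payoff_up, payoff_down):
    q = ((1 + r) - d) / (u - d)       # risk-neutral up probability
    return (q * payoff_up + (1 - q) * payoff_down) / (1 + r)

# call option with strike 100: payoff max(S_T - 100, 0)
price = superhedge_price(max(S0 * u - 100, 0), max(S0 * d - 100, 0))
```

In an incomplete market the superhedging price is instead the supremum of such prices over all equivalent martingale measures, and is typically strictly larger than any single replication cost.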
In financial mathematics, the entropic risk measure is a risk measure which depends on the risk aversion of the user through the exponential utility function. It is a possible alternative to other risk measures such as value at risk or expected shortfall.
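A sketch of the entropic risk measure ρ(X) = (1/θ) log E[exp(−θX)] on a discrete payoff; the coin-flip payoff and the two θ values are illustrative assumptions:

```python
import math

def entropic_risk(outcomes, probs, theta):
    """Entropic risk measure: (1/theta) * log E[exp(-theta * X)],
    where theta > 0 is the risk aversion parameter."""
    return (1.0 / theta) * math.log(
        sum(p * math.exp(-theta * x) for p, x in zip(probs, outcomes)))

# a fair coin flip paying +1 or -1: required reserve grows with risk aversion
low = entropic_risk([1.0, -1.0], [0.5, 0.5], theta=0.5)
high = entropic_risk([1.0, -1.0], [0.5, 0.5], theta=2.0)
assert low < high   # a more risk-averse user requires a larger reserve
```

Note the connection to the earlier time consistency discussion: the dynamic entropic measure is time consistent precisely when θ is held constant across periods.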
In financial mathematics, an acceptance set is the set of future net worths that a regulator deems acceptable. It is related to risk measures.
In financial mathematics, a distortion risk measure is a type of risk measure which is related to the cumulative distribution function of the return of a financial portfolio.
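A sketch of a discrete distortion risk measure: the survival function of the loss is passed through a distortion function g with g(0) = 0 and g(1) = 1, and a concave g overweights the tail. The loss distribution and the choice of g below are illustrative assumptions:

```python
def distortion_risk(losses, probs, g):
    """Distortion risk measure for a discrete loss distribution:
    rho = sum_i l_i * (g(S_{i-1}) - g(S_i)), where S_i = P(L > l_i)
    and g is a distortion function with g(0) = 0, g(1) = 1."""
    surv = 1.0
    rho = 0.0
    for loss, prob in sorted(zip(losses, probs)):
        new_surv = max(surv - prob, 0.0)
        rho += loss * (g(surv) - g(new_surv))
        surv = new_surv
    return rho

losses, probs = [0.0, 50.0, 100.0], [0.7, 0.2, 0.1]  # assumed distribution
# with g(u) = u the measure reduces to the expected loss
mean = distortion_risk(losses, probs, lambda u: u)
# a concave g (here sqrt) overweights the tail, giving a larger value
tail = distortion_risk(losses, probs, lambda u: u ** 0.5)
assert tail > mean
```

Choosing g concave yields a coherent risk measure; the identity distortion recovers plain expectation, which is why the expected loss of 20 here is a lower bound for the distorted value.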
In probability theory, the g-expectation is a nonlinear expectation based on a backward stochastic differential equation (BSDE) originally developed by Shige Peng.
In financial mathematics and stochastic optimization, the concept of risk measure is used to quantify the risk involved in a random outcome or risk position. Many risk measures have hitherto been proposed, each having certain characteristics. The entropic value-at-risk (EVaR) is a coherent risk measure introduced by Ahmadi-Javid, which is an upper bound for the value at risk (VaR) and the conditional value-at-risk (CVaR), obtained from the Chernoff inequality. The EVaR can also be represented by using the concept of relative entropy. Because of its connection with the VaR and the relative entropy, this risk measure is called "entropic value-at-risk". The EVaR was developed to tackle some computational inefficiencies of the CVaR. Getting inspiration from the dual representation of the EVaR, Ahmadi-Javid developed a wide class of coherent risk measures, called g-entropic risk measures. Both the CVaR and the EVaR are members of this class.
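The Chernoff-bound definition of EVaR can be checked numerically against the closed form for a normal variable, EVaR = μ + σ√(−2 ln α); the grid search below is a rough illustrative sketch, not a production optimizer:

```python
import math

def evar_normal(mu, sigma, alpha):
    """Closed-form EVaR for X ~ N(mu, sigma^2):
    EVaR at confidence 1 - alpha equals mu + sigma * sqrt(-2 ln alpha)."""
    return mu + sigma * math.sqrt(-2.0 * math.log(alpha))

def evar_numeric(mu, sigma, alpha):
    """EVaR via its Chernoff-bound definition:
    inf over z > 0 of (1/z) * (log M_X(z) - log alpha),
    using the normal moment generating function
    M_X(z) = exp(mu*z + sigma^2 * z^2 / 2), minimized on a crude grid."""
    best = float('inf')
    for i in range(1, 20000):
        z = i / 1000.0
        log_mgf = mu * z + 0.5 * sigma * sigma * z * z
        best = min(best, (log_mgf - math.log(alpha)) / z)
    return best

a = evar_normal(0.0, 1.0, alpha=0.05)
b = evar_numeric(0.0, 1.0, alpha=0.05)
assert abs(a - b) < 1e-3
```

Because EVaR comes from the Chernoff inequality, it upper-bounds both the VaR and the CVaR at the same confidence level, which is its main appeal as a conservative yet coherent measure.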