Random walk hypothesis

The random walk hypothesis is a financial theory stating that stock market prices evolve according to a random walk (so price changes are random) and thus cannot be predicted.

History

The concept can be traced to French broker Jules Regnault, who published a book in 1863, and then to French mathematician Louis Bachelier, whose Ph.D. dissertation titled "The Theory of Speculation" (1900) included some remarkable insights and commentary. The same ideas were later developed by MIT Sloan School of Management professor Paul Cootner in his 1964 book The Random Character of Stock Market Prices. [1] The term was popularized by the 1973 book A Random Walk Down Wall Street by Burton Malkiel, a professor of economics at Princeton University, [2] and was used earlier in Eugene Fama's 1965 article "Random Walks In Stock Market Prices", [3] which was a less technical version of his Ph.D. thesis. The theory that stock prices move randomly was earlier proposed by Maurice Kendall in his 1953 paper, The Analysis of Economic Time Series, Part 1: Prices. [4] In 1993, in the Journal of Econometrics, K. Victor Chow and Karen C. Denning published a statistical tool (known as the Chow–Denning test) for checking whether a market follows the random walk hypothesis. [5]

Testing the hypothesis

[Figure: Random walk hypothesis test in which a fictitious stock's value is increased or decreased based on the odd/even value of the decimals of pi. The resulting chart resembles a stock chart.]
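The test shown in the figure can be sketched in a few lines. The starting value of fifty and the half-point step are illustrative assumptions borrowed from Malkiel's coin-flip experiment described below; the figure's source does not state them.

```python
# Fictitious stock driven by the parity of pi's decimal digits:
# an odd digit moves the price up, an even digit moves it down.
# Only the first 49 decimals are hardcoded here; the real test
# would use many more.
PI_DECIMALS = "1415926535897932384626433832795028841971693993751"

def pi_stock(start=50.0, step=0.5):
    """Return the price path driven by the parity of pi's decimals."""
    prices = [start]
    for d in PI_DECIMALS:
        move = step if int(d) % 2 == 1 else -step  # odd: up, even: down
        prices.append(prices[-1] + move)
    return prices

path = pi_stock()
```

Although the sequence is completely deterministic, the resulting path wanders up and down much like a genuine price chart, which is the point of the illustration.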

Whether financial data are a random walk is a venerable and challenging question. One of two possible results are obtained, data are random walk or the data are not. To investigate whether observed data follow a random walk, some methods or approaches have been proposed, for example, the variance ratio (VR) tests, [6] the Hurst exponent [7] and surrogate data testing. [8]

Burton G. Malkiel, an economics professor at Princeton University and author of A Random Walk Down Wall Street, performed a test in which his students were given a hypothetical stock that was initially worth fifty dollars. The closing stock price for each day was determined by a coin flip. If the result was heads, the price would close a half point higher, but if the result was tails, it would close a half point lower. Thus, each day the price had a fifty-fifty chance of closing higher or lower than the previous day. Apparent cycles and trends emerged in the resulting charts. Malkiel then took the results in chart and graph form to a chartist, a person who "seeks to predict future movements by seeking to interpret past patterns on the assumption that 'history tends to repeat itself'." [9] The chartist told Malkiel that they needed to immediately buy the stock. Since the coin flips were random, the fictitious stock had no overall trend. Malkiel argued that this indicates that the market and stocks could be just as random as flipping a coin.
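Malkiel's classroom experiment is easy to replicate. The sketch below follows the rules as stated (fifty-dollar start, half-point moves on a fair coin); the number of days and the seed are arbitrary assumptions:

```python
import random

def coin_flip_stock(start=50.0, step=0.5, days=250, seed=0):
    """Simulate Malkiel's experiment: each day's close moves half a
    point up on heads and half a point down on tails."""
    rng = random.Random(seed)
    prices = [start]
    for _ in range(days):
        move = step if rng.random() < 0.5 else -step  # fair coin flip
        prices.append(prices[-1] + move)
    return prices

path = coin_flip_stock()
```

Plotting such a path typically shows runs and apparent "trends", even though each daily move is independent of every other, which is exactly the trap the chartist in Malkiel's anecdote fell into.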

Asset pricing with a random walk

Modelling asset prices with a random walk takes the form:

S_{t+1} = S_t + \mu \,\Delta t + \sigma \,\varepsilon_t \sqrt{\Delta t}

where

\mu is a drift constant,

\sigma is the standard deviation of the returns,

\Delta t is the change in time, and

\varepsilon_t is an i.i.d. random variable satisfying \varepsilon_t \sim N(0, 1).
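A minimal simulation of this discretised price model is sketched below. The parameter values (annual drift, volatility, daily time step) are illustrative assumptions, not values from the text:

```python
import math
import random

def random_walk_prices(s0=100.0, mu=0.05, sigma=0.2,
                       dt=1 / 252, steps=252, seed=1):
    """Simulate S_{t+1} = S_t + mu*dt + sigma*eps*sqrt(dt),
    with eps drawn i.i.d. from a standard normal distribution."""
    rng = random.Random(seed)
    prices = [s0]
    for _ in range(steps):
        eps = rng.gauss(0.0, 1.0)  # i.i.d. N(0, 1) disturbance
        prices.append(prices[-1] + mu * dt + sigma * eps * math.sqrt(dt))
    return prices

path = random_walk_prices()
```

With sigma set to zero the path reduces to the deterministic drift term, which is a quick sanity check on the implementation.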

A non-random walk hypothesis

There are other economists, professors, and investors who believe that the market is predictable to some degree. They believe that prices may move in trends and that the study of past prices can be used to forecast future price direction. Some economic studies support this view, and two professors of economics have written a book that tries to prove the random walk hypothesis wrong. [10]

Martin Weber, a leading researcher in behavioural finance, has performed many tests and studies on finding trends in the stock market. In one of his key studies, he observed the stock market for ten years. Throughout that period, he examined market prices for noticeable trends and found that stocks with high price increases in the first five years tended to become under-performers in the following five years. Weber and other believers in the non-random walk hypothesis cite this as key evidence against the random walk hypothesis. [11]

In another test that contradicts the random walk hypothesis, Weber found that stocks with an upward revision for earnings outperform other stocks in the following six months. With this knowledge, investors can have an edge in deciding which stocks to sell and which stocks, those with the upward revision, to hold. Weber's studies detract from the random walk hypothesis because, according to Weber, there are trends and other cues for predicting the stock market.

Andrew W. Lo and Archie Craig MacKinlay, professors of finance at the MIT Sloan School of Management and the University of Pennsylvania, respectively, have also presented evidence that they believe shows the random walk hypothesis to be wrong. Their book, A Non-Random Walk Down Wall Street, presents a number of tests and studies that reportedly support the view that there are trends in the stock market and that the stock market is somewhat predictable. [12]

One element of their evidence is the simple volatility-based specification test, which has a null hypothesis that states:

X_t = \mu + X_{t-1} + \epsilon_t

where

X_t is the log of the price of the asset at time t,

\mu is a drift constant, and

\epsilon_t is a random disturbance term where E[\epsilon_t] = 0 and E[\epsilon_t \epsilon_{t-\tau}] = 0 for \tau \neq 0 (this implies that \epsilon_t and \epsilon_{t-\tau} are uncorrelated).

To refute the hypothesis, they compare the variance of (X_t - X_{t-\tau}) for different values of \tau with what would be expected for uncorrelated \epsilon_t. [12] Lo and MacKinlay have also authored a paper on the adaptive market hypothesis, which puts forth another way of looking at the predictability of price changes. [13]
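The idea behind such variance comparisons can be sketched as a simple variance ratio statistic in the spirit of Lo and MacKinlay's test: under the random walk null, the variance of q-period log returns should be about q times the variance of one-period returns, so the ratio should be close to 1. This is a bare-bones sketch, not the full test with its heteroskedasticity-robust standard errors:

```python
import random

def variance_ratio(log_prices, q):
    """Variance of q-period log returns divided by q times the
    variance of one-period log returns; near 1 under a random walk."""
    r1 = [b - a for a, b in zip(log_prices, log_prices[1:])]
    rq = [log_prices[i + q] - log_prices[i]
          for i in range(len(log_prices) - q)]

    def var(xs):  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    return var(rq) / (q * var(r1))

# A pure random walk in log prices should give VR(q) near 1.
rng = random.Random(42)
x = [0.0]
for _ in range(20000):
    x.append(x[-1] + rng.gauss(0.0, 1.0))
vr = variance_ratio(x, 5)
```

A strongly mean-reverting series instead drives the ratio well below 1, which is the kind of departure from the null that the test is designed to detect.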

Peter Lynch, a mutual fund manager at Fidelity Investments, has argued that the random walk hypothesis contradicts the efficient market hypothesis, though both concepts are widely taught in business schools without seeming awareness of the contradiction. If asset prices are rational and based on all available data, as the efficient market hypothesis proposes, then fluctuations in asset prices are not random. But if the random walk hypothesis is valid, then asset prices are not rational as the efficient market hypothesis proposes. [14]

References

  1. Cootner, Paul H. (1964). The Random Character of Stock Market Prices. MIT Press. ISBN 978-0-262-03009-0.
  2. Malkiel, Burton G. (1973). A Random Walk Down Wall Street (6th ed.). W. W. Norton & Company, Inc. ISBN 978-0-393-06245-8.
  3. Fama, Eugene F. (September–October 1965). "Random Walks In Stock Market Prices". Financial Analysts Journal. 21 (5): 55–59. doi:10.2469/faj.v21.n5.55. Retrieved 2008-03-21.
  4. Kendall, M. G.; Bradford Hill, A. (1953). "The Analysis of Economic Time-Series-Part I: Prices". Journal of the Royal Statistical Society. A (General). 116 (1): 11–34. doi:10.2307/2980947. JSTOR 2980947.
  5. Chow, K. Victor; Denning, Karen C. (August 1993). "A simple multiple variance ratio test". Journal of Econometrics. 58 (3): 385–401. doi:10.1016/0304-4076(93)90051-6.
  6. Lo, A. W.; MacKinlay, A. C. (1989). "The size and power of the variance ratio test in finite samples: a Monte Carlo investigation". Journal of Econometrics. 40: 203–238. doi:10.1016/0304-4076(89)90083-3.
  7. Feder, Jens (1988). Fractals. Springer. ISBN 9780306428517.
  8. Nakamura, T.; Small, M. (2007). "Tests of the random walk hypothesis for financial data". Physica A. 377 (2): 599–615. Bibcode:2007PhyA..377..599N. doi:10.1016/j.physa.2006.10.073.
  9. Keane, Simon M. (1983). Stock Market Efficiency. Philip Allan Limited. ISBN 978-0-86003-619-7.
  10. Lo, Andrew (1999). A Non-Random Walk Down Wall Street. Princeton University Press. ISBN 978-0-691-05774-3.
  11. Fromlet, Hubert (July 2001). "Behavioral Finance-Theory and Practical Application". Business Economics: 63.
  12. Lo, Andrew W.; MacKinlay, Archie Craig (2002). A Non-Random Walk Down Wall Street (5th ed.). Princeton University Press. pp. 4–47. ISBN 978-0-691-09256-0.
  13. Lo, Andrew W. (2004). "The Adaptive Markets Hypothesis: Market Efficiency from an Evolutionary Perspective". Journal of Portfolio Management.
  14. Lynch, Peter (1989). One Up On Wall Street. New York, NY: Simon & Schuster. ISBN 978-0-671-66103-8.