Conjectural variation

In oligopoly theory, conjectural variation is the belief that one firm has about the way its competitors may react if it varies its output or price. The firm forms a conjecture about the variation in the other firm's output that will accompany any change in its own output. For example, in the classic Cournot model of oligopoly, it is assumed that each firm treats the output of the other firms as given when it chooses its output. This is sometimes called the "Nash conjecture," as it underlies the standard Nash equilibrium concept. However, alternative assumptions can be made. Suppose you have two firms producing the same good, so that the industry price is determined by the combined output of the two firms (think of the water duopoly in Cournot's original 1838 account). Now suppose that each firm has what is called the "Bertrand conjecture" of −1. This means that if firm A increases its output, it conjectures that firm B will reduce its output to exactly offset firm A's increase, so that total output, and hence price, remains unchanged. With the Bertrand conjecture, the firms act as if they believe that the market price is unaffected by their own output, because each firm believes that the other firm will adjust its output so that total output remains constant. At the other extreme is the joint-profit maximizing conjecture of +1. In this case, each firm believes that the other will imitate exactly any change in output it makes, which leads (with constant marginal cost) to the firms behaving like a single monopoly supplier.
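In symbols (a compact restatement; the notation here is introduced for illustration and is not taken from the original text): if firm $i$ chooses output $q_i$, its conjectural variation is the conjectured rival response

$$\nu_i = \frac{\partial q_j}{\partial q_i}, \qquad \nu_i = 0 \;\text{(Cournot–Nash)}, \quad \nu_i = -1 \;\text{(Bertrand)}, \quad \nu_i = +1 \;\text{(joint-profit maximization)}.$$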

History

The notion of conjectures has a long history in industrial organization theory, ever since the introduction of conjectural variations equilibria by Arthur Bowley in 1924 [1] and Ragnar Frisch (1933) [2] (a useful summary of the history is provided by Giocoli [3]). Not only are conjectural variations (henceforth CV) models able to capture a range of behavioral outcomes – from competitive to cooperative – but they also have one parameter with a simple economic interpretation. CV models have also proved quite useful in the empirical analysis of firm behavior, in the sense that they provide a more general description of firm behavior than the standard Nash equilibrium.

As Stephen Martin has argued:

There is every reason to believe that oligopolists in different markets interact in different ways, and it is useful to have models that can capture a wide range of such interactions. Conjectural oligopoly models, in any event, have been more useful than game-theoretic oligopoly models in guiding the specification of empirical research in industrial economics. [4]

Consistent conjectures

The CVs of firms determine the slopes of their reaction functions. For example, in the standard Cournot model, the conjecture is of a zero reaction, yet the actual slope of the Cournot reaction function is negative. What happens if we require the actual slope of the reaction function to be equal to the conjecture? Some economists, most notably Timothy Bresnahan in 1981, argued that we could pin down the conjectures by a consistency condition. [5] Bresnahan's consistency was a local condition requiring the actual slope of the reaction function to be equal to the conjecture at the equilibrium outputs. With linear industry demand and quadratic costs, this gave rise to the result that the consistent conjecture depends on the slope of the marginal cost function: for example, with quadratic costs of the form (see below) $c(x) = ax^2$, the consistent conjecture is unique and determined by $a$. If $a = 0$ then the unique consistent conjecture is the Bertrand conjecture of −1, and as $a$ gets bigger, the consistent conjecture increases (becomes less negative) but is always less than zero for finite $a$.

The concept of consistent conjectures was criticized by several leading economists. [6] [7] Essentially, consistent conjectures were seen as incompatible with the standard models of rationality employed in game theory.

However, in the 1990s Evolutionary game theory became fashionable in economics. It was realized that this approach could provide a foundation for the evolution of consistent conjectures. Huw Dixon and Ernesto Somma [8] showed that we could treat the conjecture of a firm as a meme (the cultural equivalent of a gene). They showed that in the standard Cournot model, the consistent conjecture was the Evolutionarily stable strategy or ESS. [9] As the authors argued, "Beliefs determine Behavior. Behavior determines payoff. From an evolutionary perspective, those types of behavior that lead to higher payoffs become more common." In the long run, firms with consistent conjectures would tend to earn bigger profits and come to predominate.

Mathematical example 1: Cournot model with CVs

Let there be two firms, X and Y, with outputs x and y. The market price P is given by the linear demand curve

$$P = 1 - x - y,$$

so that the total revenue of firm X is then

$$R_x = xP = x(1 - x - y).$$

For simplicity, let us follow Cournot's 1838 model and assume that there are no production costs, so that profits equal revenue: $\Pi_x = R_x$.

With conjectural variations, the first-order condition for the firm becomes:

$$\frac{\partial \Pi_x}{\partial x} = 1 - 2x - y - x\frac{\partial y}{\partial x} = 0,$$

where $\frac{\partial y}{\partial x} = \phi$ is the firm's conjecture about how the other firm will respond, the conjectural variation or CV term. This first-order condition defines the reaction function for the firm, which gives, for a given CV, the optimal choice of output given the other firm's output.
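Solving the first-order condition for $x$ (a step not written out above, included here for clarity) gives the reaction function explicitly as

$$x = \frac{1 - y}{2 + \phi}.$$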

Note that the Cournot-Nash conjecture is $\phi = 0$, in which case we have the standard Cournot reaction function. The CV term serves to shift the reaction function and, most importantly, to alter its slope. To solve for a symmetric equilibrium, where both firms have the same CV, we simply note that the reaction function will pass through the $x = y$ line, so that:

$$1 - 2x - x - \phi x = 0,$$

so that in symmetric equilibrium $x = y = \frac{1}{3 + \phi}$ and the equilibrium price is $P = \frac{1 + \phi}{3 + \phi}$.

If we have the Cournot-Nash conjecture, $\phi = 0$, then we have the standard Cournot equilibrium with $x = y = \frac{1}{3}$ and $P = \frac{1}{3}$. However, if we have the Bertrand conjecture $\phi = -1$, then we obtain the perfectly competitive outcome with price equal to marginal cost (which is zero here). If we assume the joint-profit maximizing conjecture $\phi = +1$, then both firms produce half of the monopoly output, $x = y = \frac{1}{4}$, and the price is the monopoly price $P = \frac{1}{2}$.

Hence the CV term is a simple behavioral parameter which enables us to represent a whole range of possible market outcomes from the competitive to the monopoly outcome, including the standard Cournot model.
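As a quick numerical illustration of these formulas (a sketch, not part of the original article; the function names are invented here and the setting is the zero-cost, linear-demand example above), the following Python snippet computes the symmetric equilibrium output and price for the three named conjectures:

```python
# Symmetric CV equilibrium for linear demand P = 1 - x - y with zero costs
# (the setting of Example 1). Illustrative sketch only.

def symmetric_equilibrium(phi):
    """Return (output per firm, price) for a common conjecture phi."""
    x = 1.0 / (3.0 + phi)      # from 1 - 2x - x - phi*x = 0 on the x = y line
    price = 1.0 - 2.0 * x      # industry price P = 1 - x - y with y = x
    return x, price

for label, phi in [("Bertrand (-1)", -1.0), ("Cournot-Nash (0)", 0.0), ("Joint profit (+1)", 1.0)]:
    x, p = symmetric_equilibrium(phi)
    print(f"{label}: output per firm = {x:.3f}, price = {p:.3f}")
# Expected prices: 0 (competitive), 1/3 (Cournot), 1/2 (monopoly).
```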

Mathematical example 2: Consistency

Take the previous example. Now let the cost of production take the form $c(x) = ax^2$. In this case, the profit function (revenue minus cost) becomes, for firm X (and analogously for firm Y):

$$\Pi_x = x(1 - x - y) - ax^2.$$

The first-order condition then becomes:

$$\frac{\partial \Pi_x}{\partial x} = 1 - 2x - y - \phi x - 2ax = 0,$$

which defines the reaction function for firm X as:

$$x = \frac{1 - y}{2 + 2a + \phi}.$$

This has slope (in output space)

$$\frac{dx}{dy} = -\frac{1}{2 + 2a + \phi},$$

and analogously for firm Y, which (we assume) has the same conjecture. To see what consistency means, consider the simple Cournot conjecture $\phi = 0$ with constant marginal cost, $a = 0$. In this case the slope of the reaction functions is $-1/2$, which is "inconsistent" with the conjecture. The Bresnahan consistency condition is that the conjectured slope equals the actual slope, which means that

$$\phi = -\frac{1}{2 + 2a + \phi}.$$

This is a quadratic equation, $\phi^2 + (2 + 2a)\phi + 1 = 0$, which gives us the unique consistent conjecture

$$\phi = -(1 + a) + \sqrt{(1 + a)^2 - 1} = -(1 + a) + \sqrt{a^2 + 2a}.$$

This is the positive root of the quadratic: the negative solution would be a conjecture more negative than −1, which would violate the second-order conditions. As we can see from this example, when $a = 0$ (marginal cost is horizontal), the Bertrand conjecture $\phi = -1$ is consistent. As the steepness of marginal cost increases ($a$ goes up), the consistent conjecture increases (becomes less negative). Note that the consistent conjecture is always less than 0 for any finite $a$.
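The consistency condition can also be checked numerically. The short sketch below (illustrative only; the helper names are invented for this example) computes the positive root for several values of $a$ and verifies that the conjectured slope coincides with the actual reaction-function slope derived above:

```python
# Consistent conjecture for quadratic costs a*x^2 in the linear-demand duopoly
# (Example 2). Illustrative sketch of the Bresnahan consistency condition.
import math

def consistent_conjecture(a):
    """Positive root of phi**2 + (2 + 2a)*phi + 1 = 0."""
    return -(1.0 + a) + math.sqrt(a * a + 2.0 * a)

def reaction_slope(phi, a):
    """Actual slope dx/dy of the reaction function x = (1 - y) / (2 + 2a + phi)."""
    return -1.0 / (2.0 + 2.0 * a + phi)

for a in [0.0, 0.5, 1.0, 5.0]:
    phi = consistent_conjecture(a)
    assert abs(phi - reaction_slope(phi, a)) < 1e-9  # conjectured slope == actual slope
    print(f"a = {a}: consistent conjecture = {phi:.4f}")
# a = 0 gives the Bertrand conjecture -1; larger a gives values closer to (but below) 0.
```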

Notes

  1. Bowley, A. L. (1924). The Mathematical Groundwork of Economics, Oxford University Press.
  2. Frisch R. 1951 [1933]. Monopoly – Polypoly – The concept of force in the economy, International Economic Papers, 1, 23–36.
  3. Giocoli N (2005). The escape from conjectural variations: the consistency condition from Bowley to Fellner. Cambridge Journal of Economics, 29, 601–18.
  4. Martin, S. (1993), Advanced Industrial Economics, Blackwells, Oxford. p. 30
  5. Bresnahan T (1981) "Duopoly models with consistent conjectures" American Economic Review, vol 71, pp. 934–945.
  6. Makowski L (1987) "Are rational conjectures rational?", Journal of Industrial Economics, volume 36.
  7. Lindh T (1992) "The inconsistency of consistent conjectures", Journal of Economic Behavior and Organization, volume 18, pp. 69–80.
  8. Dixon H and Somma E (2003) "The evolution of consistent conjectures", Journal of Economic Behavior and Organization, volume 51, pp. 523–536. Original version (1995) University of York Discussion paper, The Evolution of Conjectures.
  9. Dixon and Somma (2003), Proposition 1 p. 528, (1995) p. 13.
