Information cascade

An information cascade or informational cascade is a phenomenon described in behavioral economics and network theory in which a number of people make the same decision in a sequential fashion. It is similar to, but distinct from, herd behavior. [1] [2] [3]

An information cascade is generally accepted as a two-step process. First, an individual must encounter a scenario with a decision, typically a binary one. Second, outside factors can influence this decision, such as the individual observing others' choices and the apparent outcomes of those choices.

The two-step process of an informational cascade can be broken down into five basic components:

  1. There is a decision to be made – for example, whether to adopt a new technology, wear a new style of clothing, eat in a new restaurant, or support a particular political position
  2. A limited action space exists (e.g. an adopt/reject decision)
  3. People make the decision sequentially, and each person can observe the choices made by those who acted earlier
  4. Each person has some private information of their own that helps guide their decision
  5. A person can't directly observe the private information that other people hold, but can make inferences about it from what those people do

Social perspectives of cascades, which suggest that agents may act irrationally (e.g., against what they think is optimal) when social pressures are great, exist as complements to the concept of information cascades. [4] More often the problem is that the concept of an information cascade is confused with ideas that do not match the two key conditions of the process, such as social proof, information diffusion, [5] and social influence. Indeed, the term information cascade has even been used to refer to such processes. [6]

Basic model

This section provides some basic examples of information cascades, as originally described by Bikhchandani et al. (1992). [7] The basic model has since been developed in a variety of directions to examine its robustness and better understand its implications. [8] [9]

Qualitative example

Information cascades occur when external information obtained from previous participants in an event overrides one's own private signal, irrespective of the correctness of the former over the latter. The experiment conducted by Anderson and Holt [10] is a useful example of this process. The experiment consisted of two urns labeled A and B. Urn A contains two balls labeled "a" and one labeled "b". Urn B contains one ball labeled "a" and two labeled "b". The urn from which a ball must be drawn during each run is determined randomly and with equal probabilities (from the throw of a die). The contents of the chosen urn are emptied into a neutral container. The participants are then asked in random order to draw a ball from this container. This entire process may be termed a "run", and a number of such runs are performed.

Each time a participant picks up a ball, he is to decide which urn it belongs to. His decision is then announced for the benefit of the remaining participants in the room. Thus, the (n+1)th participant has information about the decisions made by all the n participants preceding him, as well as his private signal, which is the label on the ball that he draws during his turn. The experimenters observed an information cascade in 41 of 56 such runs. This means that, in the runs where a cascade occurred, at least one participant gave precedence to earlier decisions over his own private signal. It is possible for such an occurrence to produce the wrong result; this phenomenon is known as a "reverse cascade".
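
The following is a minimal Python sketch of such a run, under stated assumptions: the signal accuracy of 2/3 follows from the two-to-one urn composition, but the number of participants, the rule of counting inferred signals, and the tie-breaking rule of following one's own signal are illustrative simplifications (the laboratory experiment broke ties with a coin flip, and real participants may deviate from Bayesian play).

```python
import random

def run_cascade_experiment(n_participants=20, q=2/3, seed=None):
    """Simulate one run of an Anderson-and-Holt-style urn experiment.

    Each participant privately draws a signal that matches the true urn
    with probability q, observes all earlier public guesses, and announces
    the urn they believe is more likely. Ties are broken by following
    one's own signal (an assumption; the lab experiment used a coin flip).
    """
    rng = random.Random(seed)
    true_urn = rng.choice(["A", "B"])
    other = {"A": "B", "B": "A"}
    net_public = 0              # inferred "A" signals minus inferred "B" signals
    guesses = []
    for _ in range(n_participants):
        signal = true_urn if rng.random() < q else other[true_urn]
        own = 1 if signal == "A" else -1
        total = net_public + own
        guess = "A" if total > 0 else "B" if total < 0 else signal
        guesses.append(guess)
        # A guess reveals the private signal only while no cascade has started;
        # once the public evidence differs by two, later guesses are uninformative.
        if abs(net_public) < 2:
            net_public += 1 if guess == "A" else -1
    cascade = abs(net_public) >= 2
    majority_guess = "A" if net_public > 0 else "B"
    return {"true_urn": true_urn, "guesses": guesses, "cascade": cascade,
            "reverse_cascade": cascade and majority_guess != true_urn}
```

Running this sketch with many different seeds qualitatively reproduces the pattern reported above: most runs end in a cascade, and a minority of those cascades lock onto the wrong urn (a reverse cascade).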

Quantitative description

A person's signal telling them to accept is denoted as H (a high signal, where high signifies he should accept), and a signal telling them not to accept is L (a low signal). The model assumes that when the correct decision is to accept, individuals will be more likely to see an H, and conversely, when the correct decision is to reject, individuals are more likely to see an L signal. This is essentially a conditional probability – the probability of H when the correct action is to accept, or P(H|A). Similarly, P(L|R) is the probability that an agent gets an L signal when the correct action is to reject. If these likelihoods are represented by q, then q > 0.5. This is summarized in the table below. [11]

Agent signal | True state: Reject | True state: Accept
L            | q                  | 1-q
H            | 1-q                | q

The first agent determines whether or not to accept solely based on his own signal. As the model assumes that all agents act rationally, the action (accept or reject) the agent feels is more likely is the action he will choose to take. This decision can be explained using Bayes' rule:

P(A|H) = P(A)P(H|A) / [P(A)P(H|A) + P(R)P(H|R)] = pq / (pq + (1-p)(1-q))

If the agent receives an H signal, the likelihood of accepting is obtained by calculating P(A|H) as above. Because q > 0.5, the first agent, acting only on his private signal, will always increase his estimate of p with an H signal. Similarly, it can be shown that an agent will always decrease his expectation of p when he receives a low signal. Recalling that, if the value, V, of accepting is equal to the value of rejecting, an agent will accept if he believes p > 0.5 and reject otherwise. Because this agent started out with the assumption that both accepting and rejecting are equally viable options (p = 0.5), the observation of an H signal allows him to conclude that accepting is the rational choice.
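
For illustration, suppose the signal accuracy is q = 0.6 (a value chosen only for this example). Starting from the uniform prior p = 0.5, Bayes' rule gives

P(A|H) = (0.5 × 0.6) / (0.5 × 0.6 + 0.5 × 0.4) = 0.3 / 0.5 = 0.6,

which is greater than 0.5, so a first agent who sees an H signal revises his belief upward and accepts.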

The second agent then considers both the first agent's decision and his own signal, again in a rational fashion. In general, the nth agent considers the decisions of the previous n-1 agents, and his own signal. He makes a decision based on Bayesian reasoning to determine the most rational choice.

P(accept is correct | a, b) = q^a (1-q)^b / [q^a (1-q)^b + (1-q)^a q^b]

where a is the number of accepts in the previous set plus the agent's own signal, and b is the number of rejects; thus, a + b = n. The decision is based on how the value on the right-hand side of the equation compares with p = 0.5 (the uniform prior): the agent accepts if it is greater and rejects if it is smaller. [11]
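
As an illustration of this rule, here is a minimal Python sketch; the function name, the explicit handling of the indifferent case, and the example numbers are assumptions made for the example, not part of the original model description.

```python
from math import isclose

def nth_agent_decision(a, b, q):
    """Decision of the n-th agent in the basic model, given a inferred accepts
    (including the agent's own H signal, if any), b inferred rejects, signal
    accuracy q > 0.5, and a uniform prior p = 0.5."""
    accept_weight = q ** a * (1 - q) ** b
    reject_weight = (1 - q) ** a * q ** b
    posterior = accept_weight / (accept_weight + reject_weight)
    if isclose(posterior, 0.5):
        return "indifferent"    # the model breaks ties, e.g. by a coin flip
    return "accept" if posterior > 0.5 else "reject"

# With q = 0.6: two inferred accepts from earlier agents outweigh the
# agent's own low signal (a = 2, b = 1), so the agent joins the cascade.
print(nth_agent_decision(a=2, b=1, q=0.6))   # -> "accept"
```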

Explicit model assumptions

The original model makes several assumptions about human behavior and the world in which humans act, [7] some of which are relaxed in later versions [11] or in alternate definitions of similar problems, such as the diffusion of innovations.

  1. Boundedly Rational Agents: The original information cascade model assumes humans are boundedly rational [12] – that is, they will always make rational decisions based on the information they can observe, but the information they observe may not be complete or correct. In other words, agents do not have complete knowledge of the world around them (which would allow them to make the correct decision in any and all situations). In this way, there is a point at which, even if a person has correct knowledge of the idea or action cascading, they can be convinced via social pressures to adopt some alternate, incorrect view of the world.
  2. Incomplete Knowledge of Others: The original information cascade model assumes that agents have incomplete knowledge of the agents which precede them in the specified order. As opposed to definitions where agents have some knowledge of the "private information" held by previous agents, the current agent makes a decision based only on the observable action (whether or not to imitate) of those preceding him. It is important to note that the original creators argue this is a reason why information cascades can be caused by small shocks.
  3. Behavior of all previous agents is known

Resulting conditions

  1. Cascades will always occur – as discussed, in the simple model, the likelihood of a cascade occurring increases towards 1 as the number of people making decisions increases towards infinity.
  2. Cascades can be incorrect – because agents make decisions with both bounded rationality and probabilistic knowledge of the initial truth (e.g. whether accepting or rejecting is the correct decision), the incorrect behavior may cascade through the system.
  3. Cascades can be based on little information – mathematically, a cascade of an infinite length can occur based only on the decision of two people. More generally, a small set of people who strongly promote an idea as being rational can rapidly influence a much larger subset of the general population
  4. Cascades are fragile – because agents receive no extra information once the difference between a and b reaches 2, and because such differences can arise after only a small number of agents, agents who consider the opinions of agents making decisions based on actual information can be dissuaded from a choice rather easily. [7] This suggests that cascades are susceptible to the release of public information. Bikhchandani et al. [7] also discuss this result in the context of the underlying value p changing over time, in which case a cascade can rapidly change course.
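
Reusing the run_cascade_experiment sketch from the qualitative example above (and its assumptions), a short Monte Carlo check of the first three conditions might look like the following; fragility could be probed in the same way by injecting a public signal mid-run.

```python
def summarize(n_runs=10_000, n_participants=40, q=2/3):
    """Monte Carlo check: cascades are near-universal but sometimes wrong."""
    results = [run_cascade_experiment(n_participants, q, seed=i)
               for i in range(n_runs)]
    cascades = sum(r["cascade"] for r in results)
    wrong = sum(r["reverse_cascade"] for r in results)
    print(f"runs ending in a cascade:         {cascades / n_runs:.1%}")
    print(f"cascades locked on the wrong urn: {wrong / n_runs:.1%}")

summarize()
```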

Responding

A literature exists that examines how individuals or firms might respond to the existence of informational cascades when they have products to sell but buyers are unsure of the quality of those products. Curtis Taylor (1999) [13] shows that when selling a house the seller might wish to start with high prices, as failure to sell with low prices is indicative of low quality and might start a cascade of not buying, while failure to sell with high prices could be construed as meaning the house is just over-priced, and prices can then be reduced to get a sale. Daniel Sgroi (2002) [14] shows that firms might use "guinea pigs" who are given the opportunity to buy early to kick-start an informational cascade through their early and public purchasing decisions, and work by David Gill and Daniel Sgroi (2008) [15] shows that early public tests might have a similar effect (and in particular that passing a "tough test" which is biased against the seller can instigate a cascade all by itself). Bose et al. [16] have examined how prices set by a monopolist might evolve in the presence of potential cascade behavior where the monopolist and consumers are unsure of a product's quality.

Examples and fields of application

Information cascades occur in situations where seeing many people make the same choice provides evidence that outweighs one's own judgment. That is, one thinks: "It's more likely that I'm wrong than that all those other people are wrong. Therefore, I will do as they do."

In what has been termed a reputational cascade, late responders sometimes go along with the decisions of early responders, not just because the late responders think the early responders are right, but also because they perceive their reputation will be damaged if they dissent from the early responders. [17]

Market cascades

Information cascades have become one of the topics of behavioral economics, as they are often seen in financial markets where they can feed speculation and create cumulative and excessive price moves, either for the whole market (market bubble) or a specific asset, like a stock that becomes overly popular among investors.[ citation needed ]

Marketers also use the idea of cascades to attempt to get a buying cascade started for a new product. If they can induce an initial set of people to adopt the new product, then those who make purchasing decisions later on may also adopt the product even if it is no better than, or perhaps even worse than, competing products. This is most effective if these later consumers are able to observe the adoption decisions, but not how satisfied the early customers actually were with the choice. This is consistent with the idea that cascades arise naturally when people can see what others do but not what they know.[ citation needed ]

An example is Hollywood movies. If test screenings suggest a big-budget movie might be a flop, studios often decide to spend more on initial marketing rather than less, with the aim of making as much money as possible on the opening weekend, before word gets around that it's a turkey.[ citation needed ]

Information cascades are usually considered by economists:[ citation needed ]

Social networks and social media

Dotey et al. [18] state that information flows in the form of cascades on social networks. According to the authors, analysis of the virality of information cascades on a social network may lead to many useful applications, such as determining the most influential individuals within a network. This information can be used for maximizing market effectiveness or influencing public opinion. Various structural and temporal features of a network affect cascade virality. Additionally, these models are widely used to investigate the spread of rumors in online social networks and to reduce their influence.

In contrast to work on information cascades in social networks, the social influence model of belief spread argues that people have some notion of the private beliefs of those in their network. [19] The social influence model, then, relaxes the assumption of information cascades that people act only on the observable actions taken by others. In addition, the social influence model embeds people within a social network, as opposed to a queue. Finally, the social influence model relaxes the assumption of the information cascade model that people either complete an action or not, by allowing for a continuous scale of the "strength" of an agent's belief that an action should be completed.
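
For contrast with the binary, sequential cascade model, the following is a minimal sketch of continuous-belief updating on a network in the spirit of such social influence models. The network, the weights, and the simple repeated-averaging rule (a DeGroot-style update) are illustrative assumptions rather than the specific model of reference [19].

```python
import numpy as np

# Row-stochastic influence weights: entry (i, j) is how much person i
# weighs person j's belief (including their own) at each update.
W = np.array([
    [0.6, 0.2, 0.2, 0.0],
    [0.3, 0.4, 0.3, 0.0],
    [0.0, 0.3, 0.4, 0.3],
    [0.0, 0.0, 0.4, 0.6],
])

# Continuous "strength" of each person's belief that the action is worth taking.
beliefs = np.array([0.9, 0.8, 0.2, 0.1])

for _ in range(50):          # everyone updates simultaneously, not in a queue
    beliefs = W @ beliefs

print(beliefs.round(3))      # beliefs drift toward a weighted consensus
```

Unlike the accept/reject cascade, each person here holds a belief strength between 0 and 1 and updates it based on neighbors' beliefs rather than on a sequence of observed binary actions.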

Information cascades can also restructure the social networks that they pass through. For example, while there is a constant low level of churn in social ties on Twitter—in any given month, about 9% of all social connections change—there is often a spike in follow and unfollow activity following an information cascade, such as the sharing of a viral tweet. [20] As the tweet-sharing cascade passes through the network, users adjust their social ties, particularly those connected to the original author of the viral tweet: the author of a viral tweet will see both a sudden loss in previous followers and a sudden increase in new followers.

As a part of this cascade-driven reorganization process, information cascades can also create assortative social networks, where people tend to be connected to others who are similar in some characteristic. Tweet cascades increase the similarity between connected users, as users lose ties to more dissimilar users and add new ties to similar users. [20] Information cascades created by news coverage in the media may also foster political polarization by sorting social networks along political lines: Twitter users who follow and share more polarized news coverage tend to lose social ties to users of the opposite ideology. [21]

Historical examples

Empirical studies

In addition to the examples above, information cascades have been shown to exist in several empirical studies. Perhaps the best example is the one given above, from Anderson and Holt. [10] Participants stood in a line behind an urn which contained balls of different colors. Sequentially, each participant would pick a ball out of the urn, look at it, and then place it back into the urn. The participant would then announce, for the rest of the participants to hear, which color of ball (red or blue) they believed to be in the majority in the urn. Participants received a monetary reward for guessing correctly, giving them an incentive to act rationally.

Other examples include

The negative effects of informational cascades sometimes become a legal concern, and laws have been enacted to neutralize them. Ward Farnsworth, a law professor, analyzed the legal aspects of informational cascades and gave several examples in his book The Legal Analyst. In many military courts, the officers voting to decide a case vote in reverse rank order (the officer of the lowest rank votes first); he suggested this may be done so that the lower-ranked officers are not tempted by the cascade to vote with the more senior officers, who are believed to have more accurate judgement. Another example is that countries such as Israel and France have laws that prohibit the publication of opinion polls in the days or weeks before elections, to prevent informational cascades that may influence the election results. [27]

Globalization

One informational cascade study compared thought processes between Greek and German organic farmers, suggesting discrepancies based upon cultural and socioeconomic differences. [28] Cascades have also been extrapolated to ideas such as financial volatility and monetary policy. In 2004 Helmut Wagner and Wolfram Berger suggested cascades as an analytical vehicle to examine changes to the financial market as it became more globalized. Wagner and Berger noticed structural changes to the framework of understanding financial markets due to globalization, which gave rise to volatility in capital flows and spawned uncertainty that affected central banks. [29] Additionally, information cascades are useful in understanding the origins of terrorist tactics. When the attack by Black September occurred in 1972, it was hard not to see the similarities between its tactics and those of the Baader-Meinhof group (also known as the Red Army Faction, or RAF). [30] All of these examples illustrate how cascade processes are put to use, and understanding the framework of cascades is important for moving forward in a more globalized society. Establishing a foundation for understanding how information passes through transnational and multinational organizations is critical to the emerging modern society. [31] In sum, cascades, as a general term, encompass a spectrum of different concepts, and information cascades have been the underlying thread in how information is transferred, overwritten, and understood across cultures spanning a multitude of different countries. [32]

See also

Related Research Articles

Rational choice model

The rational choice model, also called rational choice theory, refers to a set of guidelines that help in understanding economic and social behaviour. The theory originated in the eighteenth century and can be traced back to the political economist and philosopher Adam Smith. The theory postulates that an individual will perform a cost–benefit analysis to determine whether an option is right for them. Rational choice theory looks at three concepts: rational actors, self-interest and the invisible hand.

Bounded rationality is the idea that rationality is limited when individuals make decisions, and under these limitations, rational individuals will select a decision that is satisfactory rather than optimal.

The bandwagon effect is a psychological phenomenon where people adopt certain behaviors, styles, or attitudes simply because others are doing so. More specifically, it is a cognitive bias by which public opinion or behaviours can alter due to particular actions and beliefs rallying amongst the public. The rate of uptake of beliefs, ideas, fads and trends increases with the proportion of others who have already adopted them. As more people come to believe in something, others also "hop on the bandwagon" regardless of the underlying evidence.

Behavioral economics

Behavioral economics is the study of the psychological factors involved in the decisions of individuals or institutions, and how these decisions deviate from those implied by traditional economic theory.

Prospect theory

Prospect theory is a theory of behavioral economics, judgment and decision making that was developed by Daniel Kahneman and Amos Tversky in 1979. The theory was cited in the decision to award Kahneman the 2002 Nobel Memorial Prize in Economics.

From a legal point of view, a contract is an institutional arrangement for the way in which resources flow, which defines the various relationships between the parties to a transaction or limits the rights and obligations of the parties.

Moral reasoning is the study of how people think about right and wrong and how they acquire and apply moral rules. It is a subdiscipline of moral psychology that overlaps with moral philosophy, and is the foundation of descriptive ethics.

Social influence comprises the ways in which individuals adjust their behavior to meet the demands of a social environment. It takes many forms and can be seen in conformity, socialization, peer pressure, obedience, leadership, persuasion, sales, and marketing. Typically social influence results from a specific action, command, or request, but people also alter their attitudes and behaviors in response to what they perceive others might do or think. In 1958, Harvard psychologist Herbert Kelman identified three broad varieties of social influence.

  1. Compliance is when people appear to agree with others but actually keep their dissenting opinions private.
  2. Identification is when people are influenced by someone who is liked and respected, such as a famous celebrity.
  3. Internalization is when people accept a belief or behavior and agree both publicly and privately.

Social proof is a psychological and social phenomenon wherein people copy the actions of others in choosing how to behave in a given situation. The term was coined by Robert Cialdini in his 1984 book Influence: Science and Practice.

Ultimatum game

The ultimatum game is a game that has become a popular instrument of economic experiments. An early description is by Nobel laureate John Harsanyi in 1961. One player, the proposer, is endowed with a sum of money. The proposer is tasked with splitting it with another player, the responder. Once the proposer communicates their decision, the responder may accept it or reject it. If the responder accepts, the money is split per the proposal; if the responder rejects, both players receive nothing. Both players know in advance the consequences of the responder accepting or rejecting the offer.

Diffusion of innovations

Diffusion of innovations is a theory that seeks to explain how, why, and at what rate new ideas and technology spread. The theory was popularized by Everett Rogers in his book Diffusion of Innovations, first published in 1962. Rogers argues that diffusion is the process by which an innovation is communicated through certain channels over time among the participants in a social system. The origins of the diffusion of innovations theory are varied and span multiple disciplines.

Herd mentality is the tendency for people’s behavior or beliefs to conform to those of the group they belong to. The concept of herd mentality has been studied and analyzed from different perspectives, including biology, psychology and sociology. This psychological phenomenon can have profound impacts on human behavior.

Hobart Peyton Young is an American game theorist and economist known for his contributions to evolutionary game theory and its application to the study of institutional and technological change, as well as the theory of learning in games. He is currently centennial professor at the London School of Economics, James Meade Professor of Economics Emeritus at the University of Oxford, professorial fellow at Nuffield College Oxford, and research principal at the Office of Financial Research at the U.S. Department of the Treasury.

Quantal response equilibrium (QRE) is a solution concept in game theory. First introduced by Richard McKelvey and Thomas Palfrey, it provides an equilibrium notion with bounded rationality. QRE is not an equilibrium refinement, and it can give significantly different results from Nash equilibrium. QRE is only defined for games with discrete strategies, although there are continuous-strategy analogues.

In decision theory, when a decision is made under uncertainty and information about the best course of action arrives only after the decision is fixed, the human emotional response of regret is often experienced; it can be measured as the difference in value between the decision made and the optimal decision.

Herd behavior is the behavior of individuals in a group acting collectively without centralized direction. Herd behavior occurs in animals in herds, packs, bird flocks, fish schools and so on, as well as in humans. Voting, demonstrations, riots, general strikes, sporting events, religious gatherings, everyday decision-making, judgement and opinion-forming, are all forms of human-based herd behavior.

Conformity is the act of matching attitudes, beliefs, and behaviors to group norms, politics or being like-minded. Norms are implicit, specific rules shared by a group of individuals that guide their interactions with others. People often choose to conform to society rather than to pursue personal desires – because it is often easier to follow the path others have made already, rather than forging a new one. Thus, conformity is sometimes a product of group communication. This tendency to conform occurs in small groups and/or in society as a whole and may result from subtle unconscious influences, or from direct and overt social pressure. Conformity can occur in the presence of others, or when an individual is alone. For example, people tend to follow social norms when eating or when watching television, even if alone.

Complex contagion is the phenomenon in social networks in which multiple sources of exposure to an innovation are required before an individual adopts the change of behavior. It differs from simple contagion in that, unlike a disease, it may not be possible for the innovation to spread after only one incident of contact with an infected neighbor. The spread of complex contagion across a network of people may depend on many social and economic factors; for instance, how many of one's friends adopt the new idea, how much influence those friends exert on the individual, and the individual's own disposition toward embracing change.

Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.

Behavioral game theory seeks to examine how people's strategic decision-making behavior is shaped by social preferences, social utility and other psychological factors. Behavioral game theory analyzes interactive strategic decisions and behavior using the methods of game theory, experimental economics, and experimental psychology. Experiments include testing deviations from typical simplifications of economic theory such as the independence axiom and neglect of altruism, fairness, and framing effects. As a research program, the subject is a development of the last three decades.

References

  1. Duan, Wenjing; Gu, Bin; Whinston, Andrew B. (March 2009). "Informational Cascades and Software Adoption on the Internet: An Empirical Investigation". MIS Quarterly. 33 (1). Rochester, NY: 23–48. doi:10.2307/20650277. hdl: 2144/42029 . JSTOR   20650277. S2CID   909115. SSRN   1103165.
  2. "The Difference Between Information Cascades and Herd Behavior : Networks Course blog for INFO 2040/CS 2850/Econ 2040/SOC 2090" . Retrieved 2019-04-15.
  3. Çelen, Boğaçhan; Kariv, Shachar (May 2004). "Distinguishing Informational Cascades from Herd Behavior in the Laboratory". American Economic Review. 94 (3): 484–498. CiteSeerX   10.1.1.357.3265 . doi:10.1257/0002828041464461.
  4. Schiller, R.J. (1995). "Conversation, Information and Herd Behavior". Rhetoric and Economic Behavior. 85 (3): 181–185.
  5. Gruhl, Daniel; Guha, R.; Liben-Nowell, David; Tomkins, Andrew (2004). "Information diffusion through blogspace". Proceedings of the 13th international conference on World Wide Web. pp. 491–501. CiteSeerX   10.1.1.131.4532 . doi:10.1145/988672.988739. ISBN   978-1-58113-844-3. S2CID   526158.
  6. Sadikov, Eldar; Medina, Montserrat; Leskovec, Jure; Garcia-Molina, Hector (2011). "Correcting for missing data in information cascades". Proceedings of the fourth ACM international conference on Web search and data mining. pp. 55–64. doi:10.1145/1935826.1935844. ISBN   978-1-4503-0493-1. S2CID   6978077.
  7. Bikhchandani, Sushil; Hirshleifer, David; Welch, Ivo (October 1992). "A Theory of Fads, Fashion, Custom, and Cultural Change as Informational Cascades" (PDF). Journal of Political Economy. 100 (5): 992–1026. doi:10.1086/261849. S2CID   7784814.
  8. Bikhchandani, Sushil; Hirshleifer, David; Welch, Ivo (August 1998). "Learning from the Behavior of Others: Conformity, Fads, and Informational Cascades". Journal of Economic Perspectives. 12 (3): 151–170. doi: 10.1257/jep.12.3.151 . hdl: 2027.42/35413 .
  9. Smith, Lones; Sorensen, Peter (March 2000). "Pathological Outcomes of Observational Learning". Econometrica. 68 (2): 371–398. doi:10.1111/1468-0262.00113. hdl: 1721.1/64049 . S2CID   14414203.
  10. Anderson, Lisa R.; Holt, Charles A. (1997). "Information Cascades in the Laboratory". The American Economic Review. 87 (5): 847–862. JSTOR   2951328.
  11. Easley, David (2010). Networks, Crowds and Markets: Reasoning about a Highly Connected World. Cambridge University Press. pp. 483–506.
  12. Newell, A. (1972). Human problem solving . Englewood Cliffs, NY: Prentice Hall. ISBN   9780134454030.
  13. Taylor, C. (1999). "Time-on-the-Market as a Sign of Quality". Review of Economic Studies. 66 (3): 555–578. doi: 10.1111/1467-937x.00098 .
  14. Sgroi, D. (2002). "Optimizing Information in the Herd: Guinea Pigs, Profits, and Welfare" (PDF). Games and Economic Behavior. 39: 137–166. doi:10.1006/game.2001.0881.
  15. Gill, D.; D. Sgroi (2008). "Sequential Decisions with Tests". Games and Economic Behavior. 63 (2): 663–678. CiteSeerX   10.1.1.322.7566 . doi:10.1016/j.geb.2006.07.004. S2CID   5793119.
  16. Bose, S.; G. Orosel; M. Ottaviani; L. Vesterlund (2006). "Dynamic Monopoly Pricing and Herding". RAND Journal of Economics. 37 (4): 910–928. CiteSeerX   10.1.1.493.1834 . doi:10.1111/j.1756-2171.2006.tb00063.x. S2CID   2984643.
  17. Lemieux, Pierre (22 December 2003). "Following the herd: why do some ideas suddenly become popular, and then die out just as quickly?". Regulation. 26 (4): 16–22. SSRN   505764. Gale   A113304115.
  18. Dotey, A.; Rom, H.; Vaca, C. (2011). "Information Diffusion in Social Media" (PDF). Stanford University.
  19. Friedkin, Noah E.; Johnsen, Eugene C. (2009). Social Influence Network Theory. Cambridge: Cambridge University Press. doi:10.1017/cbo9780511976735. ISBN   978-0-511-97673-5.
  20. Myers, Seth A.; Leskovec, Jure (2014). "The bursty dynamics of the Twitter information network". Proceedings of the 23rd international conference on World wide web. pp. 913–924. arXiv: 1403.2732 . doi:10.1145/2566486.2568043. ISBN   978-1-4503-2744-2. S2CID   6353961.
  21. Tokita, Christopher K.; Guess, Andrew M.; Tarnita, Corina E. (14 December 2021). "Polarized information ecosystems can reorganize social networks via information cascades". Proceedings of the National Academy of Sciences. 118 (50): e2102147118. Bibcode:2021PNAS..11802147T. doi: 10.1073/pnas.2102147118 . PMC   8685718 . PMID   34876511.
  22. Shirky, Clay (2008). Here Comes Everybody: The Power of Organizing Without Organizations. New York: Penguin Press. pp. 161–164. ISBN   978-1-59420-153-0.
  23. Carboneau, Clark (2005). "Using Diffusion of Innovations and Academic Detailing to Spread Evidence-based Practices". Journal for Healthcare Quality . 27 (2): 48–52. doi:10.1111/j.1945-1474.2005.tb01117.x. PMID   16190312. S2CID   6946662.
  24. Beal, George M.; Bohlen, Joe M. (November 1981). "The Diffusion Process" (PDF). Special Report No. 18. Iowa State University of Science and Technology of Ames, Iowa. Archived from the original (PDF) on 2009-04-08. Retrieved 2008-11-11.
  25. De Vany, A.; D. Walls (1999). "Uncertainty in the movie industry: does star power reduce the terror of the box office?". Journal of Cultural Economics. 23 (4): 285–318. doi:10.1023/a:1007608125988. S2CID   54614446.
  26. Walden, Eric; Browne, Glenn (2002). "Information Cascades in the Adoption of New Technology". ICIS Proceedings.
  27. Farnsworth, Ward (2007). The Legal Analyst: A Toolkit for Thinking about the Law. University of Chicago Press. ISBN   978-0-226-23835-7. OCLC   76828864.[ page needed ]
  28. Chatzimichael, Konstantinos; Genius, Margarita; Tzouvelekas, Vangelis (December 2014). "Informational cascades and technology adoption: Evidence from Greek and German organic growers" (PDF). Food Policy. 49: 186–195. doi:10.1016/j.foodpol.2014.08.001.
  29. Wagner, Helmut; Berger, Wolfram (June 2004). "Globalization, Financial Volatility and Monetary Policy". Empirica. 31 (2–3): 163–184. CiteSeerX   10.1.1.466.2938 . doi:10.1007/s10633-004-0915-4. S2CID   53471608.
  30. Passmore, L. (2011). Ulrike Meinhof and the Red Army Faction: Performing Terrorism. Springer. ISBN   978-0-230-37077-7. OCLC   904285976.[ page needed ]
  31. Hamlett, Patrick W.; Cobb, Michael D. (November 2006). "Potential Solutions to Public Deliberation Problems: Structured Deliberations and Polarization Cascades". Policy Studies Journal. 34 (4): 629–648. doi:10.1111/j.1541-0072.2006.00195.x.
  32. Drezner, Daniel W. (2010). "Weighing the Scales: The Internet's Effect On State-Society Relations". The Brown Journal of World Affairs. 16 (2): 31–44. JSTOR   24590907.