Echo chamber (media)

An echo chamber is "an environment where a person only encounters information or opinions that reflect and reinforce their own." FilterBubble.jpg
An echo chamber is "an environment where a person only encounters information or opinions that reflect and reinforce their own."

In news media and social media, an echo chamber is an environment or ecosystem in which participants encounter beliefs that amplify or reinforce their preexisting beliefs through communication and repetition inside a closed system, insulated from rebuttal. [2] [3] [4] An echo chamber circulates existing views without encountering opposing views, potentially resulting in confirmation bias. Echo chambers may increase social and political polarization and extremism. [5] On social media, echo chambers are thought to limit exposure to diverse perspectives and to favor and reinforce presupposed narratives and ideologies. [4] [6]

The term is a metaphor based on an acoustic echo chamber, in which sounds reverberate in a hollow enclosure. Another emerging term for this echoing and homogenizing effect within social-media communities on the Internet is neotribalism.

Many scholars note the effects that echo chambers can have on citizens' stances and viewpoints, and specifically the implications these have for politics. [7] However, some studies have suggested that the effects of echo chambers are weaker than often assumed. [8]

Concept

The Internet has expanded the variety and amount of accessible political information. On the positive side, this may create a more pluralistic form of public debate; on the negative side, greater access to information may lead to selective exposure to ideologically supportive channels. [5] In an extreme "echo chamber", one purveyor of information will make a claim, which many like-minded people then repeat, overhear, and repeat again (often in an exaggerated or otherwise distorted form) [9] until most people assume that some extreme variation of the story is true. [10]

The echo chamber effect occurs online when a harmonious group of people amalgamate and develop tunnel vision. Participants in online discussions may find their opinions constantly echoed back to them, which reinforces their individual belief systems due to declining exposure to others' opinions. [11] These reinforced belief systems can culminate in confirmation bias across a variety of subjects. When individuals want something to be true, they often gather only the information that supports their existing beliefs and disregard statements that contradict or undercut those beliefs. [12] Individuals who participate in echo chambers often do so because they feel more confident that their opinions will be more readily accepted by others in the echo chamber. [13]

This happens because the Internet has provided access to a wide range of readily available information. People increasingly receive their news online through less traditional sources, such as Facebook, Google, and Twitter. These and many other social platforms and online media outlets have established personalized algorithms intended to cater specific information to individuals' online feeds. This method of curating content has replaced the function of the traditional news editor. [14] The mediated spread of information through online networks carries the risk of an algorithmic filter bubble, leading to concern about how echo chambers on the internet promote the division of online interaction. [15]
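
This feedback loop can be sketched in a few lines (a toy simulation, not any platform's actual algorithm; the "ideology" scores and parameters are invented): a feed ranks candidate items by similarity to the user's click history, and each click narrows what is shown next.

```python
import random

# Toy model (illustrative only): items carry an "ideology" score in [-1, 1].
# The feed ranks candidates by closeness to the mean of past clicks; each
# click then feeds back into the ranking, narrowing future exposure.

random.seed(1)
user_position = 0.6            # hypothetical user's leaning
click_history = [user_position]

def feed(items, history, k=5):
    """Return the k items closest to the center of the click history."""
    center = sum(history) / len(history)
    return sorted(items, key=lambda x: abs(x - center))[:k]

for _ in range(10):
    candidates = [random.uniform(-1, 1) for _ in range(50)]
    shown = feed(candidates, click_history)
    # The user clicks the shown item closest to their own position.
    click_history.append(min(shown, key=lambda x: abs(x - user_position)))

spread = max(click_history) - min(click_history)
print(f"ideological spread of consumed items: {spread:.2f}")  # small = homogeneous
```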

It is important to note that members of an echo chamber are not fully responsible for their convictions. Once part of an echo chamber, an individual might adhere to seemingly acceptable epistemic practices and still be further misled. Many individuals may be stuck in echo chambers due to factors existing outside of their control, such as being raised in one. [3]

Furthermore, the function of an echo chamber does not entail eroding a member's interest in truth; it focuses upon manipulating their credibility levels so that fundamentally different establishments and institutions will be considered proper sources of authority. [16]

Empirical research

Empirical findings that clearly support these concerns are still needed, [17] and the field is very fragmented when it comes to empirical results. Some studies do measure echo chamber effects, such as that of Bakshy et al. (2015). [18] [19] In this study the researchers found that people tend to share news articles they align with. Similarly, they found homophily in online friendships, meaning people are more likely to be connected on social media if they have the same political ideology. In combination, this can lead to echo chamber effects. Bakshy et al. found that a person's potential exposure to cross-cutting content (content opposite to their own political beliefs) through their own network is only 24% for liberals and 35% for conservatives. Other studies argue that expressed cross-cutting content is an important measure of echo chambers: Bossetta et al. (2023) find that 29% of Facebook comments during Brexit were cross-cutting expressions. [20] Therefore, echo chambers might be present in a person's media diet but not in how they interact with others on social media.
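
As an illustration of how such an exposure share can be computed (a minimal sketch; the friend-shared content below is invented, and only the counting logic mirrors the cross-cutting measure described above):

```python
# Illustrative calculation only: the data are made up; the arithmetic
# mirrors the idea of counting content opposite to one's own leaning.

user_ideology = "liberal"

# Hypothetical content shared by the user's friends, labeled by leaning.
shared_by_friends = ["liberal", "liberal", "conservative", "liberal",
                     "conservative", "liberal", "liberal", "neutral"]

partisan = [s for s in shared_by_friends if s in ("liberal", "conservative")]
cross_cutting = [s for s in partisan if s != user_ideology]

print(f"cross-cutting exposure: {len(cross_cutting) / len(partisan):.0%}")
# -> 29% for this invented data; only the counting logic mirrors the measure
```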

Echo chamber dynamics in social media as a two-step process: first "seeding", in which malicious actors insert misinformation into the public sphere, and second "echoing", when people circulate it as part of their beliefs and identity.

Another set of studies suggests that echo chambers exist, but that they are not a widespread phenomenon: based on survey data, Dubois and Blank (2018) show that most people consume news from various sources, while around 8% consume media with low diversity. [21] Similarly, Rusche (2022) shows that most Twitter users do not display behavior resembling an echo chamber; however, through high levels of online activity, the small group of users who do accounts for a substantial share of populist politicians' followers, creating homogeneous online spaces. [22]

Finally, there are other studies which contradict the existence of echo chambers. Some found that people also share news reports that do not align with their political beliefs. [23] Others found that people using social media are exposed to more diverse sources than people not using social media. [24] In sum, clear and distinct findings that either confirm or falsify concerns about echo chamber effects remain absent.

Research on the social dynamics of echo chambers shows that the fragmented nature of online culture, the importance of collective identity construction, and the argumentative nature of online controversies can generate echo chambers where participants encounter self-reinforcing beliefs. [2] Researchers show that echo chambers are prime vehicles to disseminate disinformation, as participants exploit contradictions against perceived opponents amidst identity-driven controversies. [2]

Some scholars instead emphasize the impact of echo chambers on identity and emotion, rather than on opinions and beliefs, showing that echo chambers can contribute to political tribalism. [25]

Difficulties of researching processes

Echo chamber studies fail to achieve consistent and comparable results due to unclear definitions, inconsistent measurement methods, and unrepresentative data. [26] Social media platforms continually change their algorithms, and most studies are conducted in the US, limiting their application to political systems with more parties.

Echo chambers vs epistemic bubbles

In recent years, closed epistemic networks have increasingly been held responsible for the era of post-truth and fake news. [27] However, the media frequently conflates two distinct concepts of social epistemology: echo chambers and epistemic bubbles. [16]

An epistemic bubble is an informational network in which important sources have been excluded by omission, perhaps unintentionally. It is an impaired epistemic framework which lacks strong connectivity. [28] Members within epistemic bubbles are unaware of significant information and reasoning.

On the other hand, an echo chamber is an epistemic construct in which voices are actively excluded and discredited. It does not suffer from a lack of connectivity; rather, it depends on a manipulation of trust by methodically discrediting all outside sources. [29] According to research conducted by the University of Pennsylvania, members of echo chambers become dependent on the sources within the chamber and highly resistant to any external sources. [30]

An important distinction exists in the strength of the respective epistemic structures. Epistemic bubbles are not particularly robust. Relevant information has merely been left out, not discredited. [31] One can ‘pop’ an epistemic bubble by exposing a member to the information and sources that they have been missing. [3]

Echo chambers, however, are incredibly strong. By creating pre-emptive distrust between members and non-members, insiders are insulated from the force of counter-evidence and continue to reinforce the chamber in a closed loop. [29] Outside voices are heard, but dismissed.

As such, the two concepts are fundamentally distinct and cannot be utilized interchangeably. However, one must note that this distinction is conceptual in nature, and an epistemic community can exercise multiple methods of exclusion to varying extents.
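
The difference can be made concrete with a toy belief-updating model (our illustration, not a model from the cited literature): in a bubble the outside source is simply missing from the update, whereas in a chamber it is present but assigned negative credibility, so hearing counter-evidence can even harden the belief.

```python
# Toy belief-updating model (our illustration, not from the cited
# literature). A belief is nudged by credibility-weighted reports from
# sources; the values are illustrative and unnormalized.

def update_belief(belief, reports, lr=0.5):
    """reports: list of (source_credibility, claimed_value) pairs."""
    for cred, claim in reports:
        belief += lr * cred * (claim - belief)
    return belief

inside = (1.0, 1.0)        # trusted insider asserting the chamber's claim
outside = (1.0, 0.0)       # outside source contradicting it

# Epistemic bubble: the outside source is merely absent by omission.
bubble = update_belief(0.8, [inside])
# "Popping" the bubble: adding the omitted source moves the belief.
popped = update_belief(0.8, [inside, outside])
# Echo chamber: the outside voice is heard but discredited (negative
# credibility), so counter-evidence pushes the belief the wrong way.
chamber = update_belief(0.8, [inside, (-0.5, 0.0)])

print(round(bubble, 3), round(popped, 3), round(chamber, 3))
# -> 0.9 0.45 1.125 (the chamber's belief hardens despite counter-evidence)
```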

Similar concepts

A filter bubble – a term coined by internet activist Eli Pariser – is a state of intellectual isolation that allegedly can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. The choices made by these algorithms are not transparent.

Homophily is the tendency of individuals to associate and bond with similar others, as in the proverb "birds of a feather flock together". The presence of homophily has been detected in a vast array of network studies. For example, a study conducted by Bakshy et al. explored the data of 10.1 million Facebook users. These users identified as either politically liberal, moderate, or conservative, and the vast majority of their friends were found to have a political orientation similar to their own. Facebook's algorithms recognize this and select information biased toward that political orientation to showcase in users' news feeds. [32]
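
A minimal sketch of how such homophily can be quantified, using invented friendship data: one simple measure is the share of ties that connect users with the same political orientation.

```python
# Illustrative sketch (invented data): a simple homophily measure is the
# share of friendship ties connecting users with the same orientation.

orientation = {"ana": "liberal", "ben": "liberal", "fay": "liberal",
               "cam": "conservative", "dee": "conservative"}

friendships = [("ana", "ben"), ("ana", "fay"), ("ben", "fay"),
               ("cam", "dee"), ("ana", "cam")]

same = sum(orientation[a] == orientation[b] for a, b in friendships)
print(f"share of same-orientation ties: {same / len(friendships):.0%}")  # -> 80%
```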

Recommender systems are information filtering systems put in place on different platforms that provide recommendations depending on information gathered from the user. In general, recommendations are provided in three different ways: based on content that was previously selected by the user, content that has similar properties or characteristics to that which has been previously selected by the user, or a combination of both. [32]
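
A minimal sketch of the first, content-based approach (illustrative only; not any platform's actual system): unseen items are scored by how much their tags overlap with items the user previously selected.

```python
# Minimal content-based recommendation sketch (illustrative only):
# score unseen items by tag overlap with the user's past selections.

def similarity(a, b):
    """Jaccard overlap of two tag sets, between 0 and 1."""
    return len(a & b) / len(a | b)

history = [{"politics", "left"}, {"economy", "left"}]   # past selections
candidates = {
    "article_1": {"politics", "left"},
    "article_2": {"sports"},
    "article_3": {"economy", "right"},
}

scores = {name: max(similarity(tags, past) for past in history)
          for name, tags in candidates.items()}
print(max(scores, key=scores.get))   # -> article_1
```

A collaborative variant would instead score items by what similar users selected, and hybrid systems combine both signals, per the three approaches described above.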

Both echo chambers and filter bubbles relate to the ways individuals are exposed to content devoid of clashing opinions, and colloquially might be used interchangeably. However, echo chamber refers to the overall phenomenon by which individuals are exposed only to information from like-minded individuals, while filter bubbles are a result of algorithms that choose content based on previous online behavior, as with search histories or online shopping activity. [18] Indeed, specific combinations of homophily and recommender systems have been identified as significant drivers for determining the emergence of echo chambers. [33]

Culture wars are cultural conflicts between social groups that have conflicting values and beliefs. The term refers to "hot button" topics on which societal polarization occurs. [34] A culture war is defined as "the phenomenon in which multiple groups of people, who hold entrenched values and ideologies, attempt to contentiously steer public policy." [2] Echo chambers on social media have been identified as playing a role in how multiple social groups, holding distinct values and ideologies, form groups and circulate conversations through conflict and controversy.

Implications of echo chambers

Online communities

Social network diagram displaying users forming separate, distinct clusters

Online social communities become fragmented by echo chambers when like-minded people group together and members hear arguments in one specific direction with no counter-argument addressed. On certain online platforms, such as Twitter, echo chambers are more likely to be found when the topic is political in nature, compared to topics seen as more neutral. [35] Social networking communities are considered some of the most powerful reinforcements of rumors [36] due to the trust placed in evidence supplied by one's own social group and peers over the information circulating in the news. [37] [38] In addition, the reduction of fear that users enjoy when projecting their views on the internet, versus face-to-face, allows further engagement in agreement with their peers. [39]

This can create significant barriers to critical discourse within an online medium. Social discussion and sharing can suffer when people have a narrow information base and do not reach outside their network. Essentially, the filter bubble can distort one's reality in ways that individuals do not believe outside sources can alter. [40]

Findings by Tokita et al. (2021) suggest that individuals’ behavior within echo chambers may dampen their access to information even from desirable sources. In highly polarized information environments, individuals who are highly reactive to socially-shared information are more likely than their less reactive counterparts to curate politically homogenous information environments and experience decreased information diffusion in order to avoid overreacting to news they deem unimportant. This makes these individuals more likely to develop extreme opinions and to overestimate the degree to which they are informed. [41]

Research has also shown that misinformation can become more viral as a result of echo chambers, as the echo chambers provide an initial seed which can fuel broader viral diffusion. [42]
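
A minimal sketch of this seeding dynamic, in the spirit of the complex-contagion modelling cited above [42] (the network, threshold, and seed below are invented): a node adopts a rumor only once enough of its neighbors have, so a dense cluster lets a seeded rumor clear that threshold locally before spreading outward.

```python
import random

# Simplified complex-contagion sketch (invented network and parameters):
# a node adopts a rumor once at least `threshold` neighbors have adopted
# it. A dense cluster (the "echo chamber") lets the rumor reach that
# threshold locally, seeding wider diffusion.

random.seed(0)
n, threshold = 30, 2
neighbors = {i: set() for i in range(n)}

def link(a, b):
    neighbors[a].add(b); neighbors[b].add(a)

# Dense cluster among nodes 0-5; sparse random ties elsewhere.
for a in range(6):
    for b in range(a + 1, 6):
        link(a, b)
for _ in range(40):
    a, b = random.sample(range(n), 2)
    link(a, b)

adopted = {0, 1}                 # misinformation "seeded" inside the cluster
while True:
    new = {v for v in range(n) if v not in adopted
           and len(neighbors[v] & adopted) >= threshold}
    if not new:
        break
    adopted |= new

print(f"{len(adopted)} of {n} nodes adopted the rumor")
```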

Offline communities

Many offline communities are also segregated by political beliefs and cultural views. The echo chamber effect may prevent individuals from noticing changes in language and culture involving groups other than their own. Online echo chambers can sometimes influence an individual's willingness to participate in similar discussions offline. A 2016 study found that "Twitter users who felt their audience on Twitter agreed with their opinion were more willing to speak out on that issue in the workplace". [13]

Group polarization can occur as a result of growing echo chambers. The lack of external viewpoints, and the presence of a majority of individuals sharing a similar opinion or narrative, can lead to a more extreme belief set. Group polarization can also aid the spread of fake news and misinformation through social media platforms. [43] This can extend to offline interactions, with data revealing that offline interactions can be as polarizing as online interactions (e.g., on Twitter), arguably because social media-enabled debates are highly fragmented. [44]

Examples

Echo chambers have existed in many forms. Examples cited since the late 20th century include:

  - News media coverage of the McMartin preschool trial, marked by pack journalism with little skepticism toward the prosecution's charges [45]
  - Rush Limbaugh and the conservative media establishment [46]
  - Press coverage of the Clinton–Lewinsky scandal [47] [48] [49]
  - The debate surrounding the 2016 United Kingdom EU membership referendum [50]
  - Online misogynist and incel communities [51] [52] [53]
  - The "pro-painkiller echo chamber" that shaped policy during the opioid epidemic [54]
  - Twitter networks during the 2016 United States presidential election [55]

Since the creation of the internet, scholars have been curious to see the changes in political communication. [56] Due to new changes in information technology and how it is managed, it is unclear how opposing perspectives can reach common ground in a democracy. [57] The echo chamber effect has largely been cited as occurring in politics, for example on Twitter [58] and Facebook during the 2016 presidential election in the United States. [19] Some believe that echo chambers played a big part in the success of Donald Trump in the 2016 presidential election. [59]

Countermeasures

From media companies

Some companies have made efforts to combat the effects of echo chambers through algorithmic approaches. A high-profile example of this is the change Facebook made to its "Trending" page, an on-site news source for its users. Facebook modified its "Trending" page by transitioning from displaying a single news source to multiple news sources for a topic or event. [60] The intended purpose was to expand the breadth of news sources for any given headline, and therefore expose readers to a variety of viewpoints. There are startups building apps with the mission of encouraging users to open up their echo chambers, such as UnFound.news. [61] Another example is a beta feature on BuzzFeed News called "Outside Your Bubble", [62] which adds a module to the bottom of BuzzFeed News articles to show reactions from various platforms like Twitter, Facebook, and Reddit. This concept aims to bring transparency and prevent biased conversations, diversifying the viewpoints readers are exposed to. [63]

Related Research Articles

Media bias occurs when journalists and news producers show bias in how they report and cover news. The term "media bias" implies a pervasive or widespread bias contravening the standards of journalism, rather than the perspective of an individual journalist or article. The direction and degree of media bias in various countries is widely disputed.

Political polarization is the divergence of political attitudes away from the center, towards ideological extremes.

<span class="mw-page-title-main">Homophily</span> Process by which people befriend similar people

Homophily is a concept in sociology describing the tendency of individuals to associate and bond with similar others, as in the proverb "birds of a feather flock together". The presence of homophily has been discovered in a vast array of network studies: over 100 studies have observed homophily in some form or another, and they establish that similarity is associated with connection. The categories on which homophily occurs include age, gender, class, and organizational role.

Misinformation is incorrect or misleading information. It differs from disinformation, which is deliberately deceptive and propagated information. Early definitions of misinformation focused on statements that were patently false, incorrect, or not factual. Therefore, a narrow definition of misinformation refers to the information's quality, whether inaccurate, incomplete, or false. However, recent studies define misinformation per deception rather than informational accuracy because misinformation can include falsehoods, selective truths, and half-truths.

Self-propaganda is the way in which people convince themselves of something regardless of the evidence against it. They will go over their side of the argument without considering the alternative arguments.

<span class="mw-page-title-main">Social media</span> Virtual online communities

Social media are interactive technologies that facilitate the creation, sharing and aggregation of content, ideas, interests, and other forms of expression through virtual communities and networks. Social media refers to new forms of media that involve interactive participation. While challenges to the definition of social media arise due to the variety of stand-alone and built-in social media services currently available, there are some common features:

  1. Social media apps are online platforms that enable users to create and share content and participate in social networking.
  2. User-generated content—such as text posts or comments, digital photos or videos, and data generated through all online interactions—is the lifeblood of social media.
  3. Users create service-specific profiles for the website or app that are designed and maintained by the social media organization.
  4. Social media helps the development of online social networks by connecting a user's profile with those of other individuals or groups.

Personalized search is a web search tailored specifically to an individual's interests by incorporating information about the individual beyond the specific query provided. There are two general approaches to personalizing search results, involving modifying the user's query and re-ranking search results.

<span class="mw-page-title-main">Filter bubble</span> Intellectual isolation involving search engines

A filter bubble or ideological frame is a state of intellectual isolation that can result from personalized searches, recommendation systems, and algorithmic curation. The search results are based on information about the user, such as their location, past click-behavior, and search history. Consequently, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles, resulting in a limited and customized view of the world. The choices made by these algorithms are only sometimes transparent. Prime examples include Google Personalized Search results and Facebook's personalized news-stream.

<span class="mw-page-title-main">User profile</span> Data about an individual user

A user profile is a collection of settings and information associated with a user. It contains critical information used to identify an individual, such as their name, age, portrait photograph, and individual characteristics such as knowledge or expertise. User profiles are most commonly present on social media websites such as Facebook, Instagram, and LinkedIn, where they serve as a voluntary digital identity of an individual, highlighting their key features and traits. In personal computing and operating systems, user profiles serve to categorise files, settings, and documents by individual user environments, known as 'accounts', allowing the operating system to be more friendly and catered to the user. Physical user profiles serve as identity documents such as passports, driving licenses and legal documents that are used to identify an individual under the legal system.

Social media mining is the process of obtaining big data from user-generated content on social media sites and mobile apps in order to extract actionable patterns, form conclusions about users, and act upon the information, often for the purpose of advertising to users or conducting research. The term is an analogy to the resource extraction process of mining for rare minerals. Resource extraction mining requires mining companies to sift through vast quantities of raw ore to find the precious minerals; likewise, social media mining requires human data analysts and automated software programs to sift through massive amounts of raw social media data in order to discern patterns and trends relating to social media usage, online behaviours, sharing of content, connections between individuals, online buying behaviour, and more. These patterns and trends are of interest to companies, governments and not-for-profit organizations, as these organizations can use these patterns and trends to design their strategies or introduce new programs, new products, processes or services.

Social media and political communication in the United States refers to how political institutions, politicians, private entities, and the general public use social media platforms to communicate and interact in the United States.

A social bot, also described as a social AI or social algorithm, is a software agent that communicates autonomously on social media. The messages it distributes can be simple, and it can operate in groups and various configurations with partial human control (hybrid) via algorithm. Social bots can also use artificial intelligence and machine learning to express messages in more natural human dialogue.

Internet manipulation is the co-optation of online digital technologies, including algorithms, social bots, and automated scripts, for commercial, social, military, or political purposes. Internet and social media manipulation are the prime vehicles for spreading disinformation due to the importance of digital platforms for media consumption and everyday communication. When employed for political purposes, internet manipulation may be used to steer public opinion, polarise citizens, circulate conspiracy theories, and silence political dissidents. Internet manipulation can also be done for profit, for instance, to harm corporate or political adversaries and improve brand reputation. Internet manipulation is sometimes also used to describe the selective enforcement of Internet censorship or selective violations of net neutrality.

The social influence bias is an asymmetric herding effect on online social media platforms which makes users overcompensate for negative ratings but amplify positive ones. Positive social influence can accumulate and result in a rating bubble, while negative social influence is neutralized by crowd correction. This phenomenon was first described in a paper written by Lev Muchnik, Sinan Aral and Sean J. Taylor in 2014, then the question was revisited by Cicognani et al., whose experiment reinforced Muchnik's and his co-authors' results.

Online youth radicalization is the process by which a young individual or a group of young people come to adopt increasingly extreme political, social, or religious ideals and aspirations that reject or undermine the status quo and contemporary ideas and expressions of a state, in which they may or may not reside. Online youth radicalization can be both violent and non-violent.

Social media use in politics refers to the use of online social media platforms in political processes and activities. Political processes and activities include all activities that pertain to the governance of a country or area. This includes political organization, global politics, political corruption, political parties, and political values. The media's primary duty is to present information and alert the public when events occur. This information may affect what people think and the actions they take. The media can also place pressure on the government to act by signaling a need for intervention or showing that citizens want change.

A digital platform is a software-based online infrastructure that facilitates user interactions and transactions.

Algorithmic radicalization is the concept that recommender algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading to them developing radicalized extremist political views. Algorithms record user interactions, from likes/dislikes to amount of time spent on posts, to generate endless media aimed to keep users engaged. Through echo chamber channels, the consumer is driven to be more polarized through preferences in media and self-confirmation.

<span class="mw-page-title-main">Alt-right pipeline</span> Online radicalization process

The alt-right pipeline is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics. It posits that this interaction takes place due to the interconnected nature of political commentators and online communities, allowing members of one audience or community to discover more extreme groups. This process is most commonly associated with, and has been documented on, the video platform YouTube, and is largely driven by the way algorithms on various social media platforms recommend content similar to what users engage with, which can quickly lead users down rabbit holes.

Rage farming or rage-baiting is internet slang that refers to a manipulative tactic to elicit outrage with the goal of increasing internet traffic, online engagement, revenue, and support. Rage baiting or farming can be used as a tool to increase engagement, attract subscribers, followers, and supporters, which can be financially lucrative. Rage baiting and rage farming manipulate users into responding in kind to offensive, inflammatory headlines, memes, tropes, or comments.

References

  1. "echo-chamber noun - Definition, pictures, pronunciation and usage notes | Oxford Advanced Learner's Dictionary at OxfordLearnersDictionaries.com". www.oxfordlearnersdictionaries.com. Retrieved 25 April 2020.
  2. Diaz Ruiz, Carlos; Nilsson, Tomas (2023). "Disinformation and Echo Chambers: How Disinformation Circulates in Social Media Through Identity-Driven Controversies". Journal of Public Policy & Marketing. 4 (1): 18–35. doi:10.1177/07439156221103852. S2CID 248934562.
  3. Nguyen, C. Thi (June 2020). "Echo Chambers and Epistemic Bubbles". Episteme. 17 (2): 141–161. doi:10.1017/epi.2018.32. ISSN 1742-3600. S2CID 171520109.
  4. Cinelli, Matteo; De Francisci Morales, Gianmarco; Galeazzi, Alessandro; Quattrociocchi, Walter; Starnini, Michele (23 February 2021). "The echo chamber effect on social media". Proceedings of the National Academy of Sciences. 118 (9). Bibcode:2021PNAS..11823301C. doi:10.1073/pnas.2023301118. ISSN 0027-8424. PMC 7936330. PMID 33622786.
  5. Barberá, Pablo; et al. (21 August 2015). "Tweeting from left to right: Is online political communication more than an echo chamber?". Psychological Science. 26 (10): 1531–1542. doi:10.1177/0956797615594620.
  6. Currin, Christopher Brian; Vera, Sebastián Vallejo; Khaledi-Nasab, Ali (2 June 2022). "Depolarization of echo chambers by random dynamical nudge". Scientific Reports. 12 (1): 9234. arXiv: 2101.04079 . Bibcode:2022NatSR..12.9234C. doi:10.1038/s41598-022-12494-w. ISSN   2045-2322. PMC   9163087 . PMID   35654942.
  7. Unver, H. Akin (2017). "Politics of Automation, Attention, and Engagement". Journal of International Affairs. 71 (1): 127–146. ISSN   0022-197X. JSTOR   26494368.
  8. Gentzkow, Matthew; Shapiro, Jesse M. (November 2011). "Ideological Segregation Online and Offline *" (PDF). The Quarterly Journal of Economics. 126 (4): 1799–1839. doi:10.1093/qje/qjr044. hdl: 1811/52901 . ISSN   0033-5533. S2CID   9303073.
  9. Parry, Robert (28 December 2006). "The GOP's $3 Bn Propaganda Organ". The Baltimore Chronicle. Retrieved 6 March 2008.
  10. "SourceWatch entry on media "Echo Chamber" effect". SourceWatch. 22 October 2006. Retrieved 3 February 2008.
  11. Mutz, Diana C. (2006). Hearing the Other Side. Cambridge: Cambridge University Press. doi:10.1017/cbo9780511617201. ISBN   978-0-511-61720-1.
  12. Heshmat, Shahram (23 April 2015). "What Is Confirmation Bias?". Psychology Today. Retrieved 25 April 2020.
  13. Hampton, Keith N.; Shin, Inyoung; Lu, Weixu (3 July 2017). "Social media and political discussion: when online presence silences offline conversation". Information, Communication & Society. 20 (7): 1090–1107. doi:10.1080/1369118x.2016.1218526. ISSN 1369-118X.
  14. Hosanagar, Kartik (25 November 2016). "Blame the Echo Chamber on Facebook. But Blame Yourself, Too". Wired . Retrieved 24 September 2017.
  15. Ulen, Thomas S. (2001). "Democracy and the Internet: Cass R. Sunstein, Republic.Com. Princeton, NJ. Princeton University Press. Pp. 224. 2001". SSRN Working Paper Series. doi:10.2139/ssrn.286293. ISSN   1556-5068.
  16. 1 2 "The Reason Your Feed Became An Echo Chamber — And What To Do About It". NPR.org. Retrieved 12 June 2020.
  17. Dahlgren, Peter M. (2020). Media Echo Chambers: Selective Exposure and Confirmation Bias in Media Use, and its Consequences for Political Polarization. Gothenburg: University of Gothenburg. ISBN   978-91-88212-95-5.
  18. Bakshy, Eytan; Messing, Solomon; Adamic, Lada A. (5 June 2015). "Exposure to ideologically diverse news and opinion on Facebook". Science. 348 (6239): 1130–1132. Bibcode:2015Sci...348.1130B. doi:10.1126/science.aaa1160. ISSN 0036-8075. PMID 25953820. S2CID 206632821.
  19. El-Bermawy, Mostafa (18 November 2016). "Your Filter Bubble is Destroying Democracy". Wired.
  20. Bossetta, Michael; Dutceac Segesten, Anamaria; Bonacci, Duje (22 June 2023). "Reconceptualizing Cross-Cutting Political Expression on Social Media: A Case Study of Facebook Comments During the 2016 Brexit Referendum". Political Communication. 40 (6): 719–741. doi: 10.1080/10584609.2023.2222370 . ISSN   1058-4609. S2CID   259634530.
  21. Dubois, Elizabeth; Blank, Grant (2018). "The echo chamber is overstated: the moderating effect of political interest and diverse media". Information, Communication & Society. 21 (5): 729–745. doi: 10.1080/1369118X.2018.1428656 . S2CID   149369522.
  22. Rusche, Felix (2022). "Few voices, strong echo: Measuring follower homogeneity of politicians' Twitter accounts". New Media & Society. doi: 10.1177/14614448221099860 . S2CID   249902124.
  23. Morgan, Jonathan Scott; Lampe, Cliff; Shafiq, Muhammad Zubair (2013). "Is news sharing on Twitter ideologically biased?". Proceedings of the 2013 conference on Computer supported cooperative work. pp. 887–896. doi:10.1145/2441776.2441877. ISBN   9781450313315. S2CID   9415443.
  24. Levy, David; Fletcher, Richard; Kalogeropoulos, Antonis; Newman, Nic; Nielsen, Rasmus Kleis (June 2017). "Reuters Institute Digital News Report 2017" (PDF). Digital News Report. Oxford: Reuters Institute for the Study of Journalism: 42–43. Retrieved 24 May 2021.
  25. Törnberg, A.; Törnberg, P. (2024). Intimate Communities of Hate: Why Social Media Fuels Far-Right Extremism. Taylor & Francis. ISBN   978-1-040-00493-7.
  26. Gray, Peter; Johnson, Steven L.; Kitchens, Brent (December 2020). "Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media On Diversification and Partisan Shifts in News Consumption". MIS Quarterly. 44 (4): 1619–1649. doi:10.25300/MISQ/2020/16371. ISSN   0276-7783. S2CID   229294134.
  27. Robson, David. "The myth of the online echo chamber". www.bbc.com. Retrieved 12 June 2020.
  28. Magnani, Lorenzo; Bertolotti, Tommaso (2011). "Cognitive Bubbles and Firewalls: Epistemic Immunizations in Human Reasoning". Proceedings of the Annual Meeting of the Cognitive Science Society. 33 (33). ISSN   1069-7977.
  29. 1 2 "'Echo chambers,' polarization, and the increasing tension between the (social) reality of expertise and the (cultural) suspicion of authority". uva.theopenscholar.com. Retrieved 12 June 2020.
  30. "Echo Chamber: Rush Limbaugh and the Conservative Media Establishment". Oxford University Press & Annenberg School for Communication. Retrieved 12 June 2020.
  31. "Americans, Politics and Social Media". Pew Research Center: Internet, Science & Tech. 25 October 2016. Retrieved 12 June 2020.
  32. Geschke, Daniel; Lorenz, Jan; Holtz, Peter (2019). "The triple-filter bubble: Using agent-based modelling to test a meta-theoretical framework for the emergence of filter bubbles and echo chambers". British Journal of Social Psychology. 58 (1): 129–149. doi:10.1111/bjso.12286. PMC 6585863. PMID 30311947. S2CID 52965994.
  33. Cinus, Federico; Minici, Marco; Monti, Corrado; Bonchi, Francesco (9 July 2022). The effect of people recommenders on echo chambers and polarization. International AAAI Conference on Web and Social Media. Vol. 16. pp. 90–101.
  34. Hartmann, Andrew (2019). War for the Soul of America: A History of the Culture Wars (2nd ed.). Chicago, IL: University of Chicago Press. ISBN   978-0-226-62191-3.
  35. Barberá, Pablo; Jost, John T.; Nagler, Jonathan; Tucker, Joshua A.; Bonneau, Richard (21 August 2015). "Tweeting From Left to Right". Psychological Science. 26 (10): 1531–1542. doi:10.1177/0956797615594620. ISSN   0956-7976. PMID   26297377. S2CID   4649581.
  36. DiFonzo, Nicholas (11 September 2008). The Watercooler Effect: An Indispensable Guide to Understanding and Harnessing the Power of Rumors. Penguin Books. ISBN   9781440638633 . Retrieved 24 September 2017.
  37. DiFonzo, Nicholas (21 April 2011). "The Echo-Chamber Effect". The New York Times . Retrieved 24 September 2017.
  38. Difonzo, Nicolas (22 April 2011). "The Echo Chamber Effect". The New York Times. Retrieved 18 March 2017.
  39. Walter, Stefanie; Brüggemann, Michael; Engesser, Sven (21 December 2017). "Echo Chambers of Denial: Explaining User Comments on Climate Change". Environmental Communication. 12 (2): 204–217. doi:10.1080/17524032.2017.1394893. ISSN   1752-4032. S2CID   148918776.
  40. Parrish, Shane (31 July 2017). "How Filter Bubbles Distort Reality: Everything You Need to Know". Farnam Street.
  41. Tokita, Christopher; Guess, Andrew; Tarnita, Corina (2021). "Polarized information ecosystems can reorganize social networks via information cascades". PNAS. 118 (50). Bibcode:2021PNAS..11802147T. doi: 10.1073/pnas.2102147118 . PMC   8685718 . PMID   34876511.
  42. Törnberg, P. (2018). "Echo chambers and viral misinformation: Modeling fake news as complex contagion". PLOS ONE. 13 (9): e0203958. Bibcode:2018PLoSO..1303958T. doi: 10.1371/journal.pone.0203958 .
  43. Sunstein, Cass R. (June 2002). "The Law of Group Polarization". Journal of Political Philosophy. 10 (2): 175–195. doi:10.1111/1467-9760.00148. ISSN   0963-8016.
  44. Gentzkow, Matthew; Shapiro, Jesse M. (November 2011). "Ideological Segregation Online and Offline *". The Quarterly Journal of Economics. 126 (4): 1799–1839. doi:10.1093/qje/qjr044. hdl: 1811/52901 . ISSN   0033-5533.
  45. Shaw, David (19 January 1990). "Column One: News Analysis: Where Was Skepticism in Media?: Pack journalism and hysteria marked early coverage of the McMartin case. Few journalists stopped to question the believability of the prosecution's charges". Los Angeles Times .
  46. Jamieson, Kathleen; Cappella, Joseph (1 January 2008). Echo Chamber: Rush Limbaugh and the Conservative Media Establishment. Oxford University Press. ISBN   978-0-19-536682-2.
  47. "Trial By Leaks". Time . Vol. 151, no. 6. 16 February 1998. cover.
  48. Cohen, Adam (16 February 1998). "The Press And The Dress". Time.
  49. "The Clinton/Lewinsky Story: How Accurate? How Fair?" (PDF). Archived (PDF) from the original on 22 December 2018. Retrieved 12 December 2021.
  50. Chater, James (6 July 2016). "What the EU referendum result teaches us about the dangers of the echo chamber". New Statesman .
  51. Taub, Amanda (9 May 2018). "On Social Media's Fringes, Growing Extremism Targets Women". The New York Times. Retrieved 24 November 2018.
  52. Beauchamp, Zack (25 April 2018). "Incel, the misogynist ideology that inspired the deadly Toronto attack, explained". Vox. Retrieved 24 November 2018.
  53. Cunningham, Brian (14 November 2017). "The government shouldn't let potential dangerous people go unnoticed online". The Daily Collegian.
  54. "Pro-painkiller echo chamber shaped policy amid drug epidemic". Center for Public Integrity. 19 September 2016. Retrieved 13 June 2019.
  55. Guo, Lei; A. Rohde, Jacob; Wu, H. Denis (28 January 2020). "Who is responsible for Twitter's echo chamber problem? Evidence from 2016 U.S. election networks". Information, Communication & Society. 23 (2): 234–251. doi:10.1080/1369118X.2018.1499793. ISSN   1369-118X. S2CID   149666263.
  56. NEUMAN, W. RUSSELL (July 1996). "Political Communications Infrastructure". The Annals of the American Academy of Political and Social Science. 546 (1): 9–21. doi:10.1177/0002716296546001002. ISSN   0002-7162. S2CID   154442316.
  57. Mutz, Diana C. (March 2001). "Facilitating Communication across Lines of Political Difference: The Role of Mass Media". American Political Science Review. 95 (1): 97–114. doi:10.1017/s0003055401000223. ISSN   0003-0554. S2CID   6185156.
  58. Colleoni, Elanor; Rozza, Alessandro; Arvidsson, Adam (April 2014). "Echo Chamber or Public Sphere? Predicting Political Orientation and Measuring Political Homophily in Twitter Using Big Data: Political Homophily on Twitter". Journal of Communication. 64 (2): 317–332. doi:10.1111/jcom.12084. hdl: 10281/66011 . ISSN   0021-9916.
  59. Hooton, Christopher (10 November 2016). "Your social media echo chamber is the reason Donald Trump ended up being voted President". The Independent. Retrieved 10 April 2017.
  60. "Continuing Our Updates to Trending". About Facebook. 25 January 2017. Retrieved 25 April 2020.
  61. "Echo chambers, algorithms and start-ups". LiveMint. Retrieved 12 June 2018.
  62. "Outside Your Bubble". BuzzFeed. Retrieved 5 March 2018.
  63. Smith, Ben (17 February 2017). "Helping You See Outside Your Bubble". BuzzFeed.