Filter bubble


Social media inadvertently isolates users into their own ideological filter bubbles, according to internet activist Eli Pariser

A filter bubble or ideological frame is a state of intellectual isolation [1] that can result from personalized searches, recommendation systems, and algorithmic curation. The search results are based on information about the user, such as their location, past click-behavior, and search history. [2] Consequently, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles, resulting in a limited and customized view of the world. [3] The choices made by these algorithms are only sometimes transparent. [4] Prime examples include Google Personalized Search results and Facebook's personalized news-stream.


However, there are conflicting reports about the extent to which personalized filtering happens and whether such activity is beneficial or harmful, with various studies producing inconclusive results.

The term filter bubble was coined by internet activist Eli Pariser circa 2010. In his influential book of the same name, The Filter Bubble (2011), Pariser predicted that individualized personalization by algorithmic filtering would lead to intellectual isolation and social fragmentation. [5] The bubble effect may have negative implications for civic discourse, according to Pariser, but contrasting views regard the effect as minimal [6] and addressable. [7] According to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their informational bubble. [8] He related an example in which one user searched Google for "BP" and got investment news about British Petroleum, while another searcher got information about the Deepwater Horizon oil spill, noting that the two search results pages were "strikingly different" despite use of the same key words. [8] [9] [10] [6] The outcome of the 2016 U.S. presidential election has been associated with the influence of social media platforms such as Twitter and Facebook, [11] which in turn called into question the effects of the "filter bubble" phenomenon on user exposure to fake news and echo chambers, [12] spurring new interest in the term, [13] with many concerned that the phenomenon may harm democracy and well-being by making the effects of misinformation worse. [14] [15] [13] [16] [17] [18]

Concept

The term filter bubble was coined by internet activist Eli Pariser circa 2010.

Pariser defined his concept of a filter bubble in more formal terms as "that personal ecosystem of information that's been catered by these algorithms." [8] An internet user's past browsing and search history is built up over time when they indicate interest in topics by "clicking links, viewing friends, putting movies in [their] queue, reading news stories," and so forth. [19] An internet firm then uses this information to target advertising to the user, or make certain types of information appear more prominently in search results pages. [19]

This process is not random; it operates as a three-step process, per Pariser, who states, "First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune in to get the fit just right. Your identity shapes your media." [20] Pariser also reports:

According to one Wall Street Journal study, the top fifty Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons. Search for a word like "depression" on Dictionary.com, and the site installs up to 223 tracking cookies and beacons on your computer so that other Web sites can target you with antidepressants. Share an article about cooking on ABC News, and you may be chased around the Web by ads for Teflon-coated pots. Open—even for an instant—a page listing signs that your spouse may be cheating and prepare to be haunted by DNA paternity-test ads. [21]

Analysis of link-click data collected through site traffic measurements shows that filter bubbles can be collective or individual. [22]

As of 2011, one engineer had told Pariser that Google looked at 57 different pieces of data to personally tailor a user's search results, including non-cookie data such as the type of computer being used and the user's physical location. [23]
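Pariser's three-step description, together with the dozens of signals reportedly used, can be illustrated with a short sketch. The following Python is a toy model only, assuming hypothetical signal names, topic labels, and scoring; it is not Google's or Facebook's actual system.

```python
# A minimal, hypothetical sketch of the three-step personalization loop
# described above: (1) profile the user, (2) fit content to the profile,
# (3) tune the fit with each new interaction. Illustrative only.
from collections import Counter

def build_profile(click_history, location):
    """Step 1: infer interests from past clicks plus a non-cookie signal."""
    profile = Counter()
    for item in click_history:
        for topic in item["topics"]:
            profile[topic] += 1
    profile[f"loc:{location}"] += 1      # location treated as one more signal
    return profile

def rank_results(candidates, profile):
    """Step 2: surface the items that best fit the inferred profile."""
    def score(item):
        return sum(profile.get(topic, 0) for topic in item["topics"])
    return sorted(candidates, key=score, reverse=True)

def record_click(profile, item):
    """Step 3: tune the fit; each click reinforces the topics just shown."""
    for topic in item["topics"]:
        profile[topic] += 1

# Toy usage, echoing the "BP" example: a finance-heavy history pushes
# investment coverage above spill coverage for the same query.
history = [{"topics": ["finance"]}, {"topics": ["finance", "energy"]}]
profile = build_profile(history, location="NYC")
results = [
    {"title": "BP investment outlook", "topics": ["finance", "energy"]},
    {"title": "Deepwater Horizon oil spill", "topics": ["environment", "energy"]},
]
ranked = rank_results(results, profile)
print([r["title"] for r in ranked])
record_click(profile, ranked[0])   # the feedback loop narrows the profile further
```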

Pariser's idea of the filter bubble was popularized after his TED talk in May 2011, in which he gave examples of how filter bubbles work and where they can be seen. In a test seeking to demonstrate the filter bubble effect, Pariser asked several friends to search for the word "Egypt" on Google and send him the results. Comparing two of the friends' first pages of results, while there was overlap between them on topics like news and travel, one friend's results prominently included links to information on the then-ongoing Egyptian revolution of 2011, while the other friend's first page of results did not include such links. [24]

In The Filter Bubble, Pariser warns that a potential downside to filtered searching is that it "closes us off to new ideas, subjects, and important information," [25] and "creates the impression that our narrow self-interest is all that exists." [9] In his view, filter bubbles are potentially harmful to both individuals and society. He criticized Google and Facebook for offering users "too much candy and not enough carrots." [26] He warned that "invisible algorithmic editing of the web" may limit our exposure to new information and narrow our outlook. [26] According to Pariser, the detrimental effects of filter bubbles include harm to the general society in the sense that they have the possibility of "undermining civic discourse" and making people more vulnerable to "propaganda and manipulation." [9] He wrote:

A world constructed from the familiar is a world in which there's nothing to learn ... (since there is) invisible autopropaganda, indoctrinating us with our own ideas.

Eli Pariser in The Economist, 2011 [27]

Many people are unaware that filter bubbles even exist. An article in The Guardian noted that "more than 60% of Facebook users are entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appeared in their news feed." [28] In brief, Facebook decides what goes in a user's news feed through an algorithm that takes into account "how you have interacted with similar posts in the past." [28]

Extensions of concept

A filter bubble has been described as exacerbating a phenomenon that has been called splinternet or cyberbalkanization, [Note 1] which happens when the internet becomes divided into sub-groups of like-minded people who become insulated within their own online community and fail to get exposure to different views. This concern dates back to the early days of the publicly accessible internet, with the term "cyberbalkanization" being coined in 1996. [29] [30] [31] Other terms have been used to describe this phenomenon, including "ideological frames" [9] and "the figurative sphere surrounding you as you search the internet." [19]

The concept of a filter bubble has been extended into other areas, to describe societies that self-segregate according not only to political views but also to economic, social, and cultural situations. [32] That bubbling results in a loss of the broader community and creates the sense that, for example, children do not belong at social events unless those events were especially planned to appeal to children and to be unappealing to adults without children. [32]

Barack Obama's farewell address identified a similar concept to filter bubbles as a "threat to [Americans'] democracy," i.e., the "retreat into our own bubbles, ...especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions... And increasingly, we become so secure in our bubbles that we start accepting only information, whether it's true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there." [33]

Comparison with echo chambers

Both "echo chambers" and "filter bubbles" describe situations where individuals are exposed to a narrow range of opinions and perspectives that reinforce their existing beliefs and biases, but there are some subtle differences between the two, especially in practices surrounding social media. [34] [35]

Specific to news media, an echo chamber is a metaphorical description of a situation in which beliefs are amplified or reinforced by communication and repetition inside a closed system. [36] [37] Based on the sociological concept of selective exposure theory, the term is a metaphor based on the acoustic echo chamber, where sounds reverberate in a hollow enclosure. With regard to social media, this sort of situation feeds into explicit mechanisms of self-selected personalization, which describes all processes in which users of a given platform can actively opt in and out of information consumption, such as a user's ability to follow other users or select into groups. [38]

In an echo chamber, people are able to seek out information that reinforces their existing views, potentially as an unconscious exercise of confirmation bias. This sort of feedback regulation may increase political and social polarization and extremism. This can lead to users aggregating into homophilic clusters within social networks, which contributes to group polarization. [39] "Echo chambers" reinforce an individual's beliefs without factual support. Individuals are surrounded by those who acknowledge and follow the same viewpoints, but they also possess the agency to break outside of the echo chambers. [40]

On the other hand, filter bubbles are implicit mechanisms of pre-selected personalization, where a user's media consumption is created by personalized algorithms; the content a user sees is filtered through an AI-driven algorithm that reinforces their existing beliefs and preferences, potentially excluding contrary or diverse perspectives. In this case, users have a more passive role and are perceived as victims of a technology that automatically limits their exposure to information that would challenge their world view. [38] Some researchers argue, however, that because users still play an active role in selectively curating their own newsfeeds and information sources through their interactions with search engines and social media networks, they directly assist in the filtering process by AI-driven algorithms, thus effectively engaging in self-segregating filter bubbles. [41]
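The contrast between self-selected and pre-selected personalization can be made concrete with a small sketch. The posts, follow list, and engagement-based scoring below are hypothetical assumptions for illustration, not any platform's actual code.

```python
# Toy contrast: an echo chamber arises from the user's explicit choices,
# a filter bubble from an algorithm filtering on the user's behalf.
posts = [
    {"author": "alice", "stance": "pro"},
    {"author": "bob", "stance": "anti"},
    {"author": "carol", "stance": "pro"},
]

def self_selected_feed(posts, followed):
    """Echo chamber mechanism: only accounts the user opted to follow appear."""
    return [p for p in posts if p["author"] in followed]

def pre_selected_feed(posts, past_engagement, size=2):
    """Filter bubble mechanism: rank by predicted engagement, here approximated
    by how often the user engaged with each stance before, and show the top."""
    ranked = sorted(posts, key=lambda p: past_engagement.get(p["stance"], 0),
                    reverse=True)
    return ranked[:size]

print(self_selected_feed(posts, followed={"alice", "carol"}))            # user's own choice
print(pre_selected_feed(posts, past_engagement={"pro": 9, "anti": 1}))   # algorithm's choice
```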

Despite their differences, the usage of these terms goes hand-in-hand in both academic and platform studies. It is often hard to distinguish between the two concepts in social network studies, due to limited access to the filtering algorithms, which could perhaps enable researchers to compare and contrast the agency at work in the two concepts. [42] This type of research will continue to grow more difficult to conduct, as many social media networks have also begun to limit the API access needed for academic research. [43]

Reactions and studies

Media reactions

There are conflicting reports about the extent to which personalized filtering happens and whether such activity is beneficial or harmful. Analyst Jacob Weisberg, writing in June 2011 for Slate, did a small non-scientific experiment to test Pariser's theory which involved five associates with different ideological backgrounds conducting a series of searches, "John Boehner," "Barney Frank," "Ryan plan," and "Obamacare," and sending Weisberg screenshots of their results. The results varied only in minor respects from person to person, and any differences did not appear to be ideology-related, leading Weisberg to conclude that a filter bubble was not in effect, and to write that the idea that most internet users were "feeding at the trough of a Daily Me" was overblown. [9] Weisberg asked Google to comment, and a spokesperson stated that algorithms were in place to deliberately "limit personalization and promote variety." [9] Book reviewer Paul Boutin did a similar experiment to Weisberg's among people with differing search histories and again found that the different searchers received nearly identical search results. [6] Interviewing programmers at Google off the record, journalist Per Grankvist found that user data used to play a bigger role in determining search results but that Google, through testing, found that the search query is by far the best determinant of what results to display. [44]

There are reports that Google and other sites maintain vast "dossiers" of information on their users, which might enable them to personalize individual internet experiences further if they choose to do so. For instance, the technology exists for Google to keep track of users' histories even if they don't have a personal Google account or are not logged into one. [6] One report stated that Google had collected "10 years' worth" of information amassed from varying sources, such as Gmail, Google Maps, and other services besides its search engine, [10] [ failed verification ] although a contrary report was that trying to personalize the internet for each user was technically challenging for an internet firm to achieve despite the huge amounts of available data.[ citation needed ] Analyst Doug Gross of CNN suggested that filtered searching seemed to be more helpful for consumers than for citizens, and would help a consumer looking for "pizza" find local delivery options based on a personalized search and appropriately filter out distant pizza stores. [10] [ failed verification ] Organizations such as the Washington Post, The New York Times, and others have experimented with creating new personalized information services, with the aim of tailoring search results to those that users are likely to like or agree with. [9]

Academic studies and reactions

A scientific study from Wharton that analyzed personalized recommendations also found that these filters can create commonality, not fragmentation, in online music taste. [45] Consumers reportedly use the filters to expand their taste rather than to limit it. [45] Harvard law professor Jonathan Zittrain disputed the extent to which personalization filters distort Google search results, saying that "the effects of search personalization have been light." [9] Further, Google provides the ability for users to shut off personalization features if they choose [46] by deleting Google's record of their search history and setting Google not to remember their search keywords and visited links in the future. [6]

A study from Internet Policy Review addressed the lack of a clear and testable definition for filter bubbles across disciplines; this often results in researchers defining and studying filter bubbles in different ways. [47] Subsequently, the study explained a lack of empirical data for the existence of filter bubbles across disciplines [12] and suggested that the effects attributed to them may stem more from preexisting ideological biases than from algorithms. Similar views can be found in other academic projects, which also address concerns with the definitions of filter bubbles and the relationships between ideological and technological factors associated with them. [48] A critical review of filter bubbles suggested that "the filter bubble thesis often posits a special kind of political human who has opinions that are strong, but at the same time highly malleable" and that it is a "paradox that people have an active agency when they select content but are passive receivers once they are exposed to the algorithmically curated content recommended to them." [49]

A study by Oxford, Stanford, and Microsoft researchers examined the browsing histories of 1.2 million U.S. users of the Bing Toolbar add-on for Internet Explorer between March and May 2013. They selected 50,000 of those users who were active news consumers, then classified whether the news outlets they visited were left- or right-leaning, based on whether the majority of voters in the counties associated with user IP addresses voted for Obama or Romney in the 2012 presidential election. They then identified whether news stories were read after accessing the publisher's site directly, via the Google News aggregation service, web searches, or social media. The researchers found that while web searches and social media do contribute to ideological segregation, the vast majority of online news consumption consisted of users directly visiting left- or right-leaning mainstream news sites and consequently being exposed almost exclusively to views from a single side of the political spectrum. Limitations of the study included selection issues such as Internet Explorer users skewing higher in age than the general internet population; Bing Toolbar usage and the voluntary (or unknowing) sharing of browsing history selecting for users who are less concerned about privacy; the assumption that all stories in left-leaning publications are left-leaning, and the same for right-leaning; and the possibility that users who are not active news consumers may get most of their news via social media, and thus experience stronger effects of social or algorithmic bias than those users who essentially self-select their bias through their choice of news publications (assuming they are aware of the publications' biases). [50]
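The study's basic accounting can be sketched as follows: label each visited outlet by lean, group visits by the channel that led there, and compare how one-sided consumption is per channel. The outlet labels, visit records, and the simple one-sidedness measure are illustrative stand-ins, not the study's dataset or its segregation index.

```python
# Simplified sketch of channel-level ideological one-sidedness.
from collections import defaultdict

OUTLET_LEAN = {"outlet_a": "left", "outlet_b": "right", "outlet_c": "left"}

visits = [
    {"user": "u1", "outlet": "outlet_a", "channel": "direct"},
    {"user": "u1", "outlet": "outlet_a", "channel": "social"},
    {"user": "u2", "outlet": "outlet_b", "channel": "direct"},
    {"user": "u2", "outlet": "outlet_c", "channel": "search"},
]

def one_sidedness_by_channel(visits):
    """For each channel, the share of visits going to the majority lean:
    1.0 means perfectly one-sided exposure, 0.5 means an even split."""
    counts = defaultdict(lambda: {"left": 0, "right": 0})
    for v in visits:
        counts[v["channel"]][OUTLET_LEAN[v["outlet"]]] += 1
    return {
        channel: max(c.values()) / (c["left"] + c["right"])
        for channel, c in counts.items()
    }

print(one_sidedness_by_channel(visits))
# {'direct': 0.5, 'social': 1.0, 'search': 1.0} for this tiny toy sample
```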

A study by Princeton University and New York University researchers aimed to study the impact of the filter bubble and algorithmic filtering on social media polarization. They used a mathematical model called the "stochastic block model" to test their hypothesis on the environments of Reddit and Twitter. The researchers gauged changes in polarization in regularized social media networks and non-regularized networks, specifically measuring the percent changes in polarization and disagreement on Reddit and Twitter. They found that polarization increased by 400% in non-regularized networks, while in regularized networks polarization increased by 4% and disagreement by 5%. [51]
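The stochastic block model named above generates a network in which the probability of a tie depends on group membership, so a low cross-group probability yields few cross-cutting connections. The sketch below samples a two-block version and reports the share of edges that cross blocks; that proxy and the parameter values are illustrative assumptions, not the paper's model or its polarization measure.

```python
# Minimal two-community stochastic block model and a crude cross-cutting proxy.
import random

def sample_sbm(n_per_block, p_in, p_out, seed=0):
    """Nodes in the same block link with probability p_in, across blocks with p_out."""
    random.seed(seed)
    nodes = [(i, i // n_per_block) for i in range(2 * n_per_block)]  # (id, block)
    edges = []
    for idx, (u, bu) in enumerate(nodes):
        for v, bv in nodes[idx + 1:]:
            if random.random() < (p_in if bu == bv else p_out):
                edges.append(((u, bu), (v, bv)))
    return edges

def cross_block_share(edges):
    """Share of edges connecting different blocks (higher = less segregated)."""
    cross = sum(1 for (_, bu), (_, bv) in edges if bu != bv)
    return cross / len(edges) if edges else 0.0

segregated = sample_sbm(50, p_in=0.20, p_out=0.01)   # few cross-cutting ties
mixed = sample_sbm(50, p_in=0.20, p_out=0.10)        # more cross-cutting ties
print(cross_block_share(segregated), cross_block_share(mixed))
```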

Platform studies

While algorithms do limit political diversity, some of the filter bubble is the result of user choice. [52] A study by data scientists at Facebook found that users have one friend with contrasting views for every four Facebook friends who share an ideology. [53] [54] No matter what Facebook's algorithm for its News Feed is, people are more likely to befriend or follow people who share similar beliefs. [53] The nature of the algorithm is that it ranks stories based on a user's history, resulting in a reduction of the "politically cross-cutting content by 5 percent for conservatives and 8 percent for liberals." [53] However, even when people are given the option to click on a link offering contrasting views, they still default to their most viewed sources. [53] "[U]ser choice decreases the likelihood of clicking on a cross-cutting link by 17 percent for conservatives and 6 percent for liberals." [53] A cross-cutting link is one that introduces a different point of view than the user's presumed point of view, or what the website has pegged as the user's beliefs. [55]

A study by Levi Boxell, Matthew Gentzkow, and Jesse M. Shapiro suggests that online media is not the driving force for political polarization. [56] The paper argues that polarization has been driven by the demographic groups that spend the least time online. The greatest ideological divide is experienced among Americans older than 75, of whom only 20% reported using social media as of 2012. In contrast, 80% of Americans aged 18–39 reported using social media as of 2012. The data suggests that the younger demographic was not any more polarized in 2012 than it had been when online media barely existed in 1996. The study highlights differences between age groups and how news consumption remains polarized as people seek information that appeals to their preconceptions. Older Americans usually remain stagnant in their political views as traditional media outlets continue to be a primary source of news, while online media is the leading source for the younger demographic. Although algorithms and filter bubbles weaken content diversity, this study reveals that political polarization trends are primarily driven by pre-existing views and a failure to recognize outside sources.

A 2020 study from Germany utilized the Big Five personality model to test the effects of individual personality, demographics, and ideologies on user news consumption. [57] Basing their study on the notion that the number of news sources that users consume impacts their likelihood of being caught in a filter bubble, with higher media diversity lessening the chances, their results suggest that certain demographics (higher age and male) along with certain personality traits (high openness) correlate positively with the number of news sources consumed by individuals. The study also found a negative ideological association between media diversity and the degree to which users align with right-wing authoritarianism. Beyond offering different individual user factors that may influence the role of user choice, this study also raises questions and associations between the likelihood of users being caught in filter bubbles and user voting behavior. [57]

The Facebook study found that it was "inconclusive" whether or not the algorithm played as big a role in filtering News Feeds as people assumed. [58] The study also found that "individual choice," or confirmation bias, likewise affected what gets filtered out of News Feeds. [58] Some social scientists criticized this conclusion because the point of protesting the filter bubble is that the algorithms and individual choice work together to filter out News Feeds. [59] They also criticized Facebook's small sample size, which is about "9% of actual Facebook users," and the fact that the study results are "not reproducible" because the study was conducted by "Facebook scientists" who had access to data that Facebook does not make available to outside researchers. [60]

Though the study found that only about 15–20% of the average user's Facebook friends subscribe to the opposite side of the political spectrum, Julia Kaman from Vox theorized that this could have potentially positive implications for viewpoint diversity. These "friends" are often acquaintances with whom we would not likely share our politics without the internet. Facebook may foster a unique environment where a user sees and possibly interacts with content posted or re-posted by these "second-tier" friends. The study found that "24 percent of the news items liberals saw were conservative-leaning and 38 percent of the news conservatives saw was liberal-leaning." [61] "Liberals tend to be connected to fewer friends who share information from the other side, compared with their conservative counterparts." [62] This interplay has the ability to provide diverse information and sources that could moderate users' views.

Similarly, a study of Twitter's filter bubbles by New York University concluded that "Individuals now have access to a wider span of viewpoints about news events, and most of this information is not coming through the traditional channels, but either directly from political actors or through their friends and relatives. Furthermore, the interactive nature of social media creates opportunities for individuals to discuss political events with their peers, including those with whom they have weak social ties." [63] According to these studies, social media may be diversifying information and opinions users come into contact with, though there is much speculation around filter bubbles and their ability to create deeper political polarization.

One driver and possible solution to the problem is the role of emotions in online content. A 2018 study shows that different emotions of messages can lead to polarization or convergence: joy is prevalent in emotional polarization, while sadness and fear play significant roles in emotional convergence. [64] Since it is relatively easy to detect the emotional content of messages, these findings can help to design more socially responsible algorithms by starting to focus on the emotional content of algorithmic recommendations.
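As a rough illustration of how such findings might inform a recommender, the sketch below adjusts ranking scores using detected message emotions. The emotion sets, weights, and scoring rule are hypothetical assumptions, not the 2018 study's method or any platform's algorithm.

```python
# Hypothetical emotion-aware re-ranking: dampen content whose emotions are
# associated with polarization, slightly boost those associated with convergence.
POLARIZING_EMOTIONS = {"joy"}              # associated with polarization in the study
CONVERGING_EMOTIONS = {"sadness", "fear"}  # associated with convergence

def adjusted_score(base_score, emotions, penalty=0.3, bonus=0.1):
    score = base_score
    if emotions & POLARIZING_EMOTIONS:
        score *= 1 - penalty
    if emotions & CONVERGING_EMOTIONS:
        score *= 1 + bonus
    return score

items = [
    {"id": 1, "base": 0.9, "emotions": {"joy"}},
    {"id": 2, "base": 0.7, "emotions": {"sadness"}},
]
reranked = sorted(items, key=lambda i: adjusted_score(i["base"], i["emotions"]),
                  reverse=True)
print([i["id"] for i in reranked])   # [2, 1]: the sadness-tagged item now ranks first
```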

Visualization of the process and growth of two social media bots used in the 2019 Weibo study. The diagrams represent two aspects of the structure of filter bubbles, according to the study: large concentrations of users around single topics and a uni-directional, star-like structure that impacts key information flows.

Social bots have been utilized by different researchers to test polarization and related effects that are attributed to filter bubbles and echo chambers. [65] [66] A 2018 study used social bots on Twitter to test deliberate user exposure to partisan viewpoints. [65] The study claimed it demonstrated partisan differences between exposure to differing views, although it warned that the findings should be limited to party-registered American Twitter users. One of the main findings was that after exposure to differing views (provided by the bots), self-registered Republicans became more conservative, whereas self-registered liberals showed less ideological change, if any at all. A different study from the People's Republic of China utilized social bots on Weibo, the largest social media platform in China, to examine the structure of filter bubbles with regard to their effects on polarization. [66] The study draws a distinction between two conceptions of polarization: one in which people with similar views form groups, share similar opinions, and block themselves from differing viewpoints (opinion polarization), and one in which people do not access diverse content and sources of information (information polarization). By utilizing social bots instead of human volunteers and focusing more on information polarization than on opinion polarization, the researchers concluded that there are two essential elements of a filter bubble: a large concentration of users around a single topic and a uni-directional, star-like structure that impacts key information flows.
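Those two structural elements can be expressed as simple graph statistics: how concentrated users are around one topic, and how star-like (hub-dominated) the network of information flow is. The proxies below are standard simplifications chosen for illustration, not the Weibo study's exact measurements.

```python
# Two toy measures of filter-bubble structure: topic concentration and
# Freeman-style degree centralization (1.0 for a perfect star graph).
def topic_concentration(user_topics):
    """Share of users attached to the single most common topic."""
    counts = {}
    for topic in user_topics.values():
        counts[topic] = counts.get(topic, 0) + 1
    return max(counts.values()) / len(user_topics)

def degree_centralization(adjacency):
    """0 when every node has equal degree, 1.0 for a perfect star."""
    degrees = [len(neighbours) for neighbours in adjacency.values()]
    n, d_max = len(degrees), max(degrees)
    denominator = (n - 1) * (n - 2)
    return sum(d_max - d for d in degrees) / denominator if denominator else 0.0

users = {"u1": "topicA", "u2": "topicA", "u3": "topicA", "u4": "topicB"}
star = {"hub": ["a", "b", "c"], "a": ["hub"], "b": ["hub"], "c": ["hub"]}
print(topic_concentration(users))     # 0.75: most users cluster on one topic
print(degree_centralization(star))    # 1.0: information flows through one hub
```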

In June 2018, the search engine DuckDuckGo conducted a research study of personalization on Google Search. For this study, 87 adults in various locations around the continental United States googled three keywords at the same time: immigration, gun control, and vaccinations. Even in private browsing mode, most people saw results unique to them. Google included certain links for some participants that it did not include for others, and the News and Videos infoboxes showed significant variation. Google publicly disputed these results, saying that Search Engine Results Page (SERP) personalization is mostly a myth. Google Search Liaison Danny Sullivan stated that "Over the years, a myth has developed that Google Search personalizes so much that for the same query, different people might get significantly different results from each other. This isn't the case. Results can differ, but usually for non-personalized reasons." [67]
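Variation of this kind can be quantified by comparing the links each participant saw for the same query. The sketch below uses a simple Jaccard overlap on sets of result links; the measure and the example URLs are illustrative assumptions (the study described above also examined differences in result ordering and infoboxes).

```python
# Toy comparison of two users' search result pages for the same query.
def jaccard(a, b):
    """Overlap of two link sets: 1.0 identical, 0.0 completely different."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

serp_user1 = ["site1.example/gun-control", "site2.example/poll", "site3.example/oped"]
serp_user2 = ["site1.example/gun-control", "site4.example/blog", "site5.example/news"]

print(f"overlap: {jaccard(serp_user1, serp_user2):.2f}")   # 0.20 -> largely different pages
```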

When filter bubbles are in place, they can create specific moments that scientists call 'Whoa' moments. A 'Whoa' moment is when an article, ad, post, etc., appears on a user's computer that relates to a current action or the current use of an object. Scientists coined this term after a young woman, whose daily routine included drinking coffee, opened her computer and noticed an advertisement for the same brand of coffee that she was drinking: "Sat down and opened up Facebook this morning while having my coffee, and there they were two ads for Nespresso. Kind of a 'whoa' moment when the product you're drinking pops up on the screen in front of you." [68] "Whoa" moments occur when people are "found," meaning that advertising algorithms target specific users based on their "click behavior" in order to increase their sales revenue.

Several designers have developed tools to counteract the effects of filter bubbles (see § Countermeasures). [69] Swiss radio station SRF voted the word filterblase (the German translation of filter bubble) word of the year 2016. [70]

Countermeasures

By individuals

In The Filter Bubble: What the Internet Is Hiding from You, [71] internet activist Eli Pariser highlights how the increasing occurrence of filter bubbles further emphasizes the value of one's bridging social capital as defined by Robert Putnam. Pariser argues that filter bubbles reinforce a sense of social homogeneity, which weakens ties between people with potentially diverging interests and viewpoints. [72] In that sense, high bridging capital may promote social inclusion by increasing our exposure to a space that goes beyond self-interests. Fostering one's bridging capital, such as by connecting with more people in an informal setting, may be an effective way to reduce the filter bubble phenomenon.

Users can take many actions to burst through their filter bubbles, for example by making a conscious effort to evaluate what information they are exposing themselves to, and by thinking critically about whether they are engaging with a broad range of content. [73] Users can consciously avoid news sources that are unverifiable or weak. Chris Glushko, the VP of Marketing at IAB, advocates using fact-checking sites to identify fake news. [74] Technology can also play a valuable role in combating filter bubbles. [75]

Some browser plug-ins aim to help people step out of their filter bubbles and make them aware of their personal perspectives; these tools show users content that contradicts their existing beliefs and opinions. In addition to plug-ins, there are apps created with the mission of encouraging users to open up their echo chambers. News apps such as Read Across the Aisle nudge users to read different perspectives if their reading pattern is biased towards one side or ideology. [76] Although apps and plug-ins are tools humans can use, Eli Pariser stated "certainly, there is some individual responsibility here to really seek out new sources and people who aren't like you." [52]

Since web-based advertising can further the effect of filter bubbles by exposing users to more of the same content, users can block much advertising by deleting their search history, turning off targeted ads, and downloading browser extensions. Some use anonymous or non-personalized search engines such as YaCy, DuckDuckGo, Qwant, Startpage.com, Disconnect, and Searx in order to prevent companies from gathering their web-search data. The Swiss daily Neue Zürcher Zeitung is beta-testing a personalized news engine app which uses machine learning to guess what content a user is interested in, while "always including an element of surprise"; the idea is to mix in stories which a user is unlikely to have followed in the past. [77]
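A minimal sketch of that "element of surprise" idea is shown below: reserve a few slots in an otherwise personalized feed for items the profile would normally rank low. The slot split, affinity scores, and random selection are assumptions for illustration, not the actual machine-learning model behind the app.

```python
# Toy serendipity injection: mostly profile-fitted items, plus a few surprises.
import random
random.seed(0)   # deterministic toy data

def feed_with_serendipity(candidates, profile_score, size=10, surprise_slots=2):
    ranked = sorted(candidates, key=profile_score, reverse=True)
    personalized = ranked[: size - surprise_slots]   # best fits for the profile
    long_tail = ranked[size - surprise_slots:]       # items the profile would hide
    surprises = random.sample(long_tail, min(surprise_slots, len(long_tail)))
    return personalized + surprises

articles = [{"id": i, "affinity": random.random()} for i in range(30)]
feed = feed_with_serendipity(articles, profile_score=lambda a: a["affinity"])
print([a["id"] for a in feed])   # eight high-affinity items plus two surprises
```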

The European Union is taking measures to lessen the effect of the filter bubble. The European Parliament is sponsoring inquiries into how filter bubbles affect people's ability to access diverse news. [78] Additionally, it introduced a program aimed at educating citizens about social media. [79] In the U.S., the CSCW panel suggests the use of news aggregator apps to broaden media consumers' news intake. News aggregator apps scan all current news articles and direct readers to different viewpoints regarding a certain topic. Users can also use a diversely-aware news balancer, which visually shows the media consumer whether they lean left or right when reading the news, indicating a right lean with a bigger red bar and a left lean with a bigger blue bar. A study evaluating this news balancer found "a small but noticeable change in reading behavior, toward more balanced exposure, among users seeing the feedback, as compared to a control group". [80]
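A tiny sketch of how such a balance indicator could work is given below: tally the lean of the articles a user has read and render a proportional bar. The lean labels and the text-based bar are illustrative assumptions, not the implementation of the tool evaluated in the cited study.

```python
# Toy reading-balance indicator: negative = left-leaning diet, positive = right-leaning.
def lean_balance(read_articles):
    """Returns a value in [-1, 1]: -1 all left-leaning, +1 all right-leaning."""
    score = sum(1 if a["lean"] == "right" else -1 for a in read_articles)
    return score / len(read_articles) if read_articles else 0.0

def render_bar(balance, width=20):
    right = round((balance + 1) / 2 * width)
    return "blue " + "L" * (width - right) + "R" * right + " red"

history = [{"lean": "left"}, {"lean": "left"}, {"lean": "right"}]
balance = lean_balance(history)
print(f"balance={balance:+.2f}", render_bar(balance))   # balance=-0.33, bar skews blue/left
```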

By media companies

In light of recent concerns about information filtering on social media, Facebook acknowledged the presence of filter bubbles and has taken strides toward removing them. [81] In January 2017, Facebook removed personalization from its Trending Topics list in response to problems with some users not seeing highly talked-about events there. [82] Facebook's strategy is to reverse the Related Articles feature that it had implemented in 2013, which would post related news stories after the user read a shared article. Now, the revamped strategy would flip this process and post articles from different perspectives on the same topic. Facebook is also attempting to go through a vetting process whereby only articles from reputable sources will be shown. Along with the founder of Craigslist and a few others, Facebook has invested $14 million into efforts "to increase trust in journalism around the world, and to better inform the public conversation". [81] The idea is that even if people are only reading posts shared from their friends, at least these posts will be credible.

Similarly, as of January 30, 2018, Google has also acknowledged the existence of filter bubble difficulties within its platform. Because current Google searches pull algorithmically ranked results based upon "authoritativeness" and "relevancy", which show and hide certain search results, Google is seeking to combat this. By training its search engine to recognize the intent of a search inquiry rather than the literal syntax of the question, Google is attempting to limit the size of filter bubbles. The initial phase of this training was planned for introduction in the second quarter of 2018. Questions that involve bias and/or controversial opinions will not be addressed until a later time, leaving unresolved a larger problem that still exists: whether the search engine acts as an arbiter of truth or as a knowledgeable guide by which to make decisions. [83]

In April 2017, news surfaced that Facebook, Mozilla, and Craigslist had contributed the majority of a $14M donation to CUNY's "News Integrity Initiative," aimed at eliminating fake news and creating more honest news media. [84]

Later, in August, Mozilla, makers of the Firefox web browser, announced the formation of the Mozilla Information Trust Initiative (MITI). The MITI would serve as a collective effort to develop products, research, and community-based solutions to combat the effects of filter bubbles and the proliferation of fake news. Mozilla's Open Innovation team leads the initiative, striving to combat misinformation, with a specific focus on products addressing literacy, research, and creative interventions. [85]

Ethical implications

As the popularity of cloud services increases, personalized algorithms used to construct filter bubbles are expected to become more widespread. [86] Scholars have begun considering the effect of filter bubbles on the users of social media from an ethical standpoint, particularly concerning the areas of personal freedom, security, and information bias. [87] Filter bubbles in popular social media and personalized search sites can determine the particular content seen by users, often without their direct consent or cognizance, [86] due to the algorithms used to curate that content. Self-created content manifested from behavior patterns can lead to partial information blindness. [88] Critics of the use of filter bubbles speculate that individuals may lose autonomy over their own social media experience and have their identities socially constructed as a result of the pervasiveness of filter bubbles. [86]

Technologists, social media engineers, and computer specialists have also examined the prevalence of filter bubbles. [89] Mark Zuckerberg, founder of Facebook, and Eli Pariser, author of The Filter Bubble, have expressed concerns regarding the risks of privacy and information polarization. [90] [91] The information of the users of personalized search engines and social media platforms is not private, though some people believe it should be. [90] The concern over privacy has resulted in a debate as to whether or not it is moral for information technologists to take users' online activity and manipulate future exposure to related information. [91]

Some scholars have expressed concerns regarding the effects of filter bubbles on individual and social well-being, e.g., the dissemination of health information to the general public and the potential of internet search engines to alter health-related behavior. [16] [17] [18] [92] A 2019 multi-disciplinary book reported research and perspectives on the roles filter bubbles play in regards to health misinformation. [18] Drawing from various fields such as journalism, law, medicine, and health psychology, the book addresses different controversial health beliefs (e.g. alternative medicine and pseudoscience) as well as potential remedies to the negative effects of filter bubbles and echo chambers on different topics in health discourse. A 2016 study on the potential effects of filter bubbles on search engine results related to suicide found that algorithms play an important role in whether or not helplines and similar search results are displayed to users and discussed the implications their research may have for health policies. [17] Another 2016 study, from the Croatian Medical Journal, proposed some strategies for mitigating the potentially harmful effects of filter bubbles on health information, such as: informing the public more about filter bubbles and their associated effects, users choosing to try alternative [to Google] search engines, and more explanation of the processes search engines use to determine their displayed results. [16]

Since the content seen by individual social media users is influenced by algorithms that produce filter bubbles, users of social media platforms are more susceptible to confirmation bias, [93] and may be exposed to biased, misleading information. [94] Social sorting and other unintentional discriminatory practices are also anticipated as a result of personalized filtering. [95]

In light of the 2016 U.S. presidential election, scholars have likewise expressed concerns about the effect of filter bubbles on democracy and democratic processes, as well as the rise of "ideological media". [11] These scholars fear that users will be unable to "[think] beyond [their] narrow self-interest" as filter bubbles create personalized social feeds, isolating them from diverse points of view and their surrounding communities. [96] For this reason, an increasingly discussed possibility is to design social media with more serendipity, that is, to proactively recommend content that lies outside one's filter bubble, including challenging political information and, eventually, to provide empowering filters and tools to users. [97] [98] [99] A related concern is in fact how filter bubbles contribute to the proliferation of "fake news" and how this may influence political leaning, including how users vote. [11] [100] [101]

Revelations in March 2018 of Cambridge Analytica's harvesting and use of user data from at least 87 million Facebook profiles during the 2016 presidential election highlight the ethical implications of filter bubbles. [102] Christopher Wylie, a co-founder of and whistleblower from Cambridge Analytica, detailed how the firm had the ability to develop "psychographic" profiles of those users and use the information to shape their voting behavior. [103] Access to user data by third parties such as Cambridge Analytica can exacerbate and amplify existing filter bubbles users have created, artificially increasing existing biases and further dividing societies.

Dangers

Filter bubbles have stemmed from a surge in media personalization, which can trap users. The use of AI to personalize offerings can lead to users viewing only content that reinforces their own viewpoints without challenging them. Social media websites like Facebook may also present content in a way that makes it difficult for users to determine the source of the content, leading them to decide for themselves whether the source is reliable or fake. [104] That can lead to people becoming used to hearing what they want to hear, which can cause them to react more radically when they see an opposing viewpoint. The filter bubble may cause the person to see any opposing viewpoints as incorrect and so could allow the media to force views onto consumers. [105] [104] [106]

Researchers explain that the filter bubble reinforces what one is already thinking. [107] This is why it is extremely important to utilize resources that offer various points of view. [107]

See also

Notes

  1. The term cyber-balkanization (sometimes with a hyphen) is a hybrid of cyber, relating to the internet, and Balkanization , referring to that region of Europe that was historically subdivided by languages, religions and cultures; the term was coined in a paper by MIT researchers Van Alstyne and Brynjolfsson.

Related Research Articles

Google Search – Search engine from Google

Google Search is a search engine operated by Google. It allows users to search for information on the Internet by entering keywords or phrases. Google Search uses algorithms to analyze and rank websites based on their relevance to the search query. It is the most popular search engine worldwide.

Political polarization is the divergence of political attitudes away from the center, towards ideological extremes. Scholars distinguish between ideological polarization and affective polarization.

A recommender system, or a recommendation system, is a subclass of information filtering system that provides suggestions for items that are most pertinent to a particular user. Recommender systems are particularly useful when an individual needs to choose an item from a potentially overwhelming number of items that a service may offer.

Personalization consists of tailoring a service or product to accommodate specific individuals. It is sometimes tied to groups or segments of individuals. Personalization involves collecting data on individuals, including web browsing history, web cookies, and location. Various organizations use personalization to improve customer satisfaction, digital sales conversion, marketing results, branding, and improved website metrics as well as for advertising. Personalization acts as a key element in social media and recommender systems. Personalization influences every sector of society — be it work, leisure, or citizenship.

News aggregator – Client software that aggregates syndicated web content

In computing, a news aggregator, also termed a feed aggregator, content aggregator, feed reader, news reader, or simply an aggregator, is client software or a web application that aggregates digital content such as online newspapers, blogs, podcasts, and video blogs (vlogs) in one location for easy viewing. The updates distributed may include journal tables of contents, podcasts, videos, and news items.

Eli Pariser – Author, activist, and entrepreneur

Eli Pariser is an author, activist, and entrepreneur. He has stated that his focus is "how to make technology and media serve democracy". He became executive director of MoveOn.org in 2004, where he helped pioneer the practice of online citizen engagement. He is the co-founder of Upworthy, a website for meaningful viral content, and Avaaz, a global citizen's organization. His bestselling book, The Filter Bubble: What the Internet Is Hiding from You, introduced the term “filter bubble” to the lexicon. He is currently an Omidyar Fellow at New America and co-directs New_ Public.

Search engine – Software system for finding relevant information on the Web

A search engine is a software system that provides hyperlinks to web pages and other relevant information on the Web in response to a user's query. The user inputs a query within a web browser or a mobile app, and the search results are often a list of hyperlinks, accompanied by textual summaries and images. Users also have the option of limiting the search to a specific type of results, such as images, videos, or news.

Google Personalized Search is a personalized search feature of Google Search, introduced in 2004. All searches on Google Search are associated with a browser cookie record. When a user performs a search, the search results are not only based on the relevance of each web page to the search term, but also on which websites the user visited through previous search results. This provides a more personalized experience that can increase the relevance of the search results for the particular user. Such filtering may also have side effects, such as the creation of a filter bubble.

Social media – Virtual online communities

Social media are interactive technologies that facilitate the creation, sharing and aggregation of content amongst virtual communities and networks.

Social search is a behavior of retrieving and searching on a social searching engine that mainly searches user-generated content such as news, videos, and images related to search queries on social media like Facebook, LinkedIn, Twitter, Instagram, and Flickr. It is an enhanced version of web search that combines traditional algorithms. The idea behind social search is that instead of ranking search results purely based on semantic relevance between a query and the results, a social search system also takes into account social relationships between the results and the searcher. The social relationships could be in various forms. For example, in the LinkedIn people search engine, the social relationships include social connections between the searcher and each result, whether or not they are in the same industries, work for the same companies, belong to the same social groups, and went to the same schools, etc.

Internet censorship – Legal control of the internet

Internet censorship is the legal control or suppression of what can be accessed, published, or viewed on the Internet. Censorship is most often applied to specific internet domains but exceptionally may extend to all Internet resources located outside the jurisdiction of the censoring state. Internet censorship may also put restrictions on what information can be made internet accessible. Organizations providing internet access – such as schools and libraries – may choose to preclude access to material that they consider undesirable, offensive, age-inappropriate or even illegal, and regard this as ethical behavior rather than censorship. Individuals and organizations may engage in self-censorship of material they publish, for moral, religious, or business reasons, to conform to societal norms, political views, due to intimidation, or out of fear of legal or other consequences.

Echo chamber (media) – Situation that reinforces beliefs by repetition inside a closed system

In news media and social media, an echo chamber is an environment or ecosystem in which participants encounter beliefs that amplify or reinforce their preexisting beliefs by communication and repetition inside a closed system and insulated from rebuttal. An echo chamber circulates existing views without encountering opposing views, potentially resulting in confirmation bias. Echo chambers may increase social and political polarization and extremism. On social media, it is thought that echo chambers limit exposure to diverse perspectives, and favor and reinforce presupposed narratives and ideologies.

Search neutrality is a principle that search engines should have no editorial policies other than that their results be comprehensive, impartial and based solely on relevance. This means that when a user types in a search engine query, the engine should return the most relevant results found in the provider's domain, without manipulating the order of the results, excluding results, or in any other way manipulating the results to a certain bias.

Personalized search is a web search tailored specifically to an individual's interests by incorporating information about the individual beyond the specific query provided. There are two general approaches to personalizing search results, involving modifying the user's query and re-ranking search results.

EdgeRank is the name commonly given to the algorithm that Facebook uses to determine what articles should be displayed in a user's News Feed. As of 2011, Facebook has stopped using the EdgeRank system and uses a machine learning algorithm that, as of 2013, takes more than 100,000 factors into account.

Facebook Graph Search – Semantic search engine by Facebook

Facebook Graph Search was a semantic search engine that Facebook introduced in March 2013. It was designed to give answers to user natural language queries rather than a list of links. The name refers to the social graph nature of Facebook, which maps the relationships among users. The Graph Search feature combined the big data acquired from its over one billion users and external data into a search engine providing user-specific search results. In a presentation headed by Facebook CEO Mark Zuckerberg, it was announced that the Graph Search algorithm finds information from within a user's network of friends. Microsoft's Bing search engine provided additional results. In July it was made available to all users using the U.S. English version of Facebook. After being made less publicly visible starting December 2014, the original Graph Search was almost entirely deprecated in June 2019.

Internet manipulation is the co-optation of online digital technologies, including algorithms, social bots, and automated scripts, for commercial, social, military, or political purposes. Internet and social media manipulation are the prime vehicles for spreading disinformation due to the importance of digital platforms for media consumption and everyday communication. When employed for political purposes, internet manipulation may be used to steer public opinion, polarise citizens, circulate conspiracy theories, and silence political dissidents. Internet manipulation can also be done for profit, for instance, to harm corporate or political adversaries and improve brand reputation. Internet manipulation is sometimes also used to describe the selective enforcement of Internet censorship or selective violations of net neutrality.

Feed (Facebook) – Feature of the social network Facebook

Facebook's Feed, formerly known as the News Feed, is a web feed feature for the social network. The feed is the primary system through which users are exposed to content posted on the network. Feed highlights information that includes profile changes, upcoming events, and birthdays, among other updates. Using a proprietary method, Facebook selects a handful of updates to show users every time they visit their feed, out of an average of 2,000 updates they can potentially receive. Over two billion people use Facebook every month, making the network's Feed the most viewed and most influential aspect of the news industry. The feature, introduced in 2006, was renamed "Feed" in 2022.

Media pluralism – Plurality of voices, opinions, and analyses in media systems

Media pluralism defines the state of having a plurality of voices, opinions, and analyses in media systems, or the coexistence of different and diverse types of media and media support.

Algorithmic radicalization is the concept that recommender algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading to them developing radicalized extremist political views. Algorithms record user interactions, from likes/dislikes to amount of time spent on posts, to generate endless media aimed to keep users engaged. Through echo chamber channels, the consumer is driven to be more polarized through preferences in media and self-confirmation.

References

  1. Technopedia, Definition – What does Filter Bubble mean? Archived 2017-10-10 at the Wayback Machine , Retrieved October 10, 2017, "....A filter bubble is the intellectual isolation, that can occur when websites make use of algorithms to selectively assume the information a user would want to see, and then give information to the user according to this assumption ... A filter bubble, therefore, can cause users to get significantly less contact with contradicting viewpoints, causing the user to become intellectually isolated...."
  2. Bozdag, Engin (September 2013). "Bias in algorithmic filtering and personalization". Ethics and Information Technology. 15 (3): 209–227. doi:10.1007/s10676-013-9321-6. S2CID   14970635.
  3. The Huffington Post, "Are Filter-bubbles Shrinking Our Minds?" Archived 2016-11-03 at the Wayback Machine
  4. Encrypt, Search (February 26, 2019). "What Are Filter Bubbles & How To Avoid Them". Search Encrypt Blog. Archived from the original on February 25, 2019. Retrieved March 19, 2019.
  5. Kitchens, Brent; Johnson, Steve L.; Gray, Peter (December 1, 2020). "Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts in News Consumption". MIS Quarterly. 44 (4): 1619–1649. doi:10.25300/MISQ/2020/16371. S2CID   229294134.
  6. Boutin, Paul (May 20, 2011). "Your Results May Vary: Will the information superhighway turn into a cul-de-sac because of automated filters?". The Wall Street Journal. Archived from the original on April 5, 2015. Retrieved August 15, 2011. By tracking individual Web browsers with cookies, Google has been able to personalize results even for users who don't create a personal Google account or are not logged into one. ...
  7. Zhang, Yuan Cao; Séaghdha, Diarmuid Ó; Quercia, Daniele; Jambor, Tamas (2012). "Auralist: Introducing serendipity into music recommendation". Proceedings of the fifth ACM international conference on Web search and data mining. pp. 13–22. doi:10.1145/2124295.2124300. ISBN   9781450307475. S2CID   2956587.
  8. Parramore, Lynn (October 10, 2010). "The Filter Bubble". The Atlantic. Archived from the original on August 22, 2017. Retrieved April 20, 2011. Since December 4, 2009, Google has been personalized for everyone. So when I had two friends this spring Google "BP," one of them got a set of links that was about investment opportunities in BP. The other one got information about the oil spill....
  9. Weisberg, Jacob (June 10, 2011). "Bubble Trouble: Is Web personalization turning us into solipsistic twits?". Slate. Archived from the original on June 12, 2011. Retrieved August 15, 2011.
  10. Gross, Doug (May 19, 2011). "What the Internet is hiding from you". CNN. Archived from the original on April 9, 2016. Retrieved August 15, 2011. I had friends Google BP when the oil spill was happening. These are two women who were quite similar in a lot of ways. One got a lot of results about the environmental consequences of what was happening and the spill. The other one just got investment information and nothing about the spill at all.
  11. Baer, Drake. "The 'Filter Bubble' Explains Why Trump Won and You Didn't See It Coming". Science of Us. Archived from the original on April 19, 2017. Retrieved April 19, 2017.
  12. DiFranzo, Dominic; Gloria-Garcia, Kristine (April 5, 2017). "Filter bubbles and fake news". XRDS. 23 (3): 32–35. doi:10.1145/3055153. S2CID 7069187.
  13. Jasper Jackson (January 8, 2017). "Eli Pariser: activist whose filter bubble warnings presaged Trump and Brexit: Upworthy chief warned about dangers of the internet's echo chambers five years before 2016's votes". The Guardian. Archived from the original on March 7, 2017. Retrieved March 3, 2017. ..."If you only see posts from folks who are like you, you're going to be surprised when someone very unlike you wins the presidency," Pariser tells The Guardian....
  14. El-Bermawy, Mostafa M. (November 18, 2016). "Your Filter Bubble is Destroying Democracy". Wired. Archived from the original on March 9, 2017. Retrieved March 3, 2017. ...The global village that was once the internet ... digital islands of isolation that are drifting further apart each day ... your experience online grows increasingly personalized ...
  15. Baer, Drake (November 9, 2016). "The 'Filter Bubble' Explains Why Trump Won and You Didn't See It Coming". New York Magazine. Archived from the original on February 26, 2017. Retrieved March 3, 2017. ...Trump's victory is blindsiding ... because, as media scholars understand it, we increasingly live in a "filter bubble": The information we take in is so personalized that we're blind to other perspectives....
  16. Holone, Harald (June 2016). "The filter bubble and its effect on online personal health information". Croatian Medical Journal. 57 (3): 298–301. doi:10.3325/cmj.2016.57.298. PMC   4937233 . PMID   27374832.
  17. Haim, Mario; Arendt, Florian; Scherr, Sebastian (February 2017). "Abyss or Shelter? On the Relevance of Web Search Engines' Search Results When People Google for Suicide". Health Communication. 32 (2): 253–258. doi:10.1080/10410236.2015.1113484. PMID   27196394. S2CID   3399012.
  18. "Medical Misinformation and Social Harm in Non-Science Based Health Practices: A Multidisciplinary Perspective". CRC Press. Archived from the original on August 4, 2020. Retrieved April 22, 2020.
  19. Lazar, Shira (June 1, 2011). "Algorithms and the Filter Bubble Ruining Your Online Experience?". Huffington Post. Archived from the original on April 13, 2016. Retrieved August 15, 2011. a filter bubble is the figurative sphere surrounding you as you search the Internet.
  20. Pariser, Eli (May 12, 2011). The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Penguin. ISBN   9781101515129. Archived from the original on January 19, 2021. Retrieved October 11, 2020.
  21. "How Filter Bubbles Distort Reality: Everything You Need to Know". July 31, 2017. Archived from the original on July 3, 2019. Retrieved June 23, 2019.
  22. Nikolov, Dimitar; Oliveira, Diego F.M.; Flammini, Alessandro; Menczer, Filippo (December 2, 2015). "Measuring online social bubbles". PeerJ Computer Science. 1: e38. arXiv: 1502.07162 . Bibcode:2015arXiv150207162N. doi: 10.7717/peerj-cs.38 .
  23. Pariser, Eli (March 2011). "Beware online 'filter bubbles'". Archived from the original on May 28, 2018. Retrieved May 30, 2018.
  24. Pariser, Eli (March 2011). "Beware online 'filter bubbles'". TED.com . Archived from the original on September 22, 2017. Retrieved September 24, 2017.
  25. "First Monday: What's on tap this month on TV and in movies and books: The Filter Bubble by Eli Pariser". USA Today. 2011. Archived from the original on May 3, 2011. Retrieved April 20, 2011. Pariser explains that feeding us only what is familiar and comfortable to us closes us off to new ideas, subjects and important information.
  26. Bosker, Bianca (March 7, 2011). "Facebook, Google Giving Us Information Junk Food, Eli Pariser Warns". Huffington Post. Archived from the original on March 13, 2011. Retrieved April 20, 2011. When it comes to content, Google and Facebook are offering us too much candy, and not enough carrots.
  27. "Invisible sieve: Hidden, specially for you". The Economist. June 30, 2011. Archived from the original on July 3, 2011. Retrieved June 27, 2011. Mr Pariser's book provides a survey of the internet's evolution towards personalisation, examines how presenting information alters the way in which it is perceived and concludes with prescriptions for bursting the filter bubble that surrounds each user.
  28. Hern, Alex (May 22, 2017). "How social media filter bubbles and algorithms influence the election". The Guardian. Archived from the original on May 31, 2018. Retrieved May 30, 2018.
  29. Van Alstyne, Marshall; Brynjolfsson, Erik (March 1997) [Copyright 1996]. "Electronic Communities: Global Village or Cyberbalkans?" (PDF). Archived (PDF) from the original on April 5, 2016. Retrieved September 24, 2017.
  30. Van Alstyne, Marshall; Brynjolfsson, Erik (November 1996). "Could the Internet Balkanize Science?". Science. 274 (5292): 1479–1480. Bibcode:1996Sci...274.1479V. doi:10.1126/science.274.5292.1479. S2CID   62546078.
  31. Pham, Alex; Healey, Jon (September 24, 2005). "Systems hope to tell you what you'd like: 'Preference engines' guide users through the flood of content". Chicago Tribune. Archived from the original on December 8, 2015. Retrieved December 4, 2015. ...if recommenders were perfect, I can have the option of talking to only people who are just like me....Cyber-balkanization, as Brynjolfsson coined the scenario, is not an inevitable effect of recommendation tools.
  32. Menkedick, Sarah (May 14, 2020). "Why are American kids treated as a different species from adults?". Aeon. Archived from the original on May 15, 2020. Retrieved May 15, 2020.
  33. Obama, Barack (January 10, 2017). President Obama's Farewell Address (Speech). Washington, D.C. Archived from the original on January 24, 2017. Retrieved January 24, 2017.
  34. Hosanagar, Kartik (November 25, 2016). "Blame the Echo Chamber on Facebook. But Blame Yourself, Too". Wired. Archived from the original on September 25, 2017. Retrieved September 24, 2017.
  35. DiFonzo, Nicholas (April 21, 2011). "The Echo-Chamber Effect". The New York Times . Archived from the original on June 13, 2017. Retrieved September 24, 2017.
  36. sdf (June 23, 2004). "John Gorenfeld, Moon the Messiah, and the Media Echo Chamber". Daily Kos . Archived from the original on May 2, 2016. Retrieved September 24, 2017.
  37. Jamieson, Kathleen Hall; Cappella, Joseph N. (July 22, 2008). Echo Chamber: Rush Limbaugh and the Conservative Media Establishment . Oxford University Press. ISBN   978-0-19-536682-2 . Retrieved September 24, 2017.
  38. 1 2 "What are Filter Bubbles and Digital Echo Chambers? | Heinrich-Böll-Stiftung | Tel Aviv - Israel". Heinrich-Böll-Stiftung. Retrieved March 8, 2023.
  39. Cinelli, Matteo; De Francisci Morales, Gianmarco; Galeazzi, Alessandro; Quattrociocchi, Walter; Starnini, Michele (March 2, 2021). "The echo chamber effect on social media". Proceedings of the National Academy of Sciences. 118 (9): e2023301118. Bibcode:2021PNAS..11823301C. doi: 10.1073/pnas.2023301118 . ISSN   0027-8424. PMC   7936330 . PMID   33622786.
  40. Colleoni, Elanor; Rozza, Alessandro; Arvidsson, Adam (April 2014). "Echo Chamber or Public Sphere? Predicting Political Orientation and Measuring Political Homophily in Twitter Using Big Data". Journal of Communication. 64 (2): 317–332. doi:10.1111/jcom.12084. hdl: 10281/66011 .
  41. Ekström, Axel G.; Niehorster, Diederick C.; Olsson, Erik J. (August 1, 2022). "Self-imposed filter bubbles: Selective attention and exposure in online search". Computers in Human Behavior Reports. 7: 100226. doi: 10.1016/j.chbr.2022.100226 . ISSN   2451-9588. S2CID   251434172.
  42. Reviglio, Urbano; Agosti, Claudio (April 2020). "Thinking Outside the Black-Box: The Case for "Algorithmic Sovereignty" in Social Media". Social Media + Society. 6 (2): 205630512091561. doi:10.1177/2056305120915613. hdl: 2434/840214 . ISSN   2056-3051. S2CID   219019544.
  43. "Twitter's plan to cut off free data access evokes 'fair amount of panic' among scientists". www.science.org. Retrieved March 8, 2023.
  44. Grankvist, Per (February 8, 2018). The Big Bubble: How Technology Makes It Harder to Understand the World. United Stories Publishing. p. 179. ISBN   978-91-639-5990-5.
  45. Hosanagar, Kartik; Fleder, Daniel; Lee, Dokyun; Buja, Andreas (December 2013). "Will the Global Village Fracture into Tribes: Recommender Systems and their Effects on Consumers". Management Science, Forthcoming. SSRN   1321962.
  46. Ludwig, Amber. "Google Personalization on Your Search Results Plus How to Turn it Off". NGNG. Archived from the original on August 17, 2011. Retrieved August 15, 2011. Google customizing search results is an automatic feature, but you can shut this feature off.
  47. Bruns, Axel (November 29, 2019). "Filter bubble". Internet Policy Review. 8 (4). doi: 10.14763/2019.4.1426 . hdl: 10419/214088 .
  48. Davies, Huw C (September 2018). "Redefining Filter Bubbles as (Escapable) Socio-Technical Recursion". Sociological Research Online. 23 (3): 637–654. doi:10.1177/1360780418763824. S2CID   149367030. Archived from the original on January 19, 2021. Retrieved August 29, 2020.
  49. Dahlgren, Peter M. (January 29, 2021). "A critical review of filter bubbles and a comparison with selective exposure". Nordicom Review. 42 (1): 15–33. doi: 10.2478/nor-2021-0002 .
  50. Flaxman, Seth; Goel, Sharad; Rao, Justin M. (2016). "Filter Bubbles, Echo Chambers, and Online News Consumption". Public Opinion Quarterly. 80 (S1): 298–320. doi:10.1093/poq/nfw006. S2CID   2386849 .
  51. Chitra, Uthsav; Musco, Christopher (2020). "Analyzing the Impact of Filter Bubbles on Social Network Polarization". WSDM '20: Proceedings of the 13th International Conference on Web Search and Data Mining. pp. 115–123. doi:10.1145/3336191.3371825.
  52. 1 2 "5 Questions with Eli Pariser, Author of 'The Filter Bubble'". Time. May 16, 2011. Archived from the original on April 14, 2017. Retrieved May 24, 2017.
  53. 1 2 3 4 5 Bleiberg, Joshua; West, Darrell M. (May 24, 2017). "Political polarization on Facebook". Brookings Institution. Archived from the original on October 10, 2017. Retrieved May 24, 2017.
  54. Bakshy, E.; Messing, S.; Adamic, L. A. (June 5, 2015). "Exposure to ideologically diverse news and opinion on Facebook". Science. 348 (6239): 1130–1132. Bibcode:2015Sci...348.1130B. doi: 10.1126/science.aaa1160 . PMID   25953820. S2CID   206632821.
  55. Lumb (May 8, 2015). "Why Scientists Are Upset About The Facebook Filter Bubble Study". Archived from the original on November 11, 2017. Retrieved November 10, 2017.
  56. Oremus, Will (April 5, 2017). "The Filter Bubble Revisited". Slate Magazine. Archived from the original on February 6, 2020. Retrieved March 2, 2020.
  57. Sindermann, Cornelia; Elhai, Jon D.; Moshagen, Morten; Montag, Christian (January 2020). "Age, gender, personality, ideological attitudes and individual differences in a person's news spectrum: how many and who might be prone to 'filter bubbles' and 'echo chambers' online?". Heliyon. 6 (1): e03214. Bibcode:2020Heliy...603214S. doi: 10.1016/j.heliyon.2020.e03214 . PMC   7002846 . PMID   32051860.
  58. Pariser, Eli (May 7, 2015). "Fun facts from the new Facebook filter bubble study". Medium. Archived from the original on November 11, 2017. Retrieved October 24, 2017.
  59. Lumb, David (May 8, 2015). "Why Scientists Are Upset About The Facebook Filter Bubble Study". Fast Company. Archived from the original on October 23, 2017. Retrieved October 24, 2017.
  60. Pariser, Eli (May 7, 2015). "Did Facebook's Big Study Kill My Filter Bubble Thesis?". Wired. Archived from the original on November 11, 2017. Retrieved October 24, 2017.
  61. "Contrary to what you've heard, Facebook can help puncture our political "bubbles"". Vox. Archived from the original on June 13, 2018. Retrieved May 30, 2018.
  62. Bakshy, E.; Messing, S.; Adamic, L. A. (2015). "Exposure to ideologically diverse news and opinion on Facebook". Science. 348 (6239): 1130–1132. Bibcode:2015Sci...348.1130B. doi: 10.1126/science.aaa1160 . PMID   25953820. S2CID   206632821.
  63. Barberá, Pabló (August 2015). "How Social Media Reduces Mass Political Polarization. Evidence from Germany, Spain, and the U.S." CiteSeerX   10.1.1.658.5476 .
  64. Hilbert, M.; Ahmed, S.; Cho, J.; Liu, B.; Luu, J. (2018). "Communicating with Algorithms: A Transfer Entropy Analysis of Emotions-based Escapes from Online Echo Chambers". Communication Methods and Measures. 12 (4): 260–275. doi:10.1080/19312458.2018.1479843. Archived 2021-01-19 at the Wayback Machine; https://www.martinhilbert.net/communicating-with-algorithms/ Archived 2019-05-09 at the Wayback Machine.
  65. Bail, Christopher; Argyle, Lisa; Brown, Taylor; Chen, Haohan; Hunzaker, M.B.F.; Lee, Jaemin (2018). "Exposure to opposing views on social media can increase political polarization" (PDF). Proceedings of the National Academy of Sciences. 115 (37): 9216–9221. Bibcode:2018PNAS..115.9216B. doi: 10.1073/pnas.1804840115 . PMC   6140520 . PMID   30154168. Archived (PDF) from the original on April 10, 2020. Retrieved April 22, 2020.
  66. Min, Yong; Jiang, Tingjun; Jin, Cheng; Li, Qu; Jin, Xiaogang (2019). "Endogenetic structure of filter bubble in social networks". Royal Society Open Science. 6 (11): 190868. arXiv: 1907.02703 . Bibcode:2019RSOS....690868M. doi:10.1098/rsos.190868. PMC   6894573 . PMID   31827834.
  67. Statt, Nick (December 4, 2018). "Google personalizes search results even when you're logged out, new study claims". The Verge. Archived from the original on July 31, 2020. Retrieved April 22, 2020.
  68. Bucher, Taina (February 25, 2016). "The algorithmic imaginary: exploring the ordinary effects of Facebook algorithms". Information, Communication & Society. 20 – via Taylor & Francis Online.
  69. "How do we break filter bubble and design for democracy?". March 3, 2017. Archived from the original on March 3, 2017. Retrieved March 3, 2017.
  70. ""Filterblase" ist das Wort des Jahres 2016". December 7, 2016. Archived from the original on December 20, 2016. Retrieved December 27, 2016.
  71. Pariser, Eli (May 2011). The Filter Bubble: What the Internet Is Hiding from You. New York: Penguin Press. p. 17. ISBN   978-1-59420-300-8.
  72. Baron, Stephen; Field, John; Schuller, Tom (November 30, 2000). "Social capital: A review and critique". Social Capital: Critical Perspectives. Oxford University Press. ISBN   9780199243679.
  73. "Are we stuck in filter bubbles? Here are five potential paths out". Nieman Lab. Archived from the original on March 4, 2017. Retrieved March 3, 2017.
  74. Glushko, Chris (February 8, 2017). "Pop the Personalization Filter Bubbles and Preserve Online Diversity". Marketing Land. Archived from the original on March 15, 2017. Retrieved May 22, 2017.
  75. Ritholtz, Barry (February 2, 2017). "Try Breaking Your Media Filter Bubble". Bloomberg. Archived from the original on August 21, 2017. Retrieved May 22, 2017.
  76. "A news app aims to burst filter bubbles by nudging readers toward a more "balanced" media diet". Nieman Lab. Archived from the original on May 15, 2017. Retrieved May 24, 2017.
  77. Ciobanu, Mădălina (March 3, 2017). "NZZ is developing an app that gives readers personalised news without creating a filter bubble: The app uses machine learning to give readers a stream of 25 stories they might be interested in based on their preferences, but 'always including an element of surprise'". Journalism.co.uk. Archived from the original on March 3, 2017. Retrieved March 3, 2017. ... if, based on their consumption history, someone has not expressed an interest in sports, their stream will include news about big, important stories related to sports,...
  78. Albeanu, Catalina (November 17, 2016). "Bursting the filter bubble after the US election: Is the media doomed to fail? At an event in Brussels this week, media and politicians discussed echo chambers on social media and the fight against fake news". Journalism.co.uk. Archived from the original on March 10, 2017. Retrieved March 3, 2017. ... EU referendum in the UK on a panel at the "Politicians in a communication storm" event... On top of the filter bubble, partisan Facebook pages also served up a diet heavy in fake news....
  79. "European Commission". Archived from the original on March 4, 2017. Retrieved March 3, 2017.
  80. Resnick, Paul; Garrett, R. Kelly; Kriplean, Travis; Munson, Sean A.; Stroud, Natalie Jomini (2013). "Bursting your (Filter) bubble". Proceedings of the 2013 conference on Computer supported cooperative work companion - CSCW '13. p. 95. doi:10.1145/2441955.2441981. ISBN   978-1-4503-1332-2. S2CID   20865375.
  81. Vanian, Jonathan (April 25, 2017). "Facebook Tests Related Articles Feature to Fight Filter Bubbles". Fortune.com. Archived from the original on September 25, 2017. Retrieved September 24, 2017.
  82. Sydell, Laura (January 25, 2017). "Facebook Tweaks its 'Trending Topics' Algorithm to Better Reflect Real News". KQED Public Media. NPR. Archived from the original on February 26, 2018. Retrieved April 5, 2018.
  83. Hao, Karen. "Google is finally admitting it has a filter-bubble problem". Quartz. Archived from the original on May 4, 2018. Retrieved May 30, 2018.
  84. "Facebook, Mozilla and Craigslist Craig fund fake news firefighter". Archived from the original on November 23, 2018. Retrieved January 14, 2019.
  85. "The Mozilla Information Trust Initiative: Building a movement to fight misinformation online". The Mozilla Blog. Archived from the original on January 14, 2019. Retrieved January 14, 2019.
  86. Bozdag, Engin; Timmerman, Job. "Values in the filter bubble: Ethics of Personalization Algorithms in Cloud Computing". ResearchGate. Archived from the original on December 14, 2020. Retrieved March 6, 2017.
  87. Al-Rodhan, Nayef. "The Many Ethical Implications of Emerging Technologies". Scientific American. Archived from the original on April 8, 2017. Retrieved March 6, 2017.
  88. Haim, Mario; Graefe, Andreas; Brosius, Hans-Bernd (March 16, 2018). "Burst of the Filter Bubble?". Digital Journalism. 6 (3): 330–343. doi: 10.1080/21670811.2017.1338145 . S2CID   168906316.
  89. "The Filter Bubble raises important issues – You just need to filter them out for yourself". Rainforest Action Network. Archived from the original on April 8, 2017. Retrieved March 6, 2017.
  90. Sterling, Greg (February 20, 2017). "Mark Zuckerberg's manifesto: How Facebook will connect the world, beat fake news and pop the filter bubble". Marketing Land. Archived from the original on March 8, 2017. Retrieved March 6, 2017.
  91. Morozov, Evgeny (June 10, 2011). "Your Own Facts". The New York Times. Archived from the original on March 4, 2017. Retrieved March 6, 2017.
  92. Hesse, Bradford W.; Nelson, David E.; Kreps, Gary L.; Croyle, Robert T.; Arora, Neeraj K.; Rimer, Barbara K.; Viswanath, Kasisomayajula (December 12, 2005). "Trust and Sources of Health Information: The Impact of the Internet and Its Implications for Health Care Providers: Findings From the First Health Information National Trends Survey". Archives of Internal Medicine. 165 (22): 2618–24. doi:10.1001/archinte.165.22.2618. PMID   16344419.
  93. El-Bermawy, Mostafa (November 18, 2016). "Your filter bubble is destroying democracy". Wired. Archived from the original on March 9, 2017. Retrieved March 6, 2017.
  94. "How to Burst the "Filter Bubble" that Protects Us from Opposing Views". MIT Technology Review. Archived from the original on January 19, 2021. Retrieved March 6, 2017.
  95. Borgesius, Frederik; Trilling, Damian; Möller, Judith; Bodó, Balázs; de Vreese, Claes; Helberger, Natali (March 31, 2016). "Should we worry about filter bubbles?". Internet Policy Review. Archived from the original on March 20, 2017. Retrieved March 6, 2017.
  96. Pariser, Eli (2011). The Filter Bubble: How the New Personalized Web is Changing What We Read and How We Think. New York: Penguin Press. ISBN   978-1-59420-300-8.
  97. "In praise of serendipity". The Economist. March 9, 2017. Archived from the original on January 15, 2019. Retrieved January 14, 2019.
  98. Reviglio, Urbano (June 2019). "Serendipity as an emerging design principle of the infosphere: challenges and opportunities". Ethics and Information Technology. 21 (2): 151–166. doi:10.1007/s10676-018-9496-y. S2CID   57426650.
  99. Harambam, Jaron; Helberger, Natali; van Hoboken, Joris (November 28, 2018). "Democratizing algorithmic news recommenders: how to materialize voice in a technologically saturated media ecosystem". Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences. 376 (2133): 20180088. Bibcode:2018RSPTA.37680088H. doi:10.1098/rsta.2018.0088. PMC   6191663 . PMID   30323002.
  100. Herrman, John (August 24, 2016). "Inside Facebook's (Totally Insane, Unintentionally Gigantic, Hyperpartisan) Political-Media Machine". The New York Times. Archived from the original on October 19, 2017. Retrieved October 24, 2017.
  101. Del Vicario, Michela; Bessi, Alessandro; Zollo, Fabiana; Petroni, Fabio; Scala, Antonio; Caldarelli, Guido; Stanley, H. Eugene; Quattrociocchi, Walter (January 19, 2016). "The spreading of misinformation online". Proceedings of the National Academy of Sciences. 113 (3): 554–559. Bibcode:2016PNAS..113..554D. doi: 10.1073/pnas.1517441113 . PMC   4725489 . PMID   26729863.
  102. Granville, Kevin (March 19, 2018). "Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens". The New York Times. Archived from the original on October 19, 2018. Retrieved October 19, 2018.
  103. Meredith, Sam (April 10, 2018). "Facebook-Cambridge Analytica: A timeline of the data hijacking scandal". CNBC . Archived from the original on October 19, 2018. Retrieved October 19, 2018.
  104. Gross, Michael (January 2017). "The dangers of a post-truth world". Current Biology. 27 (1): R1–R4. Bibcode:2017CBio...27...R1G. doi: 10.1016/j.cub.2016.12.034 .
  105. "How Filter Bubbles Distort Reality: Everything You Need to Know". Farnam Street. July 31, 2017. Archived from the original on May 20, 2019. Retrieved May 21, 2019.
  106. The Daily Dish (October 10, 2010). "The Filter Bubble". The Atlantic. Archived from the original on August 22, 2017. Retrieved May 21, 2019.
  107. 1 2 "Filter Bubbles & Confirmation Bias - Fake News (And how to fight it) - LibGuides at Miami Dade College Learning Resources". Archived from the original on October 23, 2020. Retrieved October 22, 2020.

Further reading