Founded | 2015 |
---|---
Headquarters | Greensboro, North Carolina |
Owner | Dave M. Van Zandt [1] |
URL | mediabiasfactcheck.com
Current status | Active |
Media Bias/Fact Check (MBFC) is an American website founded in 2015 by Dave M. Van Zandt. [1] It considers four main categories and multiple subcategories in assessing the "political bias" and "factual reporting" of media outlets, [2] [3] relying on a self-described "combination of objective measures and subjective analysis". [4] [5]
It is widely used but has been criticized for its methodology. [6] Scientific studies [7] that use its ratings have found them to agree closely with an independent fact-checking dataset from 2017, [8] with NewsGuard, [9] and with BuzzFeed journalists. [10]
MBFC uses four main categories to assess the political bias and factuality of a source: (1) use of wording and headlines, (2) fact-checking and sourcing, (3) choice of stories, and (4) political affiliation. It additionally considers subcategories such as bias by omission, bias by source selection, and use of loaded language. [2] [11] A source's "Factual Reporting" is rated on a seven-point scale from "Very high" down to "Very low". [12]
Political bias ratings are U.S.-centric [11] [13] and run from "extreme-left" through "left", "left-center", "least biased", "right-center", and "right" to "extreme-right". [14] The "Pro-science" category [3] indicates sources that are "evidence based" or "legitimate science". MBFC also flags sources with warning categories such as "Conspiracy/Pseudoscience", "Questionable Sources", and "Satire". [3]
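For illustration, the two scales can be represented as a simple record. The Python sketch below is hypothetical: the class and field names are invented, and only the bias labels, warning categories, and factual-scale endpoints named above come from MBFC's published categories.

```python
from dataclasses import dataclass

# The seven U.S.-centric political bias labels, from left to right.
BIAS_LABELS = [
    "extreme-left", "left", "left-center", "least biased",
    "right-center", "right", "extreme-right",
]

# Warning categories that sit outside the left-right spectrum.
WARNING_LABELS = ["Conspiracy/Pseudoscience", "Questionable Sources", "Satire"]

@dataclass
class SourceRating:
    """Hypothetical record for one rated outlet (names are illustrative)."""
    name: str
    bias: str               # one of BIAS_LABELS, or a warning label
    factual_reporting: int  # seven-point scale: 7 = "Very high" ... 1 = "Very low"

example = SourceRating(name="Example Times", bias="left-center", factual_reporting=5)
```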
Fact checks are carried out by independent reviewers associated with the International Fact-Checking Network (IFCN) who follow the IFCN Fact-checkers' Code of Principles, developed by the Poynter Institute. [15] [11] A source may be credited with high "Factual Reporting" and still show "Political bias" in its presentation of those facts, for example through its use of emotional language. [16] [17] [18] Only failed fact checks and "confirmed cases of misinformation" from within the past five years are counted against a source. [11]
According to the methodology, an evaluation requires "a minimum of 10 headlines and 5 full news stories from each source" to be reviewed. [11]
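Two of these rules lend themselves to simple checks. The sketch below is a hypothetical illustration of the five-year window and the minimum-sample requirement; the function names are invented, and this is not MBFC's actual tooling.

```python
from datetime import date, timedelta

FIVE_YEARS = timedelta(days=5 * 365)

def countable_failed_checks(failed_check_dates: list[date], today: date) -> int:
    """Count only failed fact checks that occurred within the past five years."""
    return sum(1 for d in failed_check_dates if today - d <= FIVE_YEARS)

def has_minimum_sample(num_headlines: int, num_full_stories: int) -> bool:
    """An evaluation requires at least 10 headlines and 5 full news stories."""
    return num_headlines >= 10 and num_full_stories >= 5
```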
Media Bias/Fact Check has been used in studies of mainstream media, social media, and disinformation, [19] [8] [20] [21] among them single- and cross-platform studies of services including TikTok, 4chan, Reddit, Lemmy, Twitter, Facebook, Instagram, and Google Web Search. [22]
Scientific studies [7] using MBFC's ratings note that they show high agreement with an independent fact-checking dataset from 2017, [8] with NewsGuard, [9] and with BuzzFeed journalists. [10] When MBFC factualness ratings of "mostly factual" or higher were compared against an independent fact-checking dataset's "verified" and "suspicious" news sources, the two datasets showed "almost perfect" inter-rater reliability. [8] [20] [23] A 2022 study that compared the prevalence of misinformation in URLs shared on Twitter and Facebook during March and April of 2019 and 2020 reports that scores from Media Bias/Fact Check correlate strongly with those from NewsGuard (r = 0.81). [9]
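"Almost perfect" is the conventional description for inter-rater agreement statistics such as Cohen's kappa above 0.8, and the NewsGuard comparison reports a Pearson correlation. The sketch below shows how such agreement figures are typically computed; all rating data in it are invented for illustration.

```python
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Invented binary labels: 1 = "mostly factual" or higher / "verified",
# 0 = below "mostly factual" / "suspicious".
mbfc_labels        = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 1, 0]
independent_labels = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1, 1, 0]

kappa = cohen_kappa_score(mbfc_labels, independent_labels)
print(f"Cohen's kappa: {kappa:.2f}")  # ~0.82 here; > 0.8 is "almost perfect"

# Invented continuous credibility scores for the correlation comparison.
mbfc_scores      = [0.9, 0.8, 0.2, 0.7, 0.1, 0.95, 0.6, 0.3]
newsguard_scores = [87.5, 80.0, 25.0, 70.0, 12.5, 92.5, 57.5, 37.5]

r, p = pearsonr(mbfc_scores, newsguard_scores)
print(f"Pearson r: {r:.2f}")  # the 2022 study reports r = 0.81
```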
A comparison of five fact-checking datasets frequently used as "groundtruth lists" suggested that choosing one groundtruth list over another has little impact on the evaluation of online content. [8] [20] In some cases, MBFC has been selected because it categorizes sources using a larger range of labels than other rating services, [8] and because it offers the largest dataset covering biased and low-factuality news sources. Over a four-year span, the percentage of links that could be categorized with MBFC was found to be very consistent, and research suggests that the bias and factualness of a news source are unlikely to change over time. [8] [20]
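In such studies, MBFC typically functions as a domain-level lookup table. The sketch below illustrates how the share of categorizable links might be measured; the domains and ratings are invented.

```python
from urllib.parse import urlparse

# Hypothetical MBFC-style lookup: domain -> (bias label, factual rating).
MBFC_RATINGS = {
    "example-news.com": ("left-center", "High"),
    "partisan-daily.com": ("right", "Mixed"),
}

def coverage(shared_urls: list[str]) -> float:
    """Fraction of shared links whose domain appears in the groundtruth list."""
    domains = [urlparse(u).netloc.removeprefix("www.") for u in shared_urls]
    rated = sum(1 for d in domains if d in MBFC_RATINGS)
    return rated / len(domains) if domains else 0.0

urls = [
    "https://example-news.com/story1",
    "https://www.partisan-daily.com/opinion",
    "https://unrated-blog.net/post",
]
print(f"{coverage(urls):.0%} of links could be categorized")  # 67%
```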
The site has been used by researchers at the University of Michigan to create a tool called the "Iffy Quotient", which draws data from Media Bias/Fact Check and NewsWhip to track the prevalence of "fake news" and questionable sources on social media. [24] [25]
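The Iffy Quotient's exact formula is not reproduced here; the following sketch is a hypothetical prevalence metric in the same spirit, computing an engagement-weighted share of popular URLs that point to flagged ("iffy") domains.

```python
from urllib.parse import urlparse

# Hypothetical set of flagged ("iffy") domains, e.g. drawn from MBFC's
# "Questionable Sources" and "Conspiracy/Pseudoscience" categories.
IFFY_DOMAINS = {"partisan-daily.com", "conspiracy-hub.net"}

def iffy_share(url_engagements: list[tuple[str, int]]) -> float:
    """Engagement-weighted share of popular URLs pointing to flagged domains."""
    total = sum(count for _, count in url_engagements)
    iffy = sum(
        count for url, count in url_engagements
        if urlparse(url).netloc.removeprefix("www.") in IFFY_DOMAINS
    )
    return iffy / total if total else 0.0

# Invented (URL, engagement count) pairs, as NewsWhip-style data might look.
day_sample = [
    ("https://example-news.com/a", 1200),
    ("https://partisan-daily.com/b", 400),
    ("https://conspiracy-hub.net/c", 400),
]
print(f"Iffy share: {iffy_share(day_sample):.0%}")  # 40%
```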
A 2018 year-in-review of fact-checking from the Poynter Institute (which owns PolitiFact [26]) noted a proliferation of credibility score projects, including Media Bias/Fact Check, writing that "While these projects are, in theory, a good addition to the efforts combating misinformation, they have the potential to misfire," and that "Media Bias/Fact Check is a widely cited source for news stories and even studies about misinformation, despite the fact that its method is in no way scientific." [27] Also in 2018, a writer in the Columbia Journalism Review described Media Bias/Fact Check as "an armchair media analysis" [6] and characterized its ratings as "subjective assessments [that] leave room for human biases, or even simple inconsistencies, to creep in". [28] A study published in Scientific Reports wrote: "While [Media Bias/Fact Check's] credibility is sometimes questioned, it has been regarded as accurate enough to be used as ground-truth for e.g. media bias classifiers, fake news studies, and automatic fact-checking systems." [19]