| Type of site | |
|---|---|
| Available in | Multiple languages |
| Owner | Wikimedia Foundation |
| Editor | Wikimedia community |
| URL | wikidata.org |
| Commercial | No |
| Registration | Optional |
| Launched | 29 October 2012 [1] |
Wikidata is a collaboratively edited multilingual knowledge graph hosted by the Wikimedia Foundation. [2] It is a common source of open data that Wikimedia projects such as Wikipedia, [3] [4] as well as anyone else, can use under the CC0 public domain license. Wikidata is a wiki powered by the MediaWiki software, including Wikibase, its extension for semi-structured data.
Wikidata is a document-oriented database focused on items, which represent any kind of topic, concept, or object. Each item is allocated a unique, persistent identifier, a positive integer prefixed with the upper-case letter Q, known as a "QID". This enables the basic information required to identify the topic that the item covers to be translated without favouring any language. (The letter Q was chosen in reference to Qamarniso Vrandečić (née Ismoilova), an Uzbek Wikimedian married to the Wikidata co-developer Denny Vrandečić. [5])
Examples of items include 1988 Summer Olympics (Q8470), love (Q316), Johnny Cash (Q42775), Elvis Presley (Q303), and Gorilla (Q36611).
Item labels do not need to be unique. For example, there are two items named "Elvis Presley": Elvis Presley (Q303), which represents the American singer and actor, and Elvis Presley (Q610926), which represents his self-titled album. However, the combination of a label and its description must be unique. To avoid ambiguity, an item's unique identifier (QID) is hence linked to this combination.
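The uniqueness rule above can be sketched in a few lines of Python. This is an illustrative model, not Wikidata's implementation: items are keyed by QID, and no two items may share the same (label, description) pair. The description strings are paraphrased for illustration.

```python
# Items keyed by QID; labels may collide, but (label, description) may not.
items = {
    "Q303":    {"label": "Elvis Presley", "description": "American singer and actor"},
    "Q610926": {"label": "Elvis Presley", "description": "self-titled studio album"},
}

def labels_descriptions_unique(items):
    """Check that every (label, description) combination appears only once."""
    seen = set()
    for item in items.values():
        key = (item["label"], item["description"])
        if key in seen:
            return False
        seen.add(key)
    return True

print(labels_descriptions_unique(items))  # True: same label, different descriptions
```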
Fundamentally, an item consists of an identifier (its QID), a label, a description, optional aliases, any number of statements, and links to pages about the topic on other Wikimedia projects.
Statements are how any information known about an item is recorded in Wikidata. Formally, they consist of key–value pairs, which match a property (such as "author", or "publication date") with one or more entity values (such as "Sir Arthur Conan Doyle" or "1902"). For example, the informal English statement "milk is white" would be encoded by a statement pairing the property color (P462) with the value white (Q23444) under the item milk (Q8495).
Statements may map a property to more than one value. For example, the "occupation" property for Marie Curie could be linked with the values "physicist" and "chemist", to reflect the fact that she engaged in both occupations. [6]
Values may take on many types including other Wikidata items, strings, numbers, or media files. Properties prescribe what types of values they may be paired with. For example, the property official website (P856) may only be paired with values of type "URL". [7]
Optionally, qualifiers can be used to refine the meaning of a statement by providing additional information. For example, a "population" statement could be modified with a qualifier such as "point in time (P585): 2011" (as its own key-value pair). Values in the statements may also be annotated with references, pointing to a source backing up the statement's content. [8] As with statements, all qualifiers and references are property–value pairs.
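Put together, a statement with a qualifier and a reference can be sketched as follows. The field names follow Wikibase's JSON model, but the structure is heavily simplified, and the reference URL is a hypothetical placeholder.

```python
# Simplified sketch of one Wikibase statement: milk (Q8495) has
# color (P462) = white (Q23444), with a qualifier and a reference.
statement = {
    "mainsnak": {
        "snaktype": "value",
        "property": "P462",  # color
        "datavalue": {"value": {"id": "Q23444"},  # white
                      "type": "wikibase-entityid"},
    },
    "qualifiers": {
        "P585": [  # point in time (qualifier shown for illustration)
            {"snaktype": "value", "property": "P585",
             "datavalue": {"value": {"time": "+2011-00-00T00:00:00Z"},
                           "type": "time"}},
        ],
    },
    "references": [
        {"snaks": {"P854": [  # reference URL (hypothetical source)
            {"snaktype": "value", "property": "P854",
             "datavalue": {"value": "https://example.org/source",
                           "type": "string"}},
        ]}},
    ],
    "rank": "normal",
}

# Qualifiers and references are themselves property–value pairs ("snaks").
print(statement["mainsnak"]["property"])  # P462
```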
Each property has a numeric identifier prefixed with a capital P and a page on Wikidata with optional label, description, aliases, and statements. As such, there are properties with the sole purpose of describing other properties, such as subproperty of (P1647).
Properties may also define more complex rules about their intended usage, termed constraints. For example, the capital (P36) property includes a "single value constraint", reflecting the reality that (typically) territories have only one capital city. Constraints are treated as testing alerts and hints, rather than inviolable rules. [9]
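A single-value constraint check of this kind can be sketched as below. This is a hedged illustration of the idea, not Wikidata's constraint-checking code, and the multi-capital sample entry uses invented QIDs.

```python
# Sample claims: each item maps a property to a list of values.
claims = {
    "Q183": {"P36": ["Q64"]},                      # Germany -> Berlin
    "Q999991": {"P36": ["Q999992", "Q999993"]},    # invented item with two capitals
}

def single_value_violations(claims, prop):
    """Return items whose statements give more than one value for prop."""
    return [qid for qid, props in claims.items()
            if len(props.get(prop, [])) > 1]

# Constraint violations are hints for editors, not hard errors.
print(single_value_violations(claims, "P36"))  # ['Q999991']
```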
Before a new property is created, it needs to undergo a discussion process. [10] [11]
The most used property is cites work (P2860), which was used more than 290,000,000 times as of November 2023. [12]
In linguistics, a lexeme is a unit of lexical meaning. Similarly, Wikidata's lexemes are items with a structure that makes them more suitable to store lexicographical data. Besides storing the language to which the lexeme refers, they have a section for forms and a section for senses. [13]
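That structure can be sketched as follows. All the identifiers and field contents here are invented for illustration; only the overall shape (a language, a lemma, forms, and senses) follows Wikidata's lexeme model.

```python
# Illustrative sketch of a lexeme: inflected forms and distinct senses
# are stored separately, each with its own sub-identifier.
lexeme = {
    "id": "L999999",       # hypothetical lexeme ID
    "language": "Q1860",   # English
    "lemma": "run",
    "forms": [
        {"id": "L999999-F1", "representation": "run"},
        {"id": "L999999-F2", "representation": "ran"},
        {"id": "L999999-F3", "representation": "running"},
    ],
    "senses": [
        {"id": "L999999-S1", "gloss": "move at a speed faster than a walk"},
        {"id": "L999999-S2", "gloss": "operate or function (of a machine)"},
    ],
}

print(len(lexeme["forms"]), len(lexeme["senses"]))  # 3 2
```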
In January 2019, development started of a new extension for MediaWiki to enable storing Shape Expressions in a separate namespace. [14] [15]
This extension has since been installed on Wikidata [16] and enables contributors to use Shape Expressions for validating and describing Resource Description Framework data in items and lexemes. Any item or lexeme on Wikidata can be validated against an Entity Schema, and this makes it an important tool for quality assurance.
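A Shape Expression of the kind an Entity Schema might contain can look like the sketch below, held here as a Python string. The shape itself is a hedged example: it requires a conforming entity to be an instance of (P31) human (Q5) and allows at most one date of birth (P569).

```python
# A small, illustrative ShEx schema for validating "human" items.
SHAPE = """
PREFIX wd:  <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

start = @<Human>

<Human> {
  wdt:P31  [ wd:Q5 ] ;      # instance of: human
  wdt:P569 xsd:dateTime ?   # date of birth (optional, at most one)
}
"""

print("wdt:P31" in SHAPE)  # True
```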
The creation of the project was funded by donations from the Allen Institute for Artificial Intelligence, the Gordon and Betty Moore Foundation, and Google, Inc., totaling €1.3 million. [17] [18] The development of the project is mainly driven by Wikimedia Deutschland under the management of Lydia Pintscher, and was originally split into three phases: [19]
Wikidata was launched on 29 October 2012 and was the first new project of the Wikimedia Foundation since 2006. [3] [20] [21] At this time, only the centralization of language links was available. This enabled items to be created and filled with basic information: a label (a name or title), aliases (alternative terms for the label), a description, and links to articles about the topic in the various language editions of Wikipedia (interlanguage links).
Historically, a Wikipedia article would include a list of interlanguage links (links to articles on the same topic in other editions of Wikipedia, if they existed). Wikidata was originally a self-contained repository of interlanguage links. [22] Wikipedia language editions were still not able to access Wikidata, so they needed to continue to maintain their own lists of interlanguage links.
On 14 January 2013, the Hungarian Wikipedia became the first to enable the provision of interlanguage links via Wikidata. [23] This functionality was extended to the Hebrew and Italian Wikipedias on 30 January, to the English Wikipedia on 13 February and to all other Wikipedias on 6 March. [24] [25] [26] [27] After no consensus was reached over a proposal to restrict the removal of language links from the English Wikipedia, [28] they were automatically removed by bots. On 23 September 2013, interlanguage links went live on Wikimedia Commons. [29]
On 4 February 2013, statements were introduced to Wikidata entries. The possible values for properties were initially limited to two data types (items and images on Wikimedia Commons), with more data types (such as coordinates and dates) to follow later. The first new type, string, was deployed on 6 March. [30]
The ability for the various language editions of Wikipedia to access data from Wikidata was rolled out progressively between 27 March and 25 April 2013. [31] [32] On 16 September 2015, Wikidata began allowing so-called arbitrary access, or access from a given article of a Wikipedia to the statements on Wikidata items not directly connected to it. For example, it became possible to read data about Germany from the Berlin article, which was not feasible before. [33] On 27 April 2016 arbitrary access was activated on Wikimedia Commons. [34]
According to a 2020 study, a large proportion of the data on Wikidata consists of entries imported en masse from other databases by Internet bots, which helps to "break down the walls" of data silos. [35]
On 7 September 2015, the Wikimedia Foundation announced the release of the Wikidata Query Service, [36] which lets users run queries on the data contained in Wikidata. [37] The service uses SPARQL as the query language. As of November 2018, there were at least 26 different tools that allowed querying the data in different ways. [38] The service uses Blazegraph as its triplestore and graph database. [39] [40]
In 2021, Wikimedia Deutschland released the Query Builder, [41] a form-based interface intended to let "people who don't know how to use SPARQL" write queries.
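A minimal SPARQL query for the Query Service can be built as below, without contacting the network. It asks for Marie Curie's occupations, the multi-valued property example discussed above; the service predefines prefixes such as wd:, wdt:, and wikibase:, so the query needs no PREFIX declarations. Sending the resulting URL an HTTP GET with an Accept header of application/sparql-results+json returns the results as JSON.

```python
from urllib.parse import urlencode

# Occupations of Marie Curie (Q7186), via occupation (P106),
# with English labels resolved by the label service.
QUERY = """
SELECT ?occupation ?occupationLabel WHERE {
  wd:Q7186 wdt:P106 ?occupation .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
"""

ENDPOINT = "https://query.wikidata.org/sparql"
url = ENDPOINT + "?" + urlencode({"query": QUERY, "format": "json"})
print(url[:60])
```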
The bars on the logo contain the word "WIKI" encoded in Morse code. [42] It was created by Arun Ganesh and selected through community decision. [43]
In November 2014, Wikidata received the Open Data Publisher Award from the Open Data Institute "for sheer scale, and built-in openness". [44]
In December 2014, Google announced that it would shut down Freebase in favor of Wikidata. [45]
As of November 2018, Wikidata information was used in 58.4% of all English Wikipedia articles, mostly for external identifiers or coordinate locations. In aggregate, data from Wikidata is shown in 64% of all Wikipedia pages, 93% of all Wikivoyage articles, 34% of all Wikiquote pages, 32% of all Wikisource pages, and 27% of Wikimedia Commons pages. It is also used, to a lesser extent, in other Wikimedia Foundation projects. [46]
As of December 2020, Wikidata's data was visualized by at least 20 external tools, [47] and over 300 papers had been published about Wikidata. [48]
Wikidata's structured dataset has been used by virtual assistants such as Apple's Siri and Amazon Alexa. [49]
A systematic literature review of the uses of Wikidata in research was carried out in 2019. [54]
Wikipedia, a free-content online encyclopedia written and maintained by a community of volunteers known as Wikipedians, began with its first edit on 15 January 2001, two days after the domain was registered. It grew out of Nupedia, a more structured free encyclopedia, as a way to allow easier and faster drafting of articles and translations.
The Resource Description Framework (RDF) is a World Wide Web Consortium (W3C) standard originally designed as a data model for metadata. It has come to be used as a general method for description and exchange of graph data. RDF provides a variety of syntax notations and data serialization formats, with Turtle currently being the most widely used notation.
Wiktionary is a multilingual, web-based project to create a free content dictionary of terms in all natural languages and in a number of artificial languages. These entries may contain definitions, images for illustration, pronunciations, etymologies, inflections, usage examples, quotations, related terms, and translations of terms into other languages, among other features. It is collaboratively edited via a wiki. Its name is a portmanteau of the words wiki and dictionary. It is available in 192 languages and in Simple English. Like its sister project Wikipedia, Wiktionary is run by the Wikimedia Foundation, and is written collaboratively by volunteers, dubbed "Wiktionarians". Its wiki software, MediaWiki, allows almost anyone with access to the website to create and edit entries.
MediaWiki is free and open-source wiki software originally developed by Magnus Manske and first used on Wikipedia on January 25, 2002; it was further improved by Lee Daniel Crocker, and its development has since been coordinated by the Wikimedia Foundation. It powers several wiki hosting websites across the Internet, as well as most websites hosted by the Foundation, including Wikipedia, Wiktionary, Wikimedia Commons, Wikiquote, Meta-Wiki and Wikidata, which define a large part of the set of requirements for the software. MediaWiki is written in the PHP programming language and stores all text content in a database. The software is optimized to efficiently handle large projects, which can have terabytes of content and hundreds of thousands of views per second. Because Wikipedia is one of the world's largest and most visited websites, achieving scalability through multiple layers of caching and database replication has been a major concern for developers. Another major aspect of MediaWiki is its internationalization; its interface is available in more than 400 languages. The software has more than 1,000 configuration settings and more than 1,800 available extensions for enabling various features to be added or changed. Besides its use on Wikimedia sites, MediaWiki has been used as a knowledge management and content management system on websites such as Fandom and wikiHow, and in major internal installations such as Intellipedia and Diplopedia.
SPARQL is an RDF query language—that is, a semantic query language for databases—able to retrieve and manipulate data stored in Resource Description Framework (RDF) format. It was made a standard by the RDF Data Access Working Group (DAWG) of the World Wide Web Consortium, and is recognized as one of the key technologies of the semantic web. On 15 January 2008, SPARQL 1.0 was acknowledged by W3C as an official recommendation; SPARQL 1.1 followed in March 2013.
A semantic wiki is a wiki that has an underlying model of the knowledge described in its pages. Regular, or syntactic, wikis have structured text and untyped hyperlinks. Semantic wikis, on the other hand, provide the ability to capture or identify information about the data within pages, and the relationships between pages, in ways that can be queried or exported like a database through semantic queries.
Heinrich Magnus Manske is a German biochemist, who is a leading researcher on malaria. He is a senior staff scientist at the Wellcome Sanger Institute in Cambridge, UK and a software developer of one of the first versions of the MediaWiki software, which powers Wikipedia and a number of other wiki-based websites.
The Wikimedia movement is the global community of contributors to the Wikimedia projects, including Wikipedia. This community directly builds and administers these projects with the commitment of achieving this using open standards and software.
The Hungarian Wikipedia is the Hungarian-language version of Wikipedia, the free encyclopedia. Started on 8 July 2003, this edition reached the 300,000-article milestone in May 2015, and its 500,000th article was created on 16 February 2022. As of 12 March 2024, it has 539,338 articles and is the 29th-largest Wikipedia edition.
Semantic MediaWiki (SMW) is an extension to MediaWiki that allows for annotating semantic data within wiki pages, thus turning a wiki that incorporates the extension into a semantic wiki. Data that has been encoded can be used in semantic searches, used for aggregation of pages, displayed in formats like maps, calendars and graphs, and exported to the outside world via formats like RDF and CSV.
DBpedia is a project aiming to extract structured content from the information created in the Wikipedia project. This structured information is made available on the World Wide Web using OpenLink Virtuoso. DBpedia allows users to semantically query relationships and properties of Wikipedia resources, including links to other related datasets.
The history of wikis began in 1994, when Ward Cunningham gave the name "WikiWikiWeb" to the knowledge base, which ran on his company's website at c2.com, and the wiki software that powered it. The wiki went public in March 1995, the date used in anniversary celebrations of the wiki's origins. c2.com is thus the first true wiki, or a website with pages and links that can be easily edited via the browser, with a reliable version history for each page. He chose "WikiWikiWeb" as the name based on his memories of the "Wiki Wiki Shuttle" at Honolulu International Airport, and because "wiki" is the Hawaiian word for "quick".
The Wikimedia Foundation, Inc., abbreviated WMF, is an American 501(c)(3) nonprofit organization headquartered in San Francisco, California, and registered there as a charitable foundation. It is best known as the host of Wikipedia, the seventh most visited website in the world. However, the foundation also hosts 14 other related content projects. It also supports the development of MediaWiki, the wiki software that underpins them all.
Wikimedia Commons, or simply Commons, is a wiki-based media repository of free-to-use images, sounds, videos and other media. It is a project of the Wikimedia Foundation.
Freebase was a large collaborative knowledge base consisting of data composed mainly by its community members. It was an online collection of structured data harvested from many sources, including individual, user-submitted wiki contributions. Freebase aimed to create a global resource that allowed people to access common information more effectively. It was developed by the American software company Metaweb and run publicly beginning in March 2007. Metaweb was acquired by Google in a private sale announced on 16 July 2010. Google's Knowledge Graph is powered in part by Freebase.
The following outline is provided as an overview of and a topical guide to Wikipedia:
translatewiki.net, formerly named Betawiki, is a web-based translation platform powered by the Translate extension for MediaWiki. It can be used to translate various kinds of texts but is commonly used for creating localisations for software interfaces.
The Volapük Wikipedia is the Volapük-language edition of the free online encyclopedia Wikipedia. It was created in February 2003, but launched in January 2004. As of 12 March 2024, it is the 108th-largest Wikipedia as measured by the number of articles, with about 36,000 articles, and the second-largest Wikipedia in a constructed language after the Esperanto Wikipedia.
Wikibase is a set of MediaWiki extensions for working with versioned semi-structured data in a central repository. It stores data as JSON instead of the unstructured wikitext normally used in MediaWiki. Its primary components are the Wikibase Repository, an extension for storing and managing data, and the Wikibase Client, which allows for the retrieval and embedding of structured data from a Wikibase repository. It was developed by Wikimedia Deutschland for use by Wikidata.
Abstract Wikipedia is an in-development project of the Wikimedia Foundation. It aims to use Wikifunctions to create a language-independent version of Wikipedia using its structured data. First conceived in 2020, Abstract Wikipedia has been under active development ever since, with the related project of Wikifunctions launched successfully in 2023. Nevertheless, the project has proved controversial. As envisioned, Abstract Wikipedia would consist of "Constructors", "Content", and "Renderers".
Since Wikidata.org went live on 30 October 2012,