Wikidata

Main page of Wikidata in November 2019
Available in: Multiple languages
Founded: 29 October 2012 [1]
Editor: Wikidata editors
URL: www.wikidata.org
Commercial: No
Registration: Optional

Wikidata is a collaboratively edited multilingual knowledge graph hosted by the Wikimedia Foundation. It is a common source of open data that Wikimedia projects such as Wikipedia, [2] [3] as well as anyone else, can use under a public-domain (CC0) license. Wikidata is powered by the software Wikibase. [4]


Concept

This diagram shows the most important terms used in Wikidata.

Wikidata is a document-oriented database, focused on items, which represent topics, concepts, or objects. Each item is allocated a unique, persistent identifier, a positive integer prefixed with the upper-case letter Q, known as a "QID". This enables the basic information required to identify the topic that the item covers to be translated without favouring any language.

Examples of items include 1988 Summer Olympics (Q8470), love (Q316), Elvis Presley (Q303), and Gorilla (Q36611).

Item labels need not be unique. For example, there are two items named "Elvis Presley": Elvis Presley (Q303) represents the American singer and actor, and Elvis Presley (Q610926) represents his self-titled album.

However, the combination of label and description must be unique. Each item is therefore tied to its unique identifier (QID), which in turn is linked to a label-description pair that resolves any ambiguity.

Items come in two types: general items and lexemes.

Main parts

A layout of the four main components of a phase-1 Wikidata page: the label, description, aliases and interlanguage links.

Fundamentally, an item consists of a label, a description, aliases, interlanguage links, and one or more statements.

These parts are presented below, from the most general to the most specific.

Statements

Three statements from Wikidata's item on the planet Mars (Q111). Values include links to other items and to Wikimedia Commons.

Statements are how any information known about an item is recorded in Wikidata. Formally, they consist of key-value pairs, which match a property (such as "author", or "publication date") with one or more entity values (such as "Sir Arthur Conan Doyle" or "1902"). For example, the informal English statement "milk is white" would be encoded by a statement pairing the property color (P462) with the value white (Q23444) under the item milk (Q8495).

Statements may map a property to more than one value. For example, the "occupation" property for Marie Curie could be linked with the values "physicist" and "chemist", to reflect the fact that she engaged in both occupations. [5]

Values may take on many types including other Wikidata items, strings, numbers, or media files. Properties prescribe what types of values they may be paired with. For example, the property official website (P856) may only be paired with values of type "URL". [6]
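Statements for a given item and property can be retrieved programmatically through the MediaWiki Action API's wbgetclaims module. The sketch below is a minimal, hedged illustration: the helper names are our own, and the sample response is trimmed by hand to the shape the API returns for item-valued statements, using the milk (Q8495) / color (P462) example from above; a live run would fetch the built URL over HTTP instead.

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def claims_url(entity: str, prop: str) -> str:
    """Build a wbgetclaims request URL for one property of one item."""
    return API + "?" + urlencode({
        "action": "wbgetclaims",
        "entity": entity,
        "property": prop,
        "format": "json",
    })

def value_item_ids(claims_json: dict, prop: str) -> list:
    """Extract the QIDs of item-valued statements for one property."""
    ids = []
    for claim in claims_json.get("claims", {}).get(prop, []):
        datavalue = claim["mainsnak"].get("datavalue", {})
        if datavalue.get("type") == "wikibase-entityid":
            ids.append(datavalue["value"]["id"])
    return ids

# Hand-trimmed sample of the response shape for milk (Q8495), color (P462);
# a live call would fetch claims_url("Q8495", "P462") instead.
sample = {
    "claims": {
        "P462": [
            {"mainsnak": {"snaktype": "value",
                          "datavalue": {"type": "wikibase-entityid",
                                        "value": {"id": "Q23444"}}}}
        ]
    }
}
print(value_item_ids(sample, "P462"))  # prints ['Q23444']
```

The extracted QID (here white, Q23444) can then be resolved to a label in any language, reflecting Wikidata's language-neutral data model.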

Property and value

Example of a simple statement consisting of one property-value pair

A property describes the data value of a statement and can be thought of as a category of data, for example color (P462) for the data value blue (Q1088) or education for a person item.

As noted above, a property paired with a value forms a statement in Wikidata; statements may additionally carry qualifiers.

The most used property is instance of (P31), which is used on more than 95,000,000 item pages. [7]

Properties have their own pages on Wikidata, and because an item's statements can reference several properties, the result is a linked data structure of pages.

Properties may also define more complex rules about their intended usage, termed constraints. For example, the capital (P36) property includes a "single value constraint", reflecting the reality that (typically) territories have only one capital city. Constraints are treated as testing alerts and hints, rather than inviolable rules. [8]

Optionally, qualifiers can refine the meaning of a statement by providing additional information about its scope. For example, the property "population" could be modified with a qualifier such as "as of 2011". Statements may also be annotated with references pointing to a source that backs up the statement's content. [9]

Lexemes

In linguistics, a lexeme is a unit of lexical meaning. Similarly, Wikidata's lexemes are items with a structure that makes them more suitable to store lexicographical data. Besides storing the language to which the lexeme refers, they have a section for forms and a section for senses. [10]

Development


The creation of the project was funded by donations from the Allen Institute for Artificial Intelligence, the Gordon and Betty Moore Foundation, and Google, Inc., totaling €1.3 million. [11] [12] The development of the project is mainly driven by Wikimedia Deutschland under the management of Lydia Pintscher, and was originally split into three phases: [13]

  1. Centralising interlanguage links – links between Wikipedia articles about the same topic in different languages.
  2. Providing a central place for infobox data for all Wikipedias.
  3. Creating and updating list articles based on data in Wikidata, and linking to other Wikimedia sister projects, including Meta-Wiki and Wikidata itself (interwiki links).

Initial rollout

A Wikipedia article's list of interlanguage links as they appeared in an edit box (left) and on the article's page (right) prior to Wikidata. Each link in these lists is to an article that requires its own list of interlanguage links to the other articles; this is the information centralized by Wikidata.

The "Edit links" link nowadays takes the reader to Wikidata to edit interlanguage and interwiki links.

Wikidata was launched on 29 October 2012 and was the first new project of the Wikimedia Foundation since 2006. [2] [14] [15] At this time, only the centralization of language links was available. This enabled items to be created and filled with basic information: a label (a name or title), aliases (alternative terms for the label), a description, and links to articles about the topic in all the various language editions of Wikipedia (interlanguage links).

Historically, a Wikipedia article would include a list of interlanguage links, being links to articles on the same topic in other editions of Wikipedia, if they existed. Initially, Wikidata was a self-contained repository of interlanguage links. Wikipedia language editions could not yet access Wikidata, so they had to continue maintaining their own lists of interlanguage links, mainly at the end of the articles' pages.[citation needed]

On 14 January 2013, the Hungarian Wikipedia became the first to enable the provision of interlanguage links via Wikidata. [16] This functionality was extended to the Hebrew and Italian Wikipedias on 30 January, to the English Wikipedia on 13 February and to all other Wikipedias on 6 March. [17] [18] [19] [20] After no consensus was reached over a proposal to restrict the removal of language links from the English Wikipedia, [21] the power to delete them from the English Wikipedia was granted to automatic editors (bots). On 23 September 2013, interlanguage links went live on Wikimedia Commons. [22]

Statements and data access

On 4 February 2013, statements were introduced to Wikidata entries. The possible values for properties were initially limited to two data types (items and images on Wikimedia Commons), with more data types (such as coordinates and dates) to follow later. The first new type, string, was deployed on 6 March. [23]

The ability for the various language editions of Wikipedia to access data from Wikidata was rolled out progressively between 27 March and 25 April 2013. [24] [25]

On 16 September 2015, Wikidata began allowing so-called arbitrary access, or access from a given Wikidata item to the properties of items not directly connected to it. For example, it became possible to read data about Germany from the Berlin article, which was not feasible before. [26] On 27 April 2016 arbitrary access was activated on Wikimedia Commons. [27]

Query service and other improvements

On 7 September 2015, the Wikimedia Foundation announced the release of the Wikidata Query Service, [28] which lets users run queries on the data contained in Wikidata. [29] The service uses SPARQL as the query language. As of November 2018, there were at least 26 different tools that allowed users to query the data in different ways. [30]

In addition, the tools in the Wiktionary sidebar now include[when?] a "Wikidata item" link to help create a new item and link pages.[citation needed] This is useful, for example, when an entry exists only in the English Wiktionary and needs to be linked to another Wikimedia project, rather than to Wiktionaries in other languages.

The Wikidata query service can be used as an open-source alternative to IMDb's Movie Keyword Analyzer (MoKA) to search for films or television series by plot keywords [31] and to find films or television series in which two actors played together. [32]

Below is a SPARQL example to search an instance of (P31) television series (Q5398426) with main subject (P921) about island (Q23442) and aviation accident (Q744913). However similar results can also be found directly on Wikipedia using category intersections if the appropriate categories exist and are allowed.

SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q5398426.
  ?item wdt:P921 wd:Q23442.
  ?item wdt:P921 wd:Q744913.
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}

Below is another SPARQL example to find an instance of (P31) television series (Q5398426) where cast member (P161) includes Daniel Dae Kim (Q299700) and Jorge Garcia (Q264914). The television series condition prevents displaying a television series episode (Q21191270) / two-part episode (Q21664088) and does not show results that are a film (Q11424).

SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q5398426.
  ?item wdt:P161 wd:Q299700.
  ?item wdt:P161 wd:Q264914.
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
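Queries like the ones above can also be run programmatically: the query service accepts a SPARQL query as an HTTP GET parameter and can return JSON. The sketch below only builds the request URL (the helper name is our own, and the live-fetch lines are shown as comments under the assumption that a standard HTTP client is used); the actual result rows would arrive under a "results"/"bindings" structure in the JSON response.

```python
from urllib.parse import urlencode

ENDPOINT = "https://query.wikidata.org/sparql"

# The two-actor query from above, with labels requested in English.
QUERY = """
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q5398426.
  ?item wdt:P161 wd:Q299700.
  ?item wdt:P161 wd:Q264914.
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
"""

def sparql_url(query: str) -> str:
    """Build a GET URL asking the query service for JSON results."""
    return ENDPOINT + "?" + urlencode({"query": query, "format": "json"})

url = sparql_url(QUERY)
# A live fetch (network required) would look roughly like:
#   import json, urllib.request
#   req = urllib.request.Request(url, headers={"User-Agent": "example/0.1"})
#   data = json.load(urllib.request.urlopen(req))
# with the matching series listed under data["results"]["bindings"].
print(url[:64])
```

Setting a descriptive User-Agent header, as sketched in the comment, follows the Wikimedia etiquette for automated clients.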

Logo

The bars on the logo contain the word "WIKI" encoded in Morse code. [33] It was created by Arun Ganesh and selected through community decision. [34]

Reception

In November 2014, Wikidata received the Open Data Publisher Award from the Open Data Institute “for sheer scale, and built-in openness”. [35]

As of November 2018, Wikidata information was used in 58.4% of all English Wikipedia articles, mostly for external identifiers or coordinate locations. In aggregate, data from Wikidata was shown in 64% of all Wikipedias' pages, 93% of all Wikivoyage articles, 34% of Wikiquote pages, 32% of Wikisource pages, and 27% of Wikimedia Commons pages. Usage in other Wikimedia Foundation projects was marginal. [36]

As of November 2018, Wikidata's data is visualized by at least 20 other external tools [37] and at least 100 papers have been published about Wikidata. [38] Its importance has been recognized by numerous cultural institutions. [39]


Related Research Articles

Interwiki links

Interwiki linking (W-link) is a facility for creating links to the many wikis on the World Wide Web. Users avoid pasting in entire URLs and instead use a shorthand similar to links within the same wiki.

The Semantic Web is an extension of the World Wide Web through standards set by the World Wide Web Consortium (W3C). The goal of the Semantic Web is to make Internet data machine-readable.

The Resource Description Framework (RDF) is a family of World Wide Web Consortium (W3C) specifications originally designed as a metadata data model. It has come to be used as a general method for conceptual description or modeling of information that is implemented in web resources, using a variety of syntax notations and data serialization formats. It is also used in knowledge management applications.

RDF Schema is a set of classes with certain properties using the RDF extensible knowledge representation data model, providing basic elements for the description of ontologies. It uses various forms of RDF vocabularies, intended to structure RDF resources. RDF and RDFS can be stored in a triplestore, from which knowledge can then be inferred using a query language such as SPARQL.

SPARQL is an RDF query language—that is, a semantic query language for databases—able to retrieve and manipulate data stored in Resource Description Framework (RDF) format. It was made a standard by the RDF Data Access Working Group (DAWG) of the World Wide Web Consortium, and is recognized as one of the key technologies of the semantic web. On 15 January 2008, SPARQL 1.0 was acknowledged by W3C as an official recommendation, and SPARQL 1.1 in March 2013.

A semantic wiki is a wiki that has an underlying model of the knowledge described in its pages. Regular, or syntactic, wikis have structured text and untyped hyperlinks. Semantic wikis, on the other hand, provide the ability to capture or identify information about the data within pages, and the relationships between pages, in ways that can be queried or exported like a database through semantic queries.

Semantic MediaWiki

Semantic MediaWiki (SMW) is an extension to MediaWiki that allows for annotating semantic data within wiki pages, thus turning a wiki that incorporates the extension into a semantic wiki. Data that has been encoded can be used in semantic searches, used for aggregation of pages, displayed in formats like maps, calendars and graphs, and exported to the outside world via formats like RDF and CSV.

Semantic integration is the process of interrelating information from diverse sources, for example calendars and to do lists, email archives, presence information, documents of all sorts, contacts, search results, and advertising and marketing relevance derived from them. In this regard, semantics focuses on the organization of and action upon information by acting as an intermediary between heterogeneous data sources, which may conflict not only by structure but also context or value.

An RDF query language is a computer language, specifically a query language for databases, able to retrieve and manipulate data stored in Resource Description Framework (RDF) format.

DBpedia

DBpedia is a project aiming to extract structured content from the information created in the Wikipedia project. This structured information is made available on the World Wide Web. DBpedia allows users to semantically query relationships and properties of Wikipedia resources, including links to other related datasets. In 2008, Tim Berners-Lee described DBpedia as one of the most famous parts of the decentralized Linked Data effort.

History of wikis

The history of wikis begins in 1994, when Ward Cunningham gave the name "WikiWikiWeb" to the knowledge base, which ran on his company's website at c2.com, and the wiki software that powered it. The "wiki went public" in March 1995—the date used in anniversary celebrations of the wiki's origins. c2.com is thus the first true wiki, or a website with pages and links that can be easily edited via the browser, with a reliable version history for each page. He chose "WikiWikiWeb" as the name based on his memories of the "Wiki Wiki Shuttle" at Honolulu International Airport, and because "wiki" is the Hawaiian word for "quick".

A triplestore or RDF store is a purpose-built database for the storage and retrieval of triples through semantic queries. A triple is a data entity composed of subject-predicate-object, like "Bob is 35" or "Bob knows Fred".

Freebase was a large collaborative knowledge base consisting of data composed mainly by its community members. It was an online collection of structured data harvested from many sources, including individual, user-submitted wiki contributions. Freebase aimed to create a global resource that allowed people to access common information more effectively. It was developed by the American software company Metaweb and ran publicly beginning in March 2007. Metaweb was acquired by Google in a private sale announced 16 July 2010. Google's Knowledge Graph was powered in part by Freebase.

XQuery is a query and functional programming language that queries and transforms collections of structured and unstructured data, usually in the form of XML, text and with vendor-specific extensions for other data formats. The language is developed by the XML Query working group of the W3C. The work is closely coordinated with the development of XSLT by the XSL Working Group; the two groups share responsibility for XPath, which is a subset of XQuery.

In computing, a graph database (GDB) is a database that uses graph structures for semantic queries with nodes, edges, and properties to represent and store data. A key concept of the system is the graph. The graph relates the data items in the store to a collection of nodes and edges, the edges representing the relationships between the nodes. The relationships allow data in the store to be linked together directly and, in many cases, retrieved with one operation. Graph databases hold the relationships between data as a priority. Querying relationships is fast because they are persistently stored in the database. Relationships can be intuitively visualized using graph databases, making them useful for heavily inter-connected data.

Cypher is a declarative graph query language that allows for expressive and efficient data querying in a property graph.

Wikibase

Wikibase is a set of MediaWiki extensions for working with versioned semi-structured data in a central repository based upon JSON instead of the unstructured data of MediaWiki wikitext. Its primary components are the Wikibase Repository, an extension for storing and managing data, and the Wikibase Client which allows for the retrieval and embedding of structured data from a wikibase repository. Wikibase was developed for and is used by Wikidata.

Semantic queries allow for queries and analytics of associative and contextual nature. Semantic queries enable the retrieval of both explicitly and implicitly derived information based on syntactic, semantic and structural information contained in data. They are designed to deliver precise results or to answer more fuzzy and wide open questions through pattern matching and digital reasoning.

Blazegraph

Blazegraph is a triplestore and graph database, which is used in the Wikidata SPARQL endpoint.

References

  1. https://blog.wikimedia.org/2013/04/25/the-wikidata-revolution/. Retrieved 14 November 2018. Quotation: "Since Wikidata.org went live on 30 October 2012".
  2. Wikidata (Archived October 30, 2012, at WebCite)
  3. "Data Revolution for Wikipedia". Wikimedia Deutschland. March 30, 2012. Archived from the original on September 11, 2012. Retrieved September 11, 2012.
  4. "Wikibase — Home".
  5. "Help:Statements".
  6. "Help:Data type".
  7. "Wikidata:Database reports/List of properties/Top100".
  8. "Help:Property constraints portal".
  9. "Help:Sources".
  10. "Wikidata - Lexicographical data documentation".
  11. Dickinson, Boonsri (March 30, 2012). "Paul Allen Invests In A Massive Project To Make Wikipedia Better". Business Insider. Retrieved September 11, 2012.
  12. Perez, Sarah (March 30, 2012). "Wikipedia's Next Big Thing: Wikidata, A Machine-Readable, User-Editable Database Funded By Google, Paul Allen And Others". TechCrunch. Archived from the original on September 11, 2012. Retrieved September 11, 2012.
  13. "Wikidata - Meta".
  14. Pintscher, Lydia (October 30, 2012). "wikidata.org is live (with some caveats)". wikidata-l (Mailing list). Retrieved November 3, 2012.
  15. Roth, Matthew (March 30, 2012). "The Wikipedia data revolution". Wikimedia Foundation. Archived from the original on September 11, 2012. Retrieved September 11, 2012.
  16. Pintscher, Lydia (14 January 2013). "First steps of Wikidata in the Hungarian Wikipedia". Wikimedia Deutschland. Retrieved 17 December 2015.
  17. Pintscher, Lydia (2013-01-30). "Wikidata coming to the next two Wikipedias". Wikimedia Deutschland. Retrieved January 31, 2013.
  18. Pintscher, Lydia (13 February 2013). "Wikidata live on the English Wikipedia". Wikimedia Deutschland. Retrieved 15 February 2013.
  19. Pintscher, Lydia (6 March 2013). "Wikidata now live on all Wikipedias". Wikimedia Deutschland. Retrieved 8 March 2013.
  20. "Wikidata ist für alle Wikipedien da" (in German). Golem.de. Retrieved 29 January 2014.
  21. "Wikipedia talk:Wikidata interwiki RFC". March 29, 2013. Retrieved March 30, 2013.
  22. Pintscher, Lydia (23 September 2013). "Wikidata is Here!". Commons:Village pump.
  23. Pintscher, Lydia. "Wikidata/Status updates/2013 03 01". Wikimedia Meta-Wiki. Wikimedia Foundation. Retrieved 3 March 2013.
  24. Pintscher, Lydia (27 March 2013). "You can have all the data!". Wikimedia Deutschland. Retrieved 28 March 2013.
  25. "Wikidata goes live worldwide". The H. 2013-04-25. Archived from the original on 1 January 2014.
  26. Pintscher, Lydia (16 September 2015). "Wikidata: Access to data from arbitrary items is here". Wikipedia:Village pump (technical). Retrieved 30 August 2016.
  27. Pintscher, Lydia (27 April 2016). "Wikidata support: arbitrary access is here". Commons:Village pump. Retrieved 30 August 2016.
  28. https://query.wikidata.org/
  29. "Announcing the release of the Wikidata Query Service".
  30. "Wikidata Query Data tools".
  31. "Most Popular Island, Plane Crash Movies and TV Shows". IMDb.
  32. "Feature Film, with Johnny Depp, Leonardo DiCaprio (Sorted by Popularity Ascending)". IMDb.
  33. commons:File talk:Wikidata-logo-en.svg#Hybrid. Retrieved 2016-10-06.
  34. https://blog.wikimedia.de/2012/07/13/und-der-gewinner-ist/
  35. "First ODI Open Data Awards presented by Sirs Tim Berners-Lee and Nigel Shadbolt". Archived from the original on 2016-03-24.
  36. "Percentage of articles making use of data from Wikidata".
  37. "Wikidata Tools - Visualize data".
  38. "Scholia - Wikidata".
  39. "International Semantic Web Conference 2018".
  40. Rob Barry / Mwnci - Deep Spreadsheets · GitLab
  41. "Public Review Issues".
  42. Wiki Explorer in the Google Play Store
  43. Krause, Volker, KDE Itinerary - A privacy by design travel assistant , retrieved 2020-11-10

Further reading