Data dissemination

Data dissemination is the distribution or transmission of statistical, or other, data to end users. [1] [2] Organizations can release data to the public in many ways, for example in electronic formats such as CD-ROM or PDF files based on aggregated data, or in paper publications. The most popular dissemination method today is 'non-proprietary' open systems using Internet protocols, with the data made available in common open formats.
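As a minimal sketch of dissemination in open formats, the same aggregated table can be published as both CSV and JSON using only standard tooling. The data values here are hypothetical, chosen purely for illustration.

```python
import csv
import io
import json

# Hypothetical aggregated statistic: population by region and year.
aggregated = [
    {"region": "North", "year": 2020, "population": 125000},
    {"region": "South", "year": 2020, "population": 98000},
]

# CSV: a widely supported open format for tabular data.
csv_buf = io.StringIO()
writer = csv.DictWriter(csv_buf, fieldnames=["region", "year", "population"])
writer.writeheader()
writer.writerows(aggregated)
csv_text = csv_buf.getvalue()

# JSON: an open format convenient for web-based dissemination.
json_text = json.dumps(aggregated, indent=2)

print(csv_text)
print(json_text)
```

Because both formats are openly specified, any end user can read the output without software supplied by the publishing organization.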

Some organizations choose to disseminate data using 'proprietary' databases in order to protect their sovereignty over, and copyright of, the data. Proprietary data dissemination requires a specific piece of software for end users to view the data; the data will not open in common open formats. The data is first converted into the proprietary format, and the organization provides specially designed software to its users.

Dissemination formats and standards

Under the Special Data Dissemination Standard, the formats are divided into two categories: "hardcopy" and "electronic" publications.

Some examples of hardcopy publications:

Some examples of electronic copy publications:

Standards

Standards have been developed to provide an internationally accepted methodology for the dissemination of statistical data. The International Organization for Standardization (ISO) is one such international standard-setting body, made up of representatives from various national standards organizations. The SDMX standard, which ISO has published as an international standard, is widely used around the world. SDMX stands for 'Statistical Data and Metadata Exchange'. [3] It is commonly used in national and international statistical and economic data-sharing systems. The standard covers the exchange of essential social and economic statistics, for example between European national agencies, Eurostat, and the European Central Bank. SDMX is used for the dissemination of multi-dimensional aggregated data.
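The core idea of multi-dimensional aggregated data can be sketched with a heavily simplified, schema-free fragment in the spirit of SDMX-ML: each observation is addressed by a key of dimension values (frequency, reference area, time period) plus a measured value. Real SDMX-ML uses formal namespaces and registered schemas; the element and attribute names below are illustrative only.

```python
import xml.etree.ElementTree as ET

# Illustrative fragment only, not valid SDMX-ML: a series keyed by
# dimensions (FREQ, REF_AREA), holding observations per TIME_PERIOD.
message = """
<Series FREQ="A" REF_AREA="DE">
  <Obs TIME_PERIOD="2020" OBS_VALUE="3.2"/>
  <Obs TIME_PERIOD="2021" OBS_VALUE="2.9"/>
</Series>
"""

series = ET.fromstring(message)

# The series key identifies which slice of the data cube this is.
key = {name: series.get(name) for name in ("FREQ", "REF_AREA")}

# Each observation maps the remaining dimension (time) to a value.
observations = {
    obs.get("TIME_PERIOD"): float(obs.get("OBS_VALUE"))
    for obs in series.findall("Obs")
}

print(key, observations)
```

In actual SDMX exchanges, the dimension names and allowed code values are themselves defined in machine-readable metadata (a data structure definition), so both sender and receiver interpret the key identically.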

The Data Documentation Initiative (DDI) was created by the DDI Alliance. DDI is an open metadata specification that covers the full data life cycle, from planning through to dissemination and archiving. It is most commonly used for social statistics microdata but is not limited to this subject area. Some online examples show these two standards in use alongside proprietary data dissemination.
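To illustrate the kind of variable-level documentation DDI captures for microdata, the sketch below parses a stripped-down fragment in the spirit of a DDI codebook. Real DDI documents use formal namespaces and a much richer element set; the variable and its attributes here are hypothetical.

```python
import xml.etree.ElementTree as ET

# Illustrative only: a minimal codebook fragment documenting one survey
# variable, with a human-readable label and a valid value range.
codebook = """
<codeBook>
  <dataDscr>
    <var name="AGE">
      <labl>Age of respondent in years</labl>
      <valrng><range min="18" max="99"/></valrng>
    </var>
  </dataDscr>
</codeBook>
"""

root = ET.fromstring(codebook)
var = root.find("./dataDscr/var")

label = var.find("labl").text
rng = var.find("./valrng/range")

print(var.get("name"), "-", label, "- range:", rng.get("min"), "to", rng.get("max"))
```

Metadata of this kind is what allows a dataset to be reused and archived long after collection: a downstream user can interpret the variable without access to the original survey team.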

Some examples of proprietary data dissemination online

Related Research Articles

SPSS (statistical analysis software)

SPSS Statistics is a statistical software suite developed by IBM for data management, advanced analytics, multivariate analysis, business intelligence, and criminal investigation. Long produced by SPSS Inc., it was acquired by IBM in 2009. Versions of the software released since 2015 have the brand name IBM SPSS Statistics.

Electronic publishing includes the digital publication of e-books, digital magazines, and the development of digital libraries and catalogues. It also includes the editing of books, journals, and magazines to be read on a screen.

Digital object identifier (ISO standard unique string identifier for a digital object)

A digital object identifier (DOI) is a persistent identifier or handle used to uniquely identify various objects, standardized by the International Organization for Standardization (ISO). DOIs are an implementation of the Handle System; they also fit within the URI system. They are widely used to identify academic, professional, and government information, such as journal articles, research reports, data sets, and official publications. DOIs have also been used to identify other types of information resources, like commercial videos.
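A DOI name always begins with a "10." prefix containing a registrant code, followed by a slash and a registrant-chosen suffix. The check below is a simplified syntactic approximation, not the full ISO 26324 grammar, which permits a broader character set.

```python
import re

# Simplified pattern: "10." + registrant code (digits) + "/" + suffix.
# An approximation of DOI syntax for illustration, not a full validator.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

def looks_like_doi(candidate: str) -> bool:
    """Return True if the string is plausibly a DOI name."""
    return DOI_PATTERN.match(candidate) is not None

print(looks_like_doi("10.1000/182"))  # the DOI Handbook's own DOI
print(looks_like_doi("not-a-doi"))
```

Note that a syntactically valid DOI is not necessarily registered; resolving it through the Handle System is the only way to confirm it identifies an actual object.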

A container format or metafile is a file format that allows multiple data streams to be embedded into a single file, usually along with metadata for identifying and further detailing those streams. Notable examples of container formats include archive files and formats used for multimedia playback. Among the earliest cross-platform container formats were Distinguished Encoding Rules and the 1985 Interchange File Format.

A metadata registry is a central location in an organization where metadata definitions are stored and maintained in a controlled method.

The Clinical Data Interchange Standards Consortium (CDISC) is a standards developing organization (SDO) dealing with medical research data linked with healthcare, to "enable information system interoperability to improve medical research and related areas of healthcare". The standards support medical research from protocol through analysis and reporting of results and have been shown to decrease resources needed by 60% overall and 70–90% in the start-up stages when they are implemented at the beginning of the research process.

Agricultural Information Management Standards (AIMS) is a web site managed by the Food and Agriculture Organization of the United Nations (FAO) for accessing and discussing agricultural information management standards, tools, and methodologies, connecting information workers worldwide to build a global community of practice. Information management standards, tools, and good practices can be found on AIMS.

The IMF International Financial Statistics (IFS) is a compilation of financial data collected from various sources, covering the economies of 194 countries and areas worldwide, which is published monthly by the International Monetary Fund (IMF).

Geospatial metadata is a type of metadata applicable to geographic data and information. Such objects may be stored in a geographic information system (GIS) or may simply be documents, data-sets, images or other objects, services, or related items that exist in some other native environment but whose features may be appropriate to describe in a (geographic) metadata catalog.

GESMES/TS is a data model and message format appropriate for performing standardised exchange of statistical data and related metadata. It is based on the GESMES message. Its most common use is in the exchange of official statistics.

SDMX, which stands for Statistical Data and Metadata eXchange, is an international initiative that aims at standardising and modernising ("industrialising") the mechanisms and processes for the exchange of statistical data and metadata among international organisations and their member countries.

Official statistics (statistics published by government agencies)

Official statistics are statistics published by government agencies or other public bodies such as international organizations as a public good. They provide quantitative or qualitative information on all major areas of citizens' lives, such as economic and social development, living conditions, health, education, and the environment.

A file format is a standard way that information is encoded for storage in a computer file. It specifies how bits are used to encode information in a digital storage medium. File formats may be either proprietary or free.

Metadata (data about data)

Metadata is "data that provides information about other data", but not the content of the data itself, such as the text of a message or the image itself. There are many distinct types of metadata.

DevInfo was a database system developed under the auspices of the United Nations and endorsed by the United Nations Development Group for monitoring human development, with the specific purpose of tracking the Millennium Development Goals (MDGs) through a set of human development indicators. DevInfo was a tool for organizing, storing, and presenting data in a uniform way to facilitate data sharing at the country level across government departments, UN agencies, and development partners. It was distributed royalty-free to all UN member states, and was a further development of UNICEF's earlier database system, ChildInfo.

Google Public Data Explorer (service by Google)

Google Public Data Explorer provides public data and forecasts from a range of international organizations and academic institutions including the World Bank, OECD, Eurostat, and the University of Denver. These can be displayed as line graphs, bar graphs, cross-sectional plots, or on maps. The product was launched on March 8, 2010 as an experimental visualization tool in Google Labs.

Colectica (statistics software suite)

Colectica is a suite of programs for documenting official statistics and specifying statistical surveys using open standards, used by researchers, archivists, and programmers.

The National Documentation Centre (EKT) is a Greek public organisation that promotes knowledge, research, innovation, and digital transformation. It was established in 1980 with funding from the United Nations Development Programme with the aim of strengthening the collection and distribution of research-related material and ensuring full accessibility to it. It has been designated a National Scientific Infrastructure, a National Authority of the Hellenic Statistical System, and the National Contact Point for European Research and Innovation Programmes. Since August 2019, it has been established as a discrete public-interest legal entity under private law, supervised by the Ministry of Digital Governance. The management bodies of EKT are the Administrative Board and the Director, who, since 2013, has been Dr. Evi Sachini.

Nesstar was a suite of data and metadata management software created in 2000 and maintained by the former Norwegian Social Science Data Services until its end-of-life in 2022. The Nesstar tool suite consisted of a Nesstar Repository, Nesstar WebView, a Nesstar Editor, and the Nesstar Explorer as the user interface.

References

  1. "OECD Glossary of Statistical Terms - Data dissemination Definition". stats.oecd.org. Retrieved 2021-04-09.
  2. "What is Data Dissemination | Online Learning". Retrieved 2021-04-09.
  3. "What is SDMX?". Retrieved 2021-04-09.