OpenXDF

Filename extension: .xdf
Uniform Type Identifier (UTI): org.openxdf.xdf
Latest release: 1.2 (June 30, 2009)
Type of format: Electroencephalography and polysomnography

The Open eXchange Data Format, or OpenXDF, is an open, XML-based standard for the digital storage and exchange of time-series physiological signals and metadata. OpenXDF primarily focuses on electroencephalography and polysomnography.

History

Neurotronics began work on OpenXDF in 2003 with the goal of providing a modern, open, and extensible file format with which clinicians and researchers can share physiological data and metadata, such as signal data, signal montages, patient demographics, and event logs.[citation needed]

Neurotronics released the first draft of the OpenXDF Specification just before the 18th meeting of the Associated Professional Sleep Societies in 2004. Neurotronics has since relinquished control of the format to the OpenXDF Consortium.

As of version 1.0, OpenXDF is 100% backward compatible with the European Data Format (EDF), the current de facto standard format for physiological data exchange.

Features

Tiered structure

OpenXDF is a tiered framework designed to allow standardized and custom specializations of the format while enforcing a common foundation that provides a high level of compatibility between unrelated systems.

Metadata

OpenXDF expands on EDF by providing standardized support for extensive patient information, display montages, annotations, and scoring information.

Unicode support

OpenXDF requires the use of an XML 1.0-compliant parser that supports UTF-8 and UTF-16.
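
For illustration only (this snippet is not taken from the specification, and the Patient element and its attribute are hypothetical), the sketch below shows a standard XML 1.0 parser accepting the same document content in either encoding:

# Both documents carry the same content; a conforming XML 1.0 parser (here,
# Python's expat-backed ElementTree) accepts either declared encoding.
import xml.etree.ElementTree as ET

utf8_doc = '<?xml version="1.0" encoding="UTF-8"?><Patient name="Ærø"/>'.encode("utf-8")
utf16_doc = '<?xml version="1.0" encoding="UTF-16"?><Patient name="Ærø"/>'.encode("utf-16")

for doc in (utf8_doc, utf16_doc):
    print(ET.fromstring(doc).get("name"))  # prints "Ærø" in both cases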

Signal configuration

OpenXDF supports fully and independently configurable data signals. Each signal specifies its byte order, whether its samples are signed, the size of its samples, and its sampling rate.
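
As a rough sketch rather than an excerpt from the specification, the following shows how a reader might decode one signal's raw sample block once those four parameters are known; the SignalConfig class and its field names are hypothetical:

# Minimal sketch: decoding raw samples for one hypothetical OpenXDF-style signal.
# The SignalConfig fields mirror the parameters named in the text; the actual
# element and attribute names used by the specification may differ.
from dataclasses import dataclass
from typing import List


@dataclass
class SignalConfig:
    byte_order: str       # "little" or "big"
    signed: bool          # whether samples are two's-complement signed
    sample_bytes: int     # size of each sample in bytes (e.g. 2 for 16-bit)
    sampling_rate: float  # samples per second


def decode_samples(raw: bytes, cfg: SignalConfig) -> List[int]:
    """Split a raw byte block into integer samples using the signal's settings."""
    if len(raw) % cfg.sample_bytes:
        raise ValueError("raw block is not a whole number of samples")
    return [
        int.from_bytes(raw[i:i + cfg.sample_bytes],
                       byteorder=cfg.byte_order,
                       signed=cfg.signed)
        for i in range(0, len(raw), cfg.sample_bytes)
    ]


# Example: four 16-bit little-endian signed samples recorded at 256 Hz.
cfg = SignalConfig(byte_order="little", signed=True, sample_bytes=2, sampling_rate=256.0)
print(decode_samples(b"\x01\x00\xff\xff\x00\x80\xff\x7f", cfg))  # [1, -1, -32768, 32767]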

Security

OpenXDF supports encryption of the XML file using Twofish in Cipher Feedback (CFB) mode with a 256-bit key derived by hashing a UTF-8-encoded password with SHA-256. In addition, OpenXDF supports integrity verification using a SHA-512 hash of the original XML file.
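
A minimal sketch of the key-derivation and integrity steps, assuming the password and XML content shown are placeholders; the Twofish-CFB encryption itself is not in Python's standard library, would come from a third-party implementation, and is omitted here:

# Key derivation and integrity digest as described above (cipher step omitted).
import hashlib


def derive_key(password: str) -> bytes:
    """256-bit Twofish key: the SHA-256 digest of the UTF-8-encoded password."""
    return hashlib.sha256(password.encode("utf-8")).digest()  # 32 bytes


def integrity_digest(xml_bytes: bytes) -> str:
    """SHA-512 hash of the original (unencrypted) XML, used to verify integrity."""
    return hashlib.sha512(xml_bytes).hexdigest()


xml = b'<?xml version="1.0" encoding="UTF-8"?><OpenXDF/>'  # stand-in document
key = derive_key("example password")
print(len(key) * 8)                # 256 (bits)
print(integrity_digest(xml)[:16])  # first hex digits of the SHA-512 digest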

See also

European Data Format
