TransducerML

TransducerML (Transducer Markup Language), or TML, is a retired Open Geospatial Consortium standard developed to describe any transducer (sensor or transmitter) in terms of a common model, characterizing not only the data itself but also, through XML-encoded metadata, the system producing that data.

The Open Geospatial Consortium (OGC), an international voluntary consensus standards organization, originated in 1994. In the OGC, more than 500 commercial, governmental, nonprofit and research organizations worldwide collaborate in a consensus process encouraging development and implementation of open standards for geospatial content and services, sensor web and Internet of Things, GIS data processing and data sharing.

A transducer is a device that converts energy from one form to another. Usually a transducer converts a signal in one form of energy to a signal in another.

In the broadest definition, a sensor is a device, module, or subsystem whose purpose is to detect events or changes in its environment and send the information to other electronics, frequently a computer processor. A sensor is always used with other electronics.

Process

TML captures when and where a sensor measurement or transmitter actuation occurs. Its system description describes not only individual data sources but also systems of components, including the specific types of components, the logical and physical relationships between the components, and the data produced or consumed by each of the components. Information captured includes manufacturer information, model numbers of specific items, serial numbers, how two devices may relate to each other both logically and physically (for example, a GPS system may provide location information for a camera, and the GPS antenna may be located a certain distance away from the camera center), and the type of data being produced by those particular devices. Time stamps for each data measurement and other identifying information are also captured, making the TML system description particularly well suited to carrying the data required for automated system discovery and data retrieval.
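As an illustration of the kind of structure such a description carries, the following Python sketch builds and reads a hypothetical TML-style system description for the camera-and-GPS example above. The element and attribute names used here (system, component, relation and their attributes) are illustrative assumptions, not the normative TML vocabulary.

```python
# A minimal sketch of a TML-style system description, using hypothetical
# element names (system, component, relation); the real TML schema differs.
import xml.etree.ElementTree as ET

DESCRIPTION = """
<system id="camera-rig">
  <component id="cam1" type="video-camera" manufacturer="ExampleCorp"
             model="EC-200" serial="A12345"/>
  <component id="gps1" type="gps-receiver" manufacturer="ExampleNav"
             model="EN-10" serial="G98765"/>
  <!-- Logical relation: the GPS provides position for the camera -->
  <relation kind="providesLocationFor" source="gps1" target="cam1"/>
  <!-- Physical relation: antenna offset from the camera centre, in metres -->
  <relation kind="offsetFrom" source="gps1" target="cam1" dx="0.75" dy="0.0" dz="1.20"/>
</system>
"""

root = ET.fromstring(DESCRIPTION)
for comp in root.findall("component"):
    # Manufacturer, model and serial information captured per component
    print(comp.get("id"), comp.get("manufacturer"), comp.get("model"), comp.get("serial"))
for rel in root.findall("relation"):
    # Logical and physical relationships between components
    print(rel.get("kind"), rel.get("source"), "->", rel.get("target"))
```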

The Global Positioning System (GPS), originally Navstar GPS, is a satellite-based radionavigation system owned by the United States government and operated by the United States Air Force. It is a global navigation satellite system that provides geolocation and time information to a GPS receiver anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites. Obstacles such as mountains and buildings block the relatively weak GPS signals.

Metadata relating to archiving, indexing and cataloguing is an integral part of TML, since a TML data stream is designed to be self-contained and self-sufficient. Any information about the system, as well as information required to later parse and process the data, is captured in the TML system description. In addition to information about the system that produced the data, precise information about the data itself is captured. Data types, data sizes, ordering and arrangement, calibration information, units of measurement, precise time-tagging of individual groups of data, information about uncertainty, coordinate reference frames (where applicable) and physical phenomena relating to the data are among the details which are captured and retained. The TML system description therefore automatically tags all fields, which can later be stored in a registry for discovery.
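The practical consequence of this self-description is that a consumer can decode the raw data using only the declarations carried in the stream. The following sketch shows the idea with a hypothetical data-product description; the field names, type vocabulary and units are invented for illustration and do not reproduce the actual TML encoding.

```python
# Sketch: using a hypothetical TML-style data-product description to decode
# a raw sample. Field names, types and units below are illustrative only.
import struct
import xml.etree.ElementTree as ET

LAYOUT = """
<dataProduct id="met-sample" byteOrder="big">
  <field name="timestamp"   type="uint32"  unit="ms since epoch"/>
  <field name="temperature" type="float32" unit="degC" uncertainty="0.5"/>
  <field name="pressure"    type="float32" unit="hPa"  uncertainty="1.0"/>
</dataProduct>
"""

# Map the declared data types onto struct format characters.
FMT = {"uint32": "I", "float32": "f"}

layout = ET.fromstring(LAYOUT)
order = ">" if layout.get("byteOrder") == "big" else "<"
names = [f.get("name") for f in layout.findall("field")]
fmt = order + "".join(FMT[f.get("type")] for f in layout.findall("field"))

raw = struct.pack(fmt, 120000, 21.5, 1013.2)   # a sample record, for illustration
print(dict(zip(names, struct.unpack(fmt, raw))))
```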

TML system description fields include descriptions of the physical system, the data system and the data product; the data itself forms the other component of a TML data stream. The physical system description includes model and serial numbers for specific transducers and components of a system, system calibration information, system capabilities, installation information, owners and operators, and other information directly applicable to searches related to general data exchange independent of operating conditions. The data system description contains information about the specific transducers and components, such as their behavior, responses to physical phenomena, sensitivity, and other operating parameters. The data product description addresses the specific data stream, such as data types, layouts, encoding, and other information necessary for the consumer of a TML data stream to interpret the stream.
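A skeleton of such a stream might be arranged as in the sketch below, with the three description parts followed by time-tagged data elements. The element names and attributes shown are assumptions made for illustration rather than the normative TML schema.

```python
# Skeleton of a hypothetical TML-like stream: the three description parts
# described above plus time-tagged data. Element names are assumptions.
import xml.etree.ElementTree as ET

STREAM = """
<tml>
  <systemDescription>
    <physicalSystem><!-- models, serial numbers, installation, owners --></physicalSystem>
    <dataSystem><!-- transducer behaviour, response, sensitivity --></dataSystem>
    <dataProduct><!-- data types, layout, encoding --></dataProduct>
  </systemDescription>
  <data clk="2006-05-04T18:13:51.0Z" ref="thermo1">21.5</data>
  <data clk="2006-05-04T18:13:52.0Z" ref="thermo1">21.6</data>
</tml>
"""

root = ET.fromstring(STREAM)
print([child.tag for child in root.find("systemDescription")])
print([(d.get("clk"), d.text) for d in root.findall("data")])
```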

Uses

Using TML metadata enables a common metadata archive to be developed, which in turn permits discovery, search and retrieval based on a common technique. Regardless of the source of the data and its native complexity, metadata about the data generation system is readily at hand and can be searched to discover specific systems of interest based on a number of criteria.
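For example, once system descriptions have been harvested into a registry, discovery reduces to matching description fields against search criteria, as in the following sketch (the registry records and field names are invented for illustration).

```python
# Sketch of discovery over a registry of harvested system descriptions,
# represented here as plain dictionaries for illustration.
registry = [
    {"system": "buoy-12", "phenomenon": "sea temperature", "model": "EC-200", "lat": 54.2, "lon": 7.9},
    {"system": "tower-3", "phenomenon": "wind speed",      "model": "WS-55",  "lat": 52.4, "lon": 9.7},
    {"system": "buoy-17", "phenomenon": "sea temperature", "model": "EC-200", "lat": 55.1, "lon": 8.3},
]

def discover(records, **criteria):
    """Return records whose fields match every given criterion."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

print(discover(registry, phenomenon="sea temperature", model="EC-200"))
```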

A key benefit of TML is that it enables temporal correlation of measurements by tying a high-resolution clock to each individual data source, and it models the logical and physical relationships between multiple transducers in a system. Data from all elements of a system are integrated into a real-time data stream, substantially reducing the time required to process and represent that data, whether metadata or the primary data itself.
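The sketch below illustrates the kind of temporal correlation this enables: given time-tagged GPS fixes and camera frames, each frame can be paired with the fix nearest in time. The data values and structures are invented for illustration; TML itself supplies only the time tags and relationships that make such correlation possible.

```python
# Sketch: correlating time-tagged measurements from two transducers by
# finding, for each camera frame, the GPS fix nearest in time (times in seconds).
import bisect

gps_fixes = [(0.00, (54.20, 7.90)), (0.50, (54.21, 7.91)), (1.00, (54.22, 7.92))]
frames    = [(0.12, "frame-001"), (0.63, "frame-002"), (0.98, "frame-003")]

gps_times = [t for t, _ in gps_fixes]

def nearest_fix(t):
    # Consider the fixes immediately before and after time t, keep the closer one.
    i = bisect.bisect_left(gps_times, t)
    candidates = gps_fixes[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda ft: abs(ft[0] - t))

for t, frame in frames:
    fix_t, pos = nearest_fix(t)
    print(frame, "at", t, "s ->", pos, "(fix at", fix_t, "s)")
```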

Another key benefit of TML is that, by bringing both data and metadata from multiple time-varying sources into a single stream in a common format, archiving, retrieval, analysis and processing of data and metadata can be performed more easily across disparate hardware and software systems. Time tagging of both the data and the metadata allows precise determination of the state of a system, and therefore of whether its data is of interest, regardless of whether that system remains static or has elements removed, replaced or added. This permits searching for data at a finer granularity than previously possible, while still supporting higher-level data discovery if a user so desires, since the use of individual fields within a TML system description is optional.
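Because metadata updates are themselves time-tagged, the description in effect at any data timestamp can be recovered afterwards, for example as in the following sketch (timestamps and configuration contents are invented for illustration).

```python
# Sketch: recovering the system description in effect at a given data time.
import bisect

# (time, description) pairs recorded whenever the system configuration changed
description_history = [
    (0.0,   {"camera": "EC-200", "gps": "EN-10"}),
    (120.0, {"camera": "EC-200", "gps": "EN-11"}),   # GPS receiver replaced
]
change_times = [t for t, _ in description_history]

def description_at(t):
    """Return the most recent description recorded at or before time t."""
    i = bisect.bisect_right(change_times, t) - 1
    return description_history[max(i, 0)][1]

print(description_at(60.0))    # configuration before the swap
print(description_at(300.0))   # configuration after the swap
```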

TML can describe data from simple stationary in-situ transducers through to high-bandwidth dynamic remote devices such as synthetic aperture radar systems.

See also

IEEE 1451 is a set of smart transducer interface standards developed by the Institute of Electrical and Electronics Engineers (IEEE) Instrumentation and Measurement Society's Sensor Technology Technical Committee, describing a set of open, common, network-independent communication interfaces for connecting transducers to microprocessors, instrumentation systems, and control/field networks. One of the key elements of these standards is the definition of Transducer Electronic Data Sheets (TEDS) for each transducer. The TEDS is a memory device attached to the transducer, which stores transducer identification, calibration, correction data, and manufacturer-related information. The goal of the IEEE 1451 family of standards is to allow access to transducer data through a common set of interfaces, whether the transducers are connected to systems or networks via wired or wireless means.

Observations and Measurements (O&M) is an international standard that defines a conceptual schema encoding for observations, and for features involved in sampling when making observations. While the O&M standard was developed in the context of geographic information systems, the model is derived from generic patterns proposed by Fowler and Odell, and is not limited to spatial information. O&M is one of the core standards in the OGC Sensor Web Enablement suite, providing the response model for the Sensor Observation Service (SOS).

Related Research Articles

Precision is a description of random errors, a measure of statistical variability.

Calibration in measurement technology and metrology is the comparison of measurement values delivered by a device under test with those of a calibration standard of known accuracy. Such a standard could be another measurement device of known accuracy, a device generating the quantity to be measured, such as a voltage or a sound tone, or a physical artefact, such as a metre ruler.

A datasheet, data sheet, or spec sheet is a document that summarizes the performance and other technical characteristics of a product, machine, component, material, subsystem or software in sufficient detail to allow a design engineer to understand the role of the component in the overall system. Typically, a datasheet is created by the manufacturer and begins with an introductory page describing the rest of the document, followed by listings of specific characteristics, with further information on the connectivity of the devices. In cases where there is relevant source code to include, it is usually attached near the end of the document or separated into another file.

A modeling language is any artificial language that can be used to express information or knowledge or systems in a structure that is defined by a consistent set of rules. The rules are used for interpretation of the meaning of components in the structure.

The Extensible Metadata Platform (XMP) is an ISO standard, originally created by Adobe Systems Inc., for the creation, processing and interchange of standardized and custom metadata for digital documents and data sets.

A metadata registry is a central location in an organization where metadata definitions are stored and maintained in a controlled manner.

A configuration management database (CMDB) is a database used by an organization to store information about hardware and software assets. This database acts as a data warehouse for the organization and also stores information regarding the relationship between its assets. The CMDB provides a means of understanding the organization's critical assets and their relationships, such as information systems, upstream sources or dependencies of assets, and the downstream targets of assets.

SensorML is an approved Open Geospatial Consortium standard. SensorML provides standard models and an XML encoding for describing sensors and measurement processes. SensorML can be used to describe a wide range of sensors, including both dynamic and stationary platforms and both in-situ and remote sensors.

SODAR, also written as sodar, is a meteorological instrument used as a wind profiler to measure the scattering of sound waves by atmospheric turbulence. SODAR systems are used to measure wind speed at various heights above the ground, and the thermodynamic structure of the lower layer of the atmosphere.

The AgMES initiative was developed by the Food and Agriculture Organization (FAO) of the United Nations and aims to encompass issues of semantic standards in the domain of agriculture with respect to description, resource discovery, interoperability and data exchange for different types of information resources.

Knowledge Discovery Metamodel (KDM) is a publicly available specification from the Object Management Group (OMG). KDM is a common intermediate representation for existing software systems and their operating environments, that defines common metadata required for deep semantic integration of Application Lifecycle Management tools. KDM was designed as the OMG's foundation for software modernization, IT portfolio management and software assurance. KDM uses OMG's Meta-Object Facility to define an XMI interchange format between tools that work with existing software as well as an abstract interface (API) for the next-generation assurance and modernization tools. KDM standardizes existing approaches to knowledge discovery in software engineering artifacts, also known as software mining.

Software mining is an application of knowledge discovery in the area of software modernization which involves understanding existing software artifacts. This process is related to the concept of reverse engineering. Usually the knowledge obtained from existing software is presented in the form of models to which specific queries can be made when necessary. An entity-relationship model is a frequent format for representing knowledge obtained from existing software. The Object Management Group (OMG) developed the Knowledge Discovery Metamodel (KDM) specification, which defines an ontology for software assets and their relationships for the purpose of performing knowledge discovery of existing code.

SDTM defines a standard structure for human clinical trial (study) data tabulations and for nonclinical study data tabulations that are to be submitted as part of a product application to a regulatory authority such as the United States Food and Drug Administration (FDA). The Submission Data Standards team of Clinical Data Interchange Standards Consortium (CDISC) defines SDTM.

Metadata is "data [information] that provides information about other data". Many distinct types of metadata exist, among these descriptive metadata, structural metadata, administrative metadata, reference metadata and statistical metadata.

IDEF5 is a software engineering method to develop and maintain usable, accurate domain ontologies. This standard is part of the IDEF family of modeling languages in the field of software engineering.

The Publishing Requirements for Industry Standard Metadata (PRISM) specification defines a set of XML metadata vocabularies for syndicating, aggregating, post-processing and multi-purposing content. PRISM provides a framework for the interchange and preservation of content and metadata, a collection of elements to describe that content, and a set of controlled vocabularies listing the values for those elements. PRISM can be XML, RDF/XML, or XMP and incorporates Dublin Core elements. PRISM can be thought of as a set of XML tags used to contain the metadata of articles and even tag article content.

The Climate and Forecast (CF) metadata conventions are conventions for the description of Earth sciences data, intended to promote the processing and sharing of data files. The metadata defined by the CF conventions are generally included in the same file as the data, thus making the file "self-describing". The conventions provide a definitive description of what the data values found in each netCDF variable represent, and of the spatial and temporal properties of the data, including information about grids, such as grid cell bounds and cell averaging methods. This enables users of files from different sources to decide which variables are comparable, and is a basis for building software applications with powerful data extraction, grid remapping, data analysis, and data visualization capabilities.

A metadata repository is a database created to store metadata. Metadata is information about the structures that contain the actual data. Metadata is often said to be "data about data", but this is misleading. Data profiles are an example of actual "data about data". Metadata is one layer of abstraction removed from this – it is data about the structures that contain data. Metadata may describe the structure of any data, of any subject, stored in any format.

The Sensor Observation Service (SOS) is a web service to query real-time sensor data and sensor data time series and is part of the Sensor Web. The offered sensor data comprises descriptions of sensors themselves, which are encoded in the Sensor Model Language (SensorML), and the measured values in the Observations and Measurements encoding format. The web service as well as both file formats are open standards and specifications of the same name defined by the Open Geospatial Consortium (OGC).