IF-MAP

The Interface for Metadata Access Points (IF-MAP) is an open specification for a client/server protocol developed by the Trusted Computing Group (TCG) as one of the core protocols of the Trusted Network Connect (TNC) open architecture.

IF-MAP provides a common interface between the Metadata Access Point (MAP), a database server acting as a clearinghouse for information about security events and objects, and other elements of the TNC architecture.

The IF-MAP protocol defines a publish/subscribe/search mechanism with a set of identifiers and data types.
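The following Python sketch illustrates, in broad strokes, what a publish operation might look like from a client's perspective. IF-MAP requests are SOAP messages sent to the MAP server over HTTPS; the server URL, credentials, session handling, element names, and the event metadata used here are illustrative assumptions rather than normative examples taken from the specification.

```python
# Minimal sketch of an IF-MAP client publishing event metadata about an
# endpoint, assuming a MAP server reachable at MAP_URL with HTTP basic
# authentication. Namespaces and element names approximate the shape of
# IF-MAP 2.x requests; treat them as illustrative, not authoritative.
import requests

MAP_URL = "https://map.example.com:8443/ifmap"  # hypothetical MAP server endpoint
IFMAP_NS = "http://www.trustedcomputinggroup.org/2010/IFMAP/2"

publish_body = f"""<?xml version="1.0" encoding="UTF-8"?>
<env:Envelope xmlns:env="http://www.w3.org/2003/05/soap-envelope"
              xmlns:ifmap="{IFMAP_NS}">
  <env:Body>
    <!-- Publish an event attached to an endpoint identified by IP address -->
    <ifmap:publish session-id="EXAMPLE-SESSION-ID">
      <update lifetime="session">
        <ip-address value="192.0.2.10" type="IPv4"/>
        <metadata>
          <meta:event xmlns:meta="http://www.trustedcomputinggroup.org/2010/IFMAP-METADATA/2"
                      ifmap-cardinality="multiValue">
            <name>suspicious-traffic</name>
          </meta:event>
        </metadata>
      </update>
    </ifmap:publish>
  </env:Body>
</env:Envelope>"""

# Send the SOAP request over HTTPS; credentials are placeholders.
response = requests.post(
    MAP_URL,
    data=publish_body.encode("utf-8"),
    headers={"Content-Type": "application/soap+xml"},
    auth=("ifmap-client", "secret"),
    verify=True,
)
print(response.status_code, response.text[:200])
```

In practice a client would first issue a newSession operation to obtain the session-id used above, and interested clients would use the corresponding subscribe and poll operations to be notified when this metadata changes.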

History

The IF-MAP protocol was first published by the TCG on April 28, 2008. The IF-MAP specification was originally developed to support data sharing across various vendors' devices and applications for network security.[1] The specification has since been adopted for additional data-sharing use cases, including physical security.[2]

Version 2.0 of the IF-MAP specification separated the base protocol from the metadata definitions that describe how different types of information are represented. This separation was intended to let other technologies (such as cloud computing,[3] industrial control systems,[4] or smart grid) leverage their existing data models within the MAP framework.[5]

Version 2.1 of the IF-MAP specification was published on May 7, 2012. Its primary new feature is an extensible identifier space.[6]

A reference implementation, omapd, is available under the GPLv3 license on Google Code.[7]

References

  1. "NAC group expands its scope". Archived from the original on 2011-06-13. Retrieved 2010-09-13.
  2. "Technology for Securing a "Seat" at the Executive Table".
  3. "Open Cloud Consortium » IF-MAP Based Intercloud Testbed in Planning". Archived from the original on 2010-04-24. Retrieved 2010-09-13.
  4. "Error".
  5. "Trusted Computing Group eyes cloud security framework". Archived from the original on 2010-12-06. Retrieved 2010-09-13.
  6. "IF-MAP 2.1 FAQ". trustedcomputinggroup.org.
  7. omapd on Google Code.