Interoperability is the ability of a product or system to work with other products or systems. [1] While the term was initially defined for information technology or systems engineering services to allow for information exchange, [2] a broader definition takes into account social, political, and organizational factors that impact system-to-system performance. [3]
Types of interoperability include syntactic interoperability, where two systems can communicate with each other, and cross-domain interoperability, where multiple organizations work together and exchange information.
If two or more systems use common data formats and communication protocols, then they are capable of communicating with each other and exhibit syntactic interoperability. XML and SQL are examples of common data formats and protocols. Low-level data formats also contribute to syntactic interoperability, for example by ensuring that alphabetical characters are stored in the same encoding (such as ASCII or Unicode) in all the communicating systems.
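As a minimal sketch (the record fields and element names here are invented for illustration), the following Python shows how agreement on a common XML format lets two independently written components exchange data:

```python
import xml.etree.ElementTree as ET

def sender_serialize(record: dict) -> str:
    """System A: emit a record in the agreed-upon XML format."""
    root = ET.Element("record")
    for key, value in record.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

def receiver_parse(xml_text: str) -> dict:
    """System B: parse the same format, knowing only the shared schema."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

message = sender_serialize({"id": "42", "name": "Ada"})
assert receiver_parse(message) == {"id": "42", "name": "Ada"}
```

Neither side needs to know how the other is implemented; only the shared format matters, which is the essence of syntactic interoperability.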
Beyond the ability of two or more computer systems to exchange information, semantic interoperability is the ability to automatically interpret the information exchanged meaningfully and accurately in order to produce useful results as defined by the end users of both systems. To achieve semantic interoperability, both sides must refer to a common information exchange reference model. The content of the information exchange requests is unambiguously defined: what is sent is the same as what is understood.
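A toy sketch of this idea in Python, with invented code names standing in for a real information exchange reference model:

```python
# Two systems use different local codes for the same concept.
# Mapping both onto a shared reference vocabulary (a stand-in for a
# real information exchange reference model) lets each side interpret
# the other's data unambiguously.
SYSTEM_A_TO_REFERENCE = {"temp_c": "temperature_celsius"}
SYSTEM_B_TO_REFERENCE = {"degC": "temperature_celsius"}

def to_reference(record: dict, mapping: dict) -> dict:
    """Translate a system's local field names into the shared vocabulary."""
    return {mapping.get(key, key): value for key, value in record.items()}

a = to_reference({"temp_c": 21.5}, SYSTEM_A_TO_REFERENCE)
b = to_reference({"degC": 21.5}, SYSTEM_B_TO_REFERENCE)
assert a == b == {"temperature_celsius": 21.5}
```

Syntactic interoperability (both sides can parse the message) is a prerequisite here; the mapping to a shared vocabulary is what makes the exchanged data mean the same thing to both systems.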
Cross-domain interoperability involves multiple social, organizational, political, and legal entities working together for a common interest or information exchange. [4]
Interoperability implies exchanges between a range of products, or similar products from several different vendors, or even between past and future revisions of the same product. Interoperability may be developed post facto, as a special measure between two products, while excluding the rest, by using open standards. When a vendor is forced to adapt its system to a dominant system that is not based on open standards, the result is compatibility, not interoperability.
Open standards rely on a broadly consultative and inclusive group, including representatives from vendors, academics, and other stakeholders, that discusses and debates the technical and economic merits, demerits, and feasibility of a proposed common protocol. After the doubts and reservations of all members are addressed, the resulting common document is endorsed as a common standard. This document may subsequently be released to the public, and henceforth becomes an open standard. It is usually published and available freely or at a nominal cost to all comers, with no further encumbrances. Various vendors and individuals (even those who were not part of the original group) can use the standards document to make products that implement the common protocol defined in the standard and are thus interoperable by design, with no specific liability or advantage for customers in choosing one product over another on the basis of standardized features. The vendors' products compete on the quality of their implementation, user interface, ease of use, performance, price, and a host of other factors, while keeping the customer's data intact and transferable even if the customer chooses to switch to a competing product for business reasons.
Post facto interoperability may be the result of the absolute market dominance of a particular product in contravention of any applicable standards, or if any effective standards were not present at the time of that product's introduction. The vendor behind that product can then choose to ignore any forthcoming standards and not co-operate in any standardization process at all, using its near-monopoly to insist that its product sets the de facto standard by its very market dominance. This is not a problem if the product's implementation is open and minimally encumbered, but it may well be both closed and heavily encumbered (e.g. by patent claims). Because of the network effect, achieving interoperability with such a product is both critical for any other vendor if it wishes to remain relevant in the market, and difficult to accomplish because of lack of cooperation on equal terms with the original vendor, who may well see the new vendor as a potential competitor and threat. The newer implementations often rely on clean-room reverse engineering in the absence of technical data to achieve interoperability. The original vendors may provide such technical data to others, often in the name of encouraging competition, but such data is invariably encumbered and may be of limited use. The availability of such data is not equivalent to an open standard.
From an e-government perspective, interoperability refers to the ability of cross-border services to work together for citizens, businesses, and public administrations. Exchanging data can be a challenge due to language barriers, different format specifications, varieties of categorizations, and other hindrances.
If data is interpreted differently, collaboration is limited, takes longer, and is inefficient. For instance, if a citizen of country A wants to purchase land in country B, the person will be asked to submit the proper address data. Address data in both countries includes full name details, street name and number, and a postal code, but the order of these details may vary. Within the same language, reordering the provided address data is not an obstacle; across language barriers it becomes difficult, and if the languages use different writing systems it is almost impossible without translation tools.
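The normalization step can be sketched as follows, with hypothetical country codes and field orders standing in for real national address conventions:

```python
# Hypothetical per-country field orders; a shared target schema lets
# address data from either country be compared field by field.
COUNTRY_FIELD_ORDER = {
    "A": ["full_name", "street_name", "street_number", "postal_code"],
    "B": ["full_name", "postal_code", "street_name", "street_number"],
}

def normalize_address(fields: list, country: str) -> dict:
    """Map an ordered list of address fields onto the common schema."""
    return dict(zip(COUNTRY_FIELD_ORDER[country], fields))

addr_a = normalize_address(["Ada Lovelace", "Main St", "12", "99999"], "A")
addr_b = normalize_address(["Ada Lovelace", "99999", "Main St", "12"], "B")
assert addr_a == addr_b  # same address, despite differing field order
```

This handles ordering only; real cross-border exchange would additionally need transliteration or translation when writing systems differ.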
Interoperability is used by researchers in the context of urban flood risk management. [5] Cities and urban areas worldwide are expanding, which creates complex spaces with many interactions between the environment, infrastructure and people. To address this complexity and manage water in urban areas appropriately, a system of systems approach to water and flood control is necessary. In this context, interoperability is important to facilitate system-of-systems thinking, and is defined as: "the ability of any water management system to redirect water and make use of other system(s) to maintain or enhance its performance function during water exceedance events." [6] By assessing the complex properties of urban infrastructure systems, particularly the interoperability between the drainage systems and other urban systems (e.g. infrastructure such as transport), it could be possible to expand the capacity of the overall system to manage flood water towards achieving improved urban flood resilience. [7]
Force interoperability is defined in NATO as the ability of the forces of two or more nations to train, exercise and operate effectively together in the execution of assigned missions and tasks. Additionally, NATO defines interoperability more generally as the ability to act together coherently, effectively and efficiently to achieve Allied tactical, operational and strategic objectives. [8]
At the strategic level, interoperability is an enabler for coalition building. It facilitates meaningful contributions by coalition partners. At this level, interoperability issues center on harmonizing world views, strategies, doctrines, and force structures. Interoperability is an element of coalition willingness to work together over the long term to achieve and maintain shared interests against common threats. Interoperability at the operational and tactical levels is where strategic interoperability and technological interoperability come together to help allies shape the environment, manage crises, and win wars. The benefits of interoperability at the operational and tactical levels generally derive from the interchangeability of force elements and units. Technological interoperability reflects the interfaces between organizations and systems. It focuses on communications and computers but also involves the technical capabilities of systems and the resulting mission compatibility between the systems and data of coalition partners. At the technological level, the benefits of interoperability come primarily from their impacts at the operational and tactical levels in terms of enhancing flexibility. [9]
Because first responders need to be able to communicate during wide-scale emergencies, interoperability is an important issue for law enforcement, fire fighting, emergency medical services, and other public health and safety departments. It has been a major area of investment and research over the last 12 years. [10] [11] Widely disparate and incompatible hardware impedes the exchange of information between agencies. [12] Agencies' information systems, such as computer-aided dispatch systems and records management systems, functioned largely in isolation, in so-called information islands. Agencies tried to bridge this isolation with inefficient, stop-gap methods while large agencies began implementing limited interoperable systems. These approaches were inadequate and, in the US, the lack of interoperability in the public safety realm became evident during the 9/11 attacks [13] on the Pentagon and World Trade Center structures. Further evidence of a lack of interoperability surfaced when agencies tackled the aftermath of Hurricane Katrina.
In contrast to the overall national picture, some states, including Utah, have already made great strides forward. The Utah Highway Patrol and other departments in Utah have created a statewide data sharing network. [14]
The Commonwealth of Virginia is one of the leading states in the United States in improving interoperability. The Interoperability Coordinator leverages a regional structure to better allocate grant funding around the Commonwealth so that all areas have an opportunity to improve communications interoperability. Virginia's strategic plan for communications is updated yearly to include new initiatives for the Commonwealth – all projects and efforts are tied to this plan, which is aligned with the National Emergency Communications Plan, authored by the Department of Homeland Security's Office of Emergency Communications.
The State of Washington seeks to enhance interoperability statewide. The State Interoperability Executive Committee [15] (SIEC), established by the legislature in 2003, works to assist emergency responder agencies (police, fire, sheriff, medical, hazmat, etc.) at all levels of government (city, county, state, tribal, federal) to define interoperability for their local region. Washington recognizes that collaborating on system design and development for wireless radio systems enables emergency responder agencies to efficiently provide additional services, increase interoperability, and reduce long-term costs. This work saves the lives of emergency personnel and the citizens they serve.
The U.S. government is making an effort to overcome the nation's lack of public safety interoperability. The Department of Homeland Security's Office for Interoperability and Compatibility (OIC) is pursuing the SAFECOM [16] and CADIP and Project 25 programs, which are designed to help agencies as they integrate their CAD and other IT systems.
The OIC launched CADIP in August 2007. This project partners the OIC with agencies in several locations, including Silicon Valley, and uses case studies to identify the best practices and challenges associated with linking CAD systems across jurisdictional boundaries. These lessons will inform the tools and resources public safety agencies can use to build interoperable CAD systems and communicate across local, state, and federal boundaries.
Governance entities can increase interoperability through their legislative and executive powers. For instance, in 2021 the European Commission, after commissioning two impact assessment studies and a technology analysis study, proposed the implementation of a standardization – for iterations of USB-C – of phone charger products, which may increase interoperability along with convergence and convenience for consumers while decreasing resource needs, redundancy and electronic waste. [17] [18] [19]
Desktop interoperability is a subset of software interoperability. Early efforts focused on integrating web applications with other web applications. Over time, open-system containers were developed to create a virtual desktop environment in which these applications could be registered and then communicate with each other using simple publish–subscribe patterns. Rudimentary UI capabilities were also supported, allowing windows to be grouped with other windows. Today, desktop interoperability has evolved into full-service platforms that include container support and message exchange between web applications, as well as native support for other application types and advanced window management. The latest interop platforms also include application services such as universal search, notifications, user permissions and preferences, third-party application connectors, and language adapters for in-house applications.
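The publish–subscribe pattern mentioned above can be sketched in a few lines of Python; this is an illustrative stand-in, not the API of any real desktop-interop container, and the topic and payload names are invented:

```python
from collections import defaultdict
from typing import Any, Callable

class DesktopBus:
    """Minimal in-process publish-subscribe bus of the kind early
    desktop-interop containers used to let registered apps communicate."""

    def __init__(self) -> None:
        self._subscribers: dict = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        for handler in self._subscribers[topic]:
            handler(payload)

bus = DesktopBus()
received = []
bus.subscribe("instrument-selected", received.append)   # "app B" listens
bus.publish("instrument-selected", {"ticker": "ACME"})  # "app A" broadcasts
assert received == [{"ticker": "ACME"}]
```

The key property is loose coupling: the publishing application needs no reference to its subscribers, only to the shared bus and an agreed topic name.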
Search interoperability refers to the ability of two or more information collections to be searched by a single query. [20]
Specifically related to web-based search, the challenge of interoperability stems from the fact designers of web resources typically have little or no need to concern themselves with exchanging information with other web resources. Federated Search technology, which does not place format requirements on the data owner, has emerged as one solution to search interoperability challenges. In addition, standards, such as Open Archives Initiative Protocol for Metadata Harvesting, Resource Description Framework, and SPARQL, have emerged that also help address the issue of search interoperability related to web resources. Such standards also address broader topics of interoperability, such as allowing data mining.
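A federated search can be sketched as fanning a single query out to independently maintained collections and merging the results; the collections and query below are invented for illustration, and real federated search additionally handles ranking and deduplication:

```python
# Each "collection" exposes its own search interface; a federated layer
# fans a single query out and merges the results, without requiring the
# collections to share a storage format.
collection_1 = ["interoperability standards", "open data formats"]
collection_2 = ["semantic interoperability", "flood management"]

def search(collection: list, query: str) -> list:
    """One collection's local search: naive substring match."""
    return [doc for doc in collection if query in doc]

def federated_search(collections: list, query: str) -> list:
    """Fan the query out to every collection and merge the hits."""
    results = []
    for collection in collections:
        results.extend(search(collection, query))
    return results

hits = federated_search([collection_1, collection_2], "interoperability")
assert hits == ["interoperability standards", "semantic interoperability"]
```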
With respect to software, the term interoperability is used to describe the capability of different programs to exchange data via a common set of exchange formats, to read and write the same file formats, and to use the same communication protocols. [a] The lack of interoperability can be a consequence of a lack of attention to standardization during the design of a program. Indeed, interoperability is not taken for granted in the non-standards-based portion of the computing world. [21]
According to ISO/IEC 2382-01, Information Technology Vocabulary, Fundamental Terms, interoperability is defined as follows: "The capability to communicate, execute programs, or transfer data among various functional units in a manner that requires the user to have little or no knowledge of the unique characteristics of those units". [22] [b]
Standards-developing organizations provide open public software specifications to facilitate interoperability; examples include the Oasis-Open organization and buildingSMART (formerly the International Alliance for Interoperability). Another example of a neutral party is the RFC documents from the Internet Engineering Task Force (IETF).
The Open Service for Lifecycle Collaboration [23] community is working on finding a common standard in order that software tools can share and exchange data e.g. bugs, tasks, requirements etc. The final goal is to agree on an open standard for interoperability of open source application lifecycle management tools. [24]
Java is an example of an interoperable programming language that allows for programs to be written once and run anywhere with a Java virtual machine. A program in Java, so long as it does not use system-specific functionality, will maintain interoperability with all systems that have a Java virtual machine available. Applications will maintain compatibility because, while the implementation is different, the underlying language interfaces are the same. [25]
Software interoperability is achieved through five interrelated approaches, each of which plays an important role in reducing variability in intercommunication software and enhancing a common understanding of the end goal to be achieved.
Interoperability tends to be regarded as an issue for experts and its implications for daily living are sometimes underrated. The European Union Microsoft competition case shows how interoperability concerns important questions of power relationships. In 2004, the European Commission found that Microsoft had abused its market power by deliberately restricting interoperability between Windows work group servers and non-Microsoft work group servers. By doing so, Microsoft was able to protect its dominant market position for work group server operating systems, the heart of corporate IT networks. Microsoft was ordered to disclose complete and accurate interface documentation, which could enable rival vendors to compete on an equal footing (the interoperability remedy).
Interoperability has also surfaced in the software patent debate in the European Parliament (June–July 2005). Critics claim that because patents on techniques required for interoperability are kept under RAND (reasonable and non-discriminatory licensing) conditions, customers will have to pay license fees twice: once for the product and, in the appropriate case, once for the patent-protected program the product uses.
Interoperability is often as much an organizational issue as a technical one. Interoperability can have a significant impact on the organizations concerned, raising issues of ownership (do people want to share their data, or are they dealing with information silos?), labor relations (are people prepared to undergo training?), and usability. In this context, a more apt definition is captured in the term business process interoperability.
Interoperability can have important economic consequences; for example, research has estimated the cost of inadequate interoperability in the US capital facilities industry to be $15.8 billion a year. [31] If competitors' products are not interoperable (due to causes such as patents, trade secrets or coordination failures), the result may well be monopoly or market failure. For this reason, it may be prudent for user communities or governments to take steps to encourage interoperability in various situations. At least 30 international bodies and countries have implemented eGovernment-based interoperability framework initiatives called e-GIF while in the US there is the NIEM initiative. [32]
The need for plug-and-play interoperability – the ability to take a medical device out of its box and easily make it work with one's other devices – has attracted great attention from both healthcare providers and industry.
Increasingly, medical devices like incubators and imaging systems feature software that integrates at the point of care and with electronic systems, such as electronic medical records. At the 2016 Regulatory Affairs Professionals Society (RAPS) meeting, experts in the field like Angela N. Johnson with GE Healthcare and Jeff Shuren of the United States Food and Drug Administration provided practical seminars on how companies developing new medical devices, and hospitals installing them, can work more effectively to align interoperable software systems. [33]
Railways have greater or lesser interoperability depending on conforming to standards of gauge, couplings, brakes, signalling, loading gauge, and structure gauge, to mention a few parameters. For passenger rail service, different railway platform height and width clearance standards may also affect interoperability.
North American freight and intercity passenger railroads are highly interoperable, but systems in Europe, Asia, Africa, Central and South America, and Australia are much less so. The parameter most difficult to overcome (at reasonable cost) is incompatibility of gauge, though variable gauge axle systems are increasingly used.
In telecommunications, the term refers to the ability of systems, units, or forces to provide services to and accept services from other systems, units, or forces, and to use the services so exchanged to operate effectively together.
In two-way radio, interoperability is composed of three interrelated dimensions.
Many organizations are dedicated to interoperability. Some concentrate on eGovernment, eBusiness or data exchange in general.
Internationally, Network Centric Operations Industry Consortium facilitates global interoperability across borders, language and technical barriers. In the built environment, the International Alliance for Interoperability started in 1994, and was renamed buildingSMART in 2005. [36]
In Europe, the European Commission and its IDABC program issue the European Interoperability Framework. IDABC was succeeded by the Interoperability Solutions for European Public Administrations (ISA) program. They also initiated the Semantic Interoperability Centre Europe (SEMIC.EU). A European Land Information Service (EULIS) [37] was established in 2006, as a consortium of European National Land Registers. The aim of the service is to establish a single portal through which customers are provided with access to information about individual properties, about land and property registration services, and about the associated legal environment. [38]
The European Interoperability Framework (EIF) considered four kinds of interoperability: legal interoperability, organizational interoperability, semantic interoperability, and technical interoperability. [39]
In the European Research Cluster on the Internet of Things (IERC) and IoT Semantic Interoperability Best Practices; four kinds of interoperability are distinguished: syntactical interoperability, technical interoperability, semantic interoperability, and organizational interoperability. [40]
In the United States, the General Services Administration Component Organization and Registration Environment (CORE.GOV) initiative provided a collaboration environment for component development, sharing, registration, and reuse in the early 2000s. [41] A related initiative is the ongoing National Information Exchange Model (NIEM) work and component repository. [42] The National Institute of Standards and Technology serves as an agency for measurement standards.
An extranet is a controlled private network that allows access to partners, vendors and suppliers or an authorized set of customers – normally to a subset of the information accessible from an organization's intranet. An extranet is similar to a DMZ in that it provides access to needed services for authorized parties, without granting access to an organization's entire network.
An open standard is a standard that is openly accessible and usable by anyone. It is also a common prerequisite that open standards use an open license that provides for extensibility. Typically, anybody can participate in their development due to their inherently open nature. There is no single definition, and interpretations vary with usage. Examples of open standards include the GSM, 4G, and 5G standards that allow most modern mobile phones to work world-wide.
Health Level Seven, abbreviated to HL7, is a range of global standards for the transfer of clinical and administrative health data between applications with the aim to improve patient outcomes and health system performance. The HL7 standards focus on the application layer, which is "layer 7" in the Open Systems Interconnection model. The standards are produced by Health Level Seven International, an international standards organization, and are adopted by other standards issuing bodies such as American National Standards Institute and International Organization for Standardization. There are a range of primary standards that are commonly used across the industry, as well as secondary standards which are less frequently adopted.
Message-oriented middleware (MOM) is software or hardware infrastructure supporting sending and receiving messages between distributed systems. Message-oriented middleware is in contrast to streaming-oriented middleware where data is communicated as a sequence of bytes with no explicit message boundaries. Note that streaming protocols are almost always built above protocols using discrete messages such as frames (Ethernet), datagrams (UDP), packets (IP), cells (ATM), et al.
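The contrast between message boundaries and a raw byte stream can be illustrated by length-prefixed framing, one common way a message-oriented layer imposes explicit boundaries on a boundary-less stream (a sketch, not any particular middleware's wire format):

```python
import struct

def frame(message: bytes) -> bytes:
    """Prefix a message with its length as a 4-byte big-endian integer."""
    return struct.pack(">I", len(message)) + message

def deframe(stream: bytes) -> list:
    """Recover the individual messages from a concatenated byte stream."""
    messages, offset = [], 0
    while offset < len(stream):
        (length,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        messages.append(stream[offset:offset + length])
        offset += length
    return messages

# Two messages concatenated look like one undifferentiated byte run to a
# streaming layer; the length prefixes restore the message boundaries.
stream = frame(b"hello") + frame(b"world")
assert deframe(stream) == [b"hello", b"world"]
```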
Enterprise application integration (EAI) is the use of software and computer systems' architectural principles to integrate a set of enterprise computer applications.
Open Platform Communications (OPC) is a series of standards and specifications for industrial telecommunication. They are based on Object Linking and Embedding (OLE) for process control. An industrial automation task force developed the original standard in 1996 under the name OLE for Process Control. OPC specifies the communication of real-time plant data between control devices from different manufacturers.
The Department of Defense Architecture Framework (DoDAF) is an architecture framework for the United States Department of Defense (DoD) that provides visualization infrastructure for specific stakeholders concerns through viewpoints organized by various views. These views are artifacts for visualizing, understanding, and assimilating the broad scope and complexities of an architecture description through tabular, structural, behavioral, ontological, pictorial, temporal, graphical, probabilistic, or alternative conceptual means. The current release is DoDAF 2.02.
Semantic interoperability is the ability of computer systems to exchange data with unambiguous, shared meaning. Semantic interoperability is a requirement to enable machine computable logic, inferencing, knowledge discovery, and data federation between information systems.
Knowledge Discovery Metamodel (KDM) is a publicly available specification from the Object Management Group (OMG). KDM is a common intermediate representation for existing software systems and their operating environments, that defines common metadata required for deep semantic integration of Application Lifecycle Management tools. KDM was designed as the OMG's foundation for software modernization, IT portfolio management and software assurance. KDM uses OMG's Meta-Object Facility to define an XMI interchange format between tools that work with existing software as well as an abstract interface (API) for the next-generation assurance and modernization tools. KDM standardizes existing approaches to knowledge discovery in software engineering artifacts, also known as software mining.
IEC 60870 part 6 in electrical engineering and power system automation, is one of the IEC 60870 set of standards which define systems used for telecontrol in electrical engineering and power system automation applications. The IEC Technical Committee 57 have developed part 6 to provide a communication profile for sending basic telecontrol messages between two systems which is compatible with ISO standards and ITU-T recommendations.
Service Component Architecture (SCA) is a software technology designed to provide a model for applications that follow service-oriented architecture principles. The technology, created by major software vendors including IBM, Oracle Corporation, and TIBCO Software, encompasses a wide range of technologies and as such is specified in independent specifications to maintain programming language and application environment neutrality. It often makes use of an enterprise service bus (ESB).
Smart-M3 is a name of an open-source software project that aims to provide a Semantic Web information sharing infrastructure between software entities and devices. It combines the ideas of distributed, networked systems and semantic web. The ultimate goal is to enable smart environments and linking of real and virtual worlds.
Federated architecture (FA) is a pattern in enterprise architecture that allows interoperability and information sharing between semi-autonomous de-centrally organized lines of business (LOBs), information technology systems and applications.
Open Automated Demand Response (OpenADR) is a research and standards development effort for energy management led by North American research labs and companies. The typical use is to send information and signals to cause electrical power-using devices to be turned off during periods of high demand.
The Open Geospatial Consortium (OGC) is an international voluntary consensus standards organization that develops and maintains international standards for geospatial content and location-based services, sensor web, Internet of Things, GIS data processing and data sharing. The OGC was incorporated as a not-for-profit in 1994; at that time, its official name was the OpenGIS Consortium. Currently, commercial, government, nonprofit, university, and research organizations participate in a consensus process encouraging development, maintenance, and implementation of open standards.
The Open Group Future Airborne Capability Environment was formed in 2010 to define an open avionics environment for all military airborne platform types. Today, it is a real-time software-focused professional group made up of industry suppliers, customers, academia, and users. The FACE approach is a government-industry software standard and business strategy for acquisition of affordable software systems that promotes innovation and rapid integration of portable capabilities across programs. The FACE Consortium provides a vendor-neutral forum for industry and government to work together to develop and consolidate the open standards, best practices, guidance documents, and business strategy necessary to achieve these goals.
Unified interoperability is the property of a system that allows for the integration of real-time and non-real time communications, activities, data, and information services and the display and coordination of those services across systems and devices. Unified interoperability provides the capability to communicate and exchange processing across different applications, data, and infrastructure.
The Banking Industry Architecture Network e.V. (BIAN) is an independent, member owned, not-for-profit association to establish and promote a common architectural framework for enabling banking interoperability. It was established in 2008.
The Physical Security Interoperability Alliance (PSIA) is a global consortium of more than 65 physical security manufacturers and systems integrators focused on promoting interoperability of IP-enabled security devices and systems across the physical security ecosystem as well as enterprise and building automation systems.
Health Level Seven International (HL7) is a non-profit ANSI-accredited standards development organization that develops standards that provide for global health data interoperability.