Event Processing Technical Society

Focus: event processing
Key people: Opher Etzion, David Luckham, Dieter Gawlick, Pedro Bizarro, Adrian Paschke, Paul Vincent, Shailendra Mishra

Event Processing Technical Society (EPTS) is an inclusive group of organizations and individuals aiming to increase awareness of event processing, foster topics for future standardization, and establish event processing as a separate academic discipline.

Motivation

The goal of the EPTS is to develop a shared understanding of event processing terminology. The society believes that by communicating this shared understanding it can become a catalyst for the emergence of effective interoperation standards, foster academic research, and support the creation of training curricula, which in turn would establish event processing as a discipline in its own right. The society seeks to follow the example of database technology, where relational theory provided a theoretical foundation and the introduction of the Structured Query Language (SQL) homogenized the field. EPTS members hope that by combining academic research, vendor experience, and customer data they can develop a unified glossary, language, and architecture that will homogenize event processing in a similar way.

Organization

The EPTS is organized into several working groups.

Use Case Working Group

The Use Case WG collects and documents usage scenarios of event processing across a broad spectrum of applications in order to classify them. The group has already collected use cases from enterprise information technology management, fraud detection, business process management, health care, and stock trading, and has created a comprehensive questionnaire to capture the various facets of a use case. This data serves as input to the Architecture Working Group.

Architecture and Meta-Model Working Group

The Architecture WG is building a reference architecture for event processing. Since 2009 it has incorporated the former Meta-Model Working Group, which serves as a liaison to a number of standards bodies; members of that group typically also belong to standards organizations such as OASIS, W3C, RuleML, OMG, and DMTF (see the event processing standards reference model [1]). The first version of the EPTS reference architecture has been published. [2] [3] [4]

Language Analysis Working Group

The Language Analysis WG is collecting and organizing examples of event processing languages used in industry and research in order to identify the dimensions along which such languages vary.

Interoperability Working Group

The Interoperability WG is studying requirements for interoperability. Its goal is to arrive at a set of agreed mechanisms that allow event processing systems from different vendors to interoperate.

Glossary Working Group

The Glossary WG is developing a glossary of event processing terms. The first version of the glossary has already been published; [5] the working group was led, and its output edited, by Roy Schulte and David Luckham.

History

The society started as an informal group in 2005–2006 and was formally launched as a consortium in June 2008. Membership of the consortium is based on a formal agreement defining intellectual-property ownership terms and rules of engagement. The society is governed by a steering committee consisting of founding members, representatives of major vendors, and scientists. It is a partner of the major scientific event processing conference, Distributed Event-Based Systems (DEBS), and of the major scientific rules conference, the International Web Rule Symposium (RuleML), and it launched two Dagstuhl seminars on event processing, the first in May 2007 and the second in May 2010. The event processing community remains active, but the EPTS appears to be dormant: the society's website has been inactive since mid-2014.

References

  1. Paschke, Adrian; Vincent, Paul; Springer, Florian (2011). "Standards for Complex Event Processing and Reaction Rules". Rule-Based Modeling and Computing on the Semantic Web. Berlin, Heidelberg: Springer. pp. 128–139. doi:10.1007/978-3-642-24908-2_17. ISBN 978-3-642-24907-5. ISSN 0302-9743.
  2. Paschke, Adrian; Vincent, Paul; Alves, Alexandre; Moxey, Catherine (2012). "Tutorial on Advanced Design Patterns in Event Processing". DEBS 2012. pp. 324–334.
  3. Paschke, Adrian; Vincent, Paul; Alves, Alexandre; Moxey, Catherine (2011). "Architectural and Functional Design Patterns for Event Processing". DEBS 2011. pp. 363–364.
  4. Paschke, Adrian; Vincent, Paul; Moxey, Catherine (2010). "Event Processing Architectures". Fourth ACM International Conference on Distributed Event-Based Systems (DEBS '10). Cambridge, UK: ACM.
  5. Luckham, David; Schulte, Roy (eds.) (2008). Event Processing Glossary, Version 1.1. Event Processing Technical Society, July 2008.