Trace theory

In mathematics and computer science, trace theory aims to provide a concrete mathematical underpinning for the study of concurrent computation and process calculi. That underpinning is given by an algebraic definition of the free partially commutative monoid, or trace monoid (equivalently, the history monoid), which plays a role for concurrent computation analogous to the one the free monoid plays for formal languages.
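
Concretely (the notation here follows one common convention), the trace monoid over an alphabet Σ with an independence relation I can be described as the free monoid Σ* modulo the congruence that lets independent letters commute:

```latex
\mathbb{M}(\Sigma, I) \;=\; \Sigma^{*} / \equiv_{I},
\qquad ab \equiv_{I} ba \quad \text{for all } (a, b) \in I .
```

For instance, if a and b are independent but neither is independent of c, then abc and bac denote the same element of the trace monoid, while abc and acb do not.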

The power of trace theory stems from the fact that the algebra of dependency graphs (such as Petri nets) is isomorphic to that of trace monoids, so one can apply both algebraic formal-language tools and tools from graph theory.

While the trace monoid had been studied by Pierre Cartier and Dominique Foata for its combinatorics in the 1960s, trace theory was first formulated by Antoni Mazurkiewicz in the 1970s, in an attempt to evade some of the problems in the theory of concurrent computation, including the problems of interleaving and of non-deterministic choice with regard to refinement in process calculi.

Related Research Articles

Combinatorics is an area of mathematics primarily concerned with counting, both as a means and an end in obtaining results, and certain properties of finite structures. It is closely related to many other areas of mathematics and has many applications ranging from logic to statistical physics and from evolutionary biology to computer science.

Discrete mathematics

Discrete mathematics is the study of mathematical structures that can be considered "discrete" rather than "continuous". Objects studied in discrete mathematics include integers, graphs, and statements in logic. By contrast, discrete mathematics excludes topics in "continuous mathematics" such as real numbers, calculus or Euclidean geometry. Discrete objects can often be enumerated by integers; more formally, discrete mathematics has been characterized as the branch of mathematics dealing with countable sets. However, there is no exact definition of the term "discrete mathematics".

Formal language

In logic, mathematics, computer science, and linguistics, a formal language consists of words whose letters are taken from an alphabet and are well-formed according to a specific set of rules.

Monoid

In abstract algebra, a branch of mathematics, a monoid is a set equipped with an associative binary operation and an identity element.

In theoretical computer science and formal language theory, a regular language is a formal language that can be defined by a regular expression in the strict sense used in theoretical computer science (as opposed to the extended regular expressions of many modern pattern-matching engines, which can recognize some non-regular languages).

In programming language theory, semantics is the field concerned with the rigorous mathematical study of the meaning of programming languages. It assigns meanings to the syntactically valid strings of a given programming language, making explicit the computations they describe; syntactically invalid strings receive no meaning. Semantics describes the processes a computer follows when executing a program in that language. This can be done by describing the relationship between a program's input and output, or by explaining how the program will be executed on a certain platform, thereby creating a model of computation.

In mathematics and computer science, the Krohn–Rhodes theory is an approach to the study of finite semigroups and automata that seeks to decompose them in terms of elementary components. These components correspond to finite aperiodic semigroups and finite simple groups that are combined in a feedback-free manner.

In abstract algebra, the free monoid on a set is the monoid whose elements are all the finite sequences of zero or more elements from that set, with string concatenation as the monoid operation and with the unique sequence of zero elements, often called the empty string and denoted by ε or λ, as the identity element. The free monoid on a set A is usually denoted A*. The free semigroup on A is the subsemigroup of A* containing all elements except the empty string. It is usually denoted A+.
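
As a quick sketch (using Python strings as a stand-in for the free monoid on a set of characters; this is illustrative only), concatenation and the empty string satisfy exactly the monoid laws described above:

```python
# Elements of the free monoid on an alphabet are finite sequences of letters;
# the operation is concatenation and the identity is the empty sequence.
epsilon = ""                            # identity element (empty string)
x, y, z = "ab", "c", "dd"

assert (x + y) + z == x + (y + z)       # associativity of concatenation
assert epsilon + x == x + epsilon == x  # the empty string is the identity
```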

In computer science, the process calculi are a diverse family of related approaches for formally modelling concurrent systems. Process calculi provide a tool for the high-level description of interactions, communications, and synchronizations between a collection of independent agents or processes. They also provide algebraic laws that allow process descriptions to be manipulated and analyzed, and permit formal reasoning about equivalences between processes. Leading examples of process calculi include CSP, CCS, ACP, and LOTOS. More recent additions to the family include the π-calculus, the ambient calculus, PEPA, the fusion calculus and the join-calculus.

Concurrency (computer science)

In computer science, concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out-of-order or in partial order, without affecting the final outcome. This allows for parallel execution of the concurrent units, which can significantly improve overall speed of the execution in multi-processor and multi-core systems. In more technical terms, concurrency refers to the decomposability of a program, algorithm, or problem into order-independent or partially-ordered components or units of computation.

In computer science, graph transformation, or graph rewriting, concerns the technique of creating a new graph out of an original graph algorithmically. It has numerous applications, ranging from software engineering to layout algorithms and picture generation.

In computer science, a trace is an equivalence class of strings, in which certain letters are allowed to commute while others are not. It generalizes the concept of a string by not forcing the letters to appear in one fixed order, but allowing certain reshufflings to take place. Traces were introduced by Pierre Cartier and Dominique Foata in 1969 to give a combinatorial proof of MacMahon's Master theorem. Traces are used in theories of concurrent computation, where commuting letters stand for portions of a job that can execute independently of one another, while non-commuting letters stand for locks, synchronization points or thread joins.
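
The following sketch shows one way such an equivalence can be tested in code; the function name, the encoding of the independence relation, and the use of the projection characterization (two words are equal as traces exactly when they agree on every projection onto a pair of dependent letters and use each letter equally often) are illustrative choices:

```python
from itertools import combinations

def trace_equivalent(u, v, independent):
    """Decide whether the words u and v represent the same trace.

    `independent` is a set of frozensets {a, b} of letters allowed to commute.
    The test projects both words onto every pair of *dependent* letters and
    also compares letter counts (each letter is dependent on itself).
    """
    alphabet = set(u) | set(v)
    for a, b in combinations(sorted(alphabet), 2):
        if frozenset((a, b)) in independent:
            continue  # independent letters may be freely reordered
        keep = {a, b}
        if [c for c in u if c in keep] != [c for c in v if c in keep]:
            return False
    return all(u.count(a) == v.count(a) for a in alphabet)

# Example: a and b commute, but neither commutes with c.
I = {frozenset(("a", "b"))}
print(trace_equivalent("abc", "bac", I))  # True: only a and b were swapped
print(trace_equivalent("abc", "acb", I))  # False: b and c are dependent
```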

In theoretical computer science and mathematics, especially in the area of combinatorics on words, the Levi lemma states that, for all strings u, v, x and y, if uv = xy, then there exists a string w such that either u = xw and y = wv (when |u| ≥ |x|), or x = uw and v = wy (when |u| ≤ |x|). In other words, one of u and x is a prefix of the other, and w is the leftover piece.
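
A small worked instance (the strings are chosen purely for illustration):

```latex
u = ab,\quad v = c,\quad x = a,\quad y = bc:
\qquad uv = xy = abc, \ \text{and } w = b \ \text{satisfies } u = xw,\ y = wv .
```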

In mathematics and computer science, a history monoid is a way of representing the histories of concurrently running computer processes as a collection of strings, each string representing the individual history of a process. The history monoid provides a set of synchronization primitives for providing rendezvous points between a set of independently executing processes or threads.
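
As a small illustration (the two processes, their alphabets, and the letter names are invented for this example): take processes with alphabets A1 = {a, s} and A2 = {b, s}, where s is a shared synchronization letter. Each letter contributes only to the components of the processes that take part in it:

```latex
a \mapsto (a, \varepsilon), \qquad
b \mapsto (\varepsilon, b), \qquad
s \mapsto (s, s),
```

so the global word abs is represented by the pair (as, bs): each component records only the events of its own process, and the shared letter s appears in both, serving as the rendezvous point.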

Combinatorics on words is a fairly new field of mathematics, branching from combinatorics, that focuses on the study of words and formal languages. The subject looks at letters or symbols, and the sequences they form. Combinatorics on words affects various areas of mathematical study, including algebra and computer science. There has been a wide range of contributions to the field. Some of the first work was on square-free words by Axel Thue in the early 1900s. He and colleagues observed patterns within words and tried to explain them. As time went on, combinatorics on words became useful in the study of algorithms and coding, and it has led to developments in abstract algebra and to the resolution of open questions.

Grzegorz Rozenberg

Grzegorz Rozenberg is a Polish and Dutch computer scientist.

A term graph is a representation of an expression in a formal language as a generalized graph whose vertices are terms. Term graphs are a more powerful form of representation than expression trees because they can represent not only common subexpressions but also cyclic/recursive subexpressions.
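
A minimal sketch of the sharing idea (the node representation here is invented for illustration): in a term graph for (x + y) * (x + y), the common subexpression is built once and referenced twice, whereas an expression tree would duplicate it.

```python
# A tiny term-graph node: a label plus references to child nodes.
class Node:
    def __init__(self, label, *children):
        self.label = label
        self.children = children

x, y = Node("x"), Node("y")
shared_sum = Node("+", x, y)                  # (x + y), built once
product = Node("*", shared_sum, shared_sum)   # both arguments share one node

assert product.children[0] is product.children[1]  # sharing, not copying
```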

Hans-Jörg Kreowski is a professor of computer science at the University of Bremen in northwestern Germany. His primary research area is theoretical computer science, with an emphasis on graph transformation, algebraic specification, and syntactic picture processing. He is also a member of the Forum of Computer Scientists for Peace and Social Responsibility (FIfF).

In mathematics, the Muller–Schupp theorem states that a finitely generated group G has context-free word problem if and only if G is virtually free. The theorem was proved by David Muller and Paul Schupp in 1983.

Hartmut Ehrig was a German computer scientist and professor of theoretical computer science and formal specification. He was a pioneer in algebraic specification of abstract data types, and in graph grammars.
