Extended ML

Paradigms: multi-paradigm: functional, imperative, modular
Family: ML: Standard ML
Designed by: S. Kahrs, D. Sannella, A. Tarlecki
Developer: University of Edinburgh
First appeared: 1985
Final release: 1.1 / 1999
Typing discipline: strong, static, inferred
Platform: IA-32, SPARC
OS: cross-platform: Linux, Solaris
Website: homepages.inf.ed.ac.uk/dts/eml [1]
Influenced by: ML, Standard ML

Extended ML is a general-purpose, high-level, wide-spectrum programming language based on ML and Standard ML, covering both program specification and implementation. It extends the syntax of ML with axioms, which need not be executable but can rigorously specify the behavior of a program. With this addition, the language supports stepwise refinement: development proceeds gradually from an initial formal specification to an executable Standard ML program, and the correctness of the final executable with respect to the original specification can then be established by proving the correctness of each refinement step. Extended ML is used for research into, and the teaching of, formal methods in program development and specification, as well as for research into automatic program verification.
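
As an illustration, an Extended ML signature can pair ordinary value declarations with axioms constraining their behavior. The following stack specification is a sketch written in the style of the EML literature rather than checked against the 1.1 release, so the concrete syntax may differ in detail; the axioms are specification only and are never executed.

    signature STACK = sig
      type 'a stack
      val empty : 'a stack
      val push  : 'a * 'a stack -> 'a stack
      val pop   : 'a stack -> 'a stack
      val top   : 'a stack -> 'a
      (* Axioms specify behavior without needing to be executable. *)
      axiom forall (x : 'a, s : 'a stack) => pop (push (x, s)) = s
      axiom forall (x : 'a, s : 'a stack) => top (push (x, s)) = x
    end

A refinement step would then replace the axioms with a concrete Standard ML structure and discharge a proof obligation that the structure satisfies them.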

Extended ML is not related to the programming language Extensible ML (beyond being similarly derived from ML), nor to the markup language Extensible Markup Language (XML).

Related Research Articles

In computer science, formal methods are mathematically rigorous techniques for the specification, development, analysis, and verification of software and hardware systems. The use of formal methods for software and hardware design is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis can contribute to the reliability and robustness of a design.

Software design is the process of conceptualizing how a software system will work before it is implemented or modified. Software design also refers to the direct result of the design process: the concepts of how the software will work, consisting of both design documentation and undocumented concepts.

Extensible programming is a term used in computer science to describe a style of computer programming that focuses on mechanisms to extend the programming language, compiler, and runtime system (environment). Extensible programming languages, supporting this style of programming, were an active area of work in the 1960s, but the movement was marginalized in the 1970s. Extensible programming has become a topic of renewed interest in the 21st century.

Refinement is a generic term of computer science that encompasses various approaches for producing correct computer programs and simplifying existing programs to enable their formal verification.
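
A minimal Standard ML sketch of a single refinement step, using hypothetical names: the predicate sorted plays the role of a (partial) specification, and insertion sort is one concrete program a refinement might arrive at, leaving the proof that sort always yields a sorted list as the correctness obligation.

    (* Specification side: what it means for an int list to be sorted. *)
    fun sorted []             = true
      | sorted [_]            = true
      | sorted (x :: y :: zs) = x <= y andalso sorted (y :: zs)

    (* Implementation side: insertion sort as one possible refinement. *)
    fun insert (x, [])      = [x]
      | insert (x, y :: ys) = if x <= y then x :: y :: ys
                              else y :: insert (x, ys)
    fun sort xs = foldr insert [] xs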

The notion of institution was created by Joseph Goguen and Rod Burstall in the late 1970s, in order to deal with the "population explosion among the logical systems used in computer science". The notion attempts to "formalize the informal" concept of logical system.

In computer science, an abstract state machine (ASM) is a state machine operating on states that are arbitrary data structures.

Cactus is an open-source problem-solving environment designed for scientists and engineers. Its modular structure enables parallel computation across different architectures and collaborative code development between different groups. Cactus originated in the academic research community, where it was developed and used over many years by a large international collaboration of physicists and computational scientists.

Rodney Martineau "Rod" Burstall FRSE is a British computer scientist and one of four founders of the Laboratory for Foundations of Computer Science at the University of Edinburgh.

In computing, subject-oriented programming is an object-oriented software paradigm in which the state (fields) and behavior (methods) of objects are not seen as intrinsic to the objects themselves, but are provided by various subjective perceptions ("subjects") of the objects. The term and concepts were first published in September 1993 in a conference paper which was later recognized as one of the three most influential papers presented at the conference between 1986 and 1996. As illustrated in that paper, an analogy is made with the contrast between the philosophical views of Plato and Kant with respect to the characteristics of "real" objects, but applied to software ones. For example, while we may all perceive a tree as having a measurable height, weight, leaf-mass, and so on, from the point of view of a bird the tree may also have measures of relative value for food or nesting purposes, and from the point of view of a tax assessor it may have a certain taxable value in a given year. Neither the bird's nor the tax assessor's additional state information need be seen as intrinsic to the tree; it is added by their perceptions, and on Kant's analysis the same may be true even of characteristics we think of as intrinsic.
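
A minimal Standard ML sketch of the idea, with hypothetical subjects: the tree itself carries only an identity, and each subject keeps the state it perceives in its own table rather than inside the tree.

    type tree = int  (* a tree is just a shared identity here *)

    (* The surveyor subject perceives a measurable height. *)
    structure Surveyor = struct
      val heights : (tree * real) list ref = ref []
      fun recordHeight (t, h) = heights := (t, h) :: !heights
    end

    (* The bird subject attaches its own, unrelated state to the same trees. *)
    structure Bird = struct
      val nestingValue : (tree * int) list ref = ref []
      fun rate (t, v) = nestingValue := (t, v) :: !nestingValue
    end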

A wide-spectrum language (WSL) is a programming language designed to be simultaneously a low-level and a high-level language—possibly a non-executable specification language. Wide-spectrum languages are designed to support a programming methodology based on program refinement.

In type theory, a refinement type is a type endowed with a predicate which is assumed to hold for any element of the refined type. Refinement types can express preconditions when used as function arguments or postconditions when used as return types: for instance, the type of a function which accepts natural numbers and returns natural numbers greater than 5 may be written as f : ℕ → {n : ℕ | n > 5}. Refinement types are thus related to behavioral subtyping.
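
Standard ML has no refinement types, but the flavor can be approximated with the usual smart-constructor encoding; in this sketch the predicate n > 5 is checked at run time behind an abstract type, whereas a genuine refinement type system would discharge it statically.

    (* Abstract type of integers greater than 5; hypothetical example. *)
    structure GreaterThan5 :> sig
      type t
      val make  : int -> t option  (* SOME only when the predicate holds *)
      val value : t -> int
    end = struct
      type t = int
      fun make n = if n > 5 then SOME n else NONE
      fun value n = n
    end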

Spec Explorer is a Model-Based Testing (MBT) tool from Microsoft. It extends the Visual Studio Integrated Development Environment with the ability to define a model describing the expected behavior of a software system. From these models, the tool can generate tests automatically for execution within Visual Studio's own testing framework, or many other unit testing frameworks.

In computing, algorithmic skeletons, or parallelism patterns, are a high-level parallel programming model for parallel and distributed computing.
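
A minimal Standard ML sketch with hypothetical combinators: farm and pipe show the shape of a skeleton interface. A real skeleton library would map these patterns onto parallel workers; here they run sequentially purely to illustrate the programming model.

    fun farm f xs   = List.map f xs  (* task farm: apply f to every item *)
    fun pipe (f, g) = g o f          (* two-stage pipeline *)

    (* Square each input and then increment, expressed as composed skeletons. *)
    val results = farm (pipe (fn x => x * x, fn x => x + 1)) [1, 2, 3, 4]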

Typestate analysis, sometimes called protocol analysis, is a form of program analysis employed in programming languages. It is most commonly applied to object-oriented languages. Typestates define valid sequences of operations that can be performed upon an instance of a given type. Typestates, as the name suggests, associate state information with variables of that type. This state information is used to determine at compile time which operations are valid to invoke on an instance of the type. Checks that would ordinarily be performed only at run time are instead carried out on the typestate information, which is updated to remain compatible with the new state of the object.
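
Typestate itself is an analysis concept, but a rough Standard ML approximation using phantom types (a hypothetical file API, not a real analysis) shows the intent: the type of a handle records its state, so reading a closed file is rejected at compile time.

    structure File :> sig
      type opened
      type closed
      type 'state file
      val openFile : string -> opened file
      val readLine : opened file -> string
      val close    : opened file -> closed file
    end = struct
      type opened = unit
      type closed = unit
      type 'state file = TextIO.instream  (* 'state is a phantom parameter *)
      fun openFile path = TextIO.openIn path
      fun readLine f = Option.getOpt (TextIO.inputLine f, "")
      fun close f = (TextIO.closeIn f; f)
    end

With this interface, readLine (File.close f) fails to type-check, which is the kind of ordering error a typestate analysis detects without requiring such an encoding.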

Bernhard Steffen is a German computer scientist and professor at TU Dortmund University, Germany. His research focuses on various facets of formal methods, ranging from program analysis and verification to workflow synthesis, test-based modeling, and machine learning.

Grigori Fursin is a British computer scientist, president of the non-profit cTuning foundation, a founding member of MLCommons, co-chair of the MLCommons Task Force on Automation and Reproducibility, and founder of cKnowledge. His research group created an open-source, machine-learning-based self-optimizing compiler, MILEPOST GCC, considered to be the first in the world. At the end of the MILEPOST project he established the cTuning foundation to crowdsource program optimisation and machine learning across diverse devices provided by volunteers. The foundation also developed the Collective Knowledge Framework to support open research. Since 2015 Fursin has led Artifact Evaluation at several ACM and IEEE computer systems conferences. He is also a founding member of the ACM taskforce on Data, Software, and Reproducibility in Publication.

Grigore Roșu is a computer science professor at the University of Illinois at Urbana-Champaign and a researcher in the Information Trust Institute. He is known for his contributions in runtime verification, the K framework, matching logic, and automated coinduction.

References

  1. "Extended ML". University of Edinburgh, Scotland. homepages.inf.ed.ac.uk/dts/eml