Ptolemy Project

Ptolemy II
Developer(s): University of California, Berkeley
Stable release: 11.0.1 / 2018-06-18
Operating system: Linux, Mac OS X, Windows
Type: Model-based design, visual programming language
License: BSD License
Website: ptolemy.berkeley.edu

The Ptolemy Project is an ongoing project aimed at modeling, simulating, and designing concurrent, real-time, embedded systems. The focus of the Ptolemy Project is on assembling concurrent components. The principal product of the project is the Ptolemy II model-based design and simulation tool. The Ptolemy Project is conducted in the Industrial Cyber-Physical Systems Center (iCyPhy) in the Department of Electrical Engineering and Computer Sciences of the University of California at Berkeley, and is directed by Prof. Edward A. Lee.

Concurrency (computer science)

In computer science, concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out of order or in partial order, without affecting the final outcome. This allows for parallel execution of the concurrent units, which can significantly improve overall speed of the execution in multi-processor and multi-core systems. In more technical terms, concurrency refers to the decomposability of a program, algorithm, or problem into order-independent or partially ordered components or units.
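The key property above is that order-independent units produce the same final outcome whether they run sequentially or in parallel. A minimal sketch in plain Python (the function and values are illustrative, not part of any Ptolemy API):

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    """An order-independent unit of work: no shared state, no side effects."""
    return n * n

nums = [1, 2, 3, 4]

# Sequential execution.
sequential = [square(n) for n in nums]

# Concurrent execution: because the units are independent, the final
# outcome is unchanged even if they execute out of order internally.
# (ThreadPoolExecutor.map still returns results in input order.)
with ThreadPoolExecutor(max_workers=4) as pool:
    concurrent = list(pool.map(square, nums))

# sequential == concurrent == [1, 4, 9, 16]
```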

In computer science, real-time computing (RTC), or reactive computing, describes hardware and software systems subject to a "real-time constraint", for example a bound on the time from an event to the system's response. Real-time programs must guarantee a response within specified time constraints, often referred to as "deadlines". The correctness of these systems depends on their temporal as well as their functional aspects. Real-time responses are often understood to be on the order of milliseconds, and sometimes microseconds. A system not specified as operating in real time cannot usually guarantee a response within any timeframe, although typical or expected response times may be given.
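The distinction between temporal and functional correctness can be sketched in a few lines of plain Python. The 10 ms deadline and the handler below are hypothetical, chosen only for illustration; a real real-time system would enforce the deadline in the scheduler, not check it after the fact:

```python
import time

DEADLINE_MS = 10.0  # hypothetical deadline, for illustration only

def respond(handler, event):
    """Run handler on event; report the result and whether the
    (illustrative) deadline was met. Functional correctness is the
    result; temporal correctness is the elapsed-time check."""
    start = time.monotonic()
    result = handler(event)
    elapsed_ms = (time.monotonic() - start) * 1000.0
    return result, elapsed_ms <= DEADLINE_MS

result, met_deadline = respond(lambda e: e * 2, 21)
# result == 42; met_deadline records whether the response was timely
```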

Embedded system

An embedded system is a computer system with a dedicated function within a larger mechanical or electrical system, often with real-time computing constraints, and often controlled by a real-time operating system (RTOS). It is embedded as part of a complete device, often including hardware and mechanical parts. Embedded systems control many devices in common use today. Ninety-eight percent of all microprocessors manufactured are used in embedded systems.

The key underlying principle in the project is the use of well-defined models of computation that govern the interaction between components. A major problem area being addressed is the use of heterogeneous mixtures of models of computation.[1]
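The idea of a model of computation governing component interaction can be illustrated with a minimal dataflow-style sketch. This is not Ptolemy II's actual API (Ptolemy II is a Java system with actors, ports, and directors); the class names below are invented for the example. The point is that the components only compute, while a separate "director" decides when they fire and how tokens move:

```python
from collections import deque

class Channel:
    """FIFO connecting two components. The model of computation,
    not the components, defines how tokens move through it."""
    def __init__(self):
        self.tokens = deque()
    def put(self, token):
        self.tokens.append(token)
    def get(self):
        return self.tokens.popleft()

class Ramp:
    """Source component: emits 0, 1, 2, ... on successive firings."""
    def __init__(self, output):
        self.output, self.count = output, 0
    def fire(self):
        self.output.put(self.count)
        self.count += 1

class Scale:
    """Transformer component: multiplies each input token by a factor."""
    def __init__(self, inp, factor, results):
        self.inp, self.factor, self.results = inp, factor, results
    def fire(self):
        self.results.append(self.inp.get() * self.factor)

# A dataflow-style "director" fires the components in a static schedule;
# a different model of computation could reuse the same components with
# different firing rules.
channel, results = Channel(), []
ramp, scale = Ramp(channel), Scale(channel, 10, results)
for _ in range(3):
    ramp.fire()
    scale.fire()
# results == [0, 10, 20]
```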

In computer science, and more specifically in computability theory and computational complexity theory, a model of computation is a model which describes how an output of a mathematical function is computed given an input. A model describes how units of computations, memories, and communications are organized. The computational complexity of an algorithm can be measured given a model of computation. Using a model allows studying the performance of algorithms independently of the variations that are specific to particular implementations and specific technology.

The project is named after Claudius Ptolemaeus, the 2nd-century Greek astronomer, mathematician, and geographer.

Ptolemy

Claudius Ptolemy was a Greco-Roman mathematician, astronomer, geographer and astrologer. He lived in the city of Alexandria in the Roman province of Egypt, wrote in Koine Greek, and held Roman citizenship. The 14th-century astronomer Theodore Meliteniotes gave his birthplace as the prominent Greek city Ptolemais Hermiou in the Thebaid. This attestation is quite late, however, and, according to Gerald Toomer, the translator of his Almagest into English, there is no reason to suppose he ever lived anywhere other than Alexandria. He died there around AD 168.

The Kepler Project, a community-driven collaboration among researchers at three other University of California campuses, has created the Kepler scientific workflow system, which is based on Ptolemy II.

University of California

The University of California (UC) is a public university system in the U.S. state of California. Under the California Master Plan for Higher Education, the University of California is a part of the state's three-system public higher education plan, which also includes the California State University system and the California Community Colleges System.

Kepler is a free software system for designing, executing, reusing, evolving, archiving, and sharing scientific workflows. Kepler's facilities provide process and data monitoring, provenance information, and high-speed data movement. Workflows in general, and scientific workflows in particular, are directed graphs where the nodes represent discrete computational components, and the edges represent paths along which data and results can flow between components. In Kepler, the nodes are called 'Actors' and the edges are called 'channels'. Kepler includes a graphical user interface for composing workflows in a desktop environment, a runtime engine for executing workflows within the GUI and independently from a command-line, and a distributed computing option that allows workflow tasks to be distributed among compute nodes in a computer cluster or computing grid. The Kepler system principally targets the use of a workflow metaphor for organizing computational tasks that are directed towards particular scientific analysis and modeling goals. Thus, Kepler scientific workflows generally model the flow of data from one step to another in a series of computations that achieve some scientific goal.
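The workflow-as-directed-graph idea above can be sketched in a few lines. This is not Kepler's API; the actor functions and channel list are invented for illustration. Nodes are computations, edges carry data, and a small runner executes each node once all of its upstream nodes have produced results:

```python
# A scientific workflow as a directed graph: nodes ("actors") are
# computations, edges ("channels") carry data between them.
# These three actors are hypothetical, chosen only for illustration.
actors = {
    "load":  lambda inputs: [1, 2, 3],
    "scale": lambda inputs: [x * 2 for x in inputs["load"]],
    "sum":   lambda inputs: sum(inputs["scale"]),
}
channels = [("load", "scale"), ("scale", "sum")]

def run_workflow(actors, channels):
    """Execute actors in dependency order, passing each actor the
    results of its upstream neighbors along the channels."""
    upstream = {name: [src for src, dst in channels if dst == name]
                for name in actors}
    done, results = set(), {}
    while len(done) < len(actors):
        for name, deps in upstream.items():
            if name not in done and all(d in done for d in deps):
                results[name] = actors[name]({d: results[d] for d in deps})
                done.add(name)
    return results

results = run_workflow(actors, channels)
# results["scale"] == [2, 4, 6]; results["sum"] == 12
```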

Edward A. Lee

Edward Ashford Lee is a Puerto-Rican-American computer scientist, electrical engineer, and author. He is Professor of the Graduate School and Robert S. Pepper Distinguished Professor Emeritus in the Electrical Engineering and Computer Science (EECS) Department at UC Berkeley. Lee works in the areas of cyber-physical systems, embedded systems, and the semantics of programming languages. He is particularly known for his advocacy of deterministic models for the engineering of cyber-physical systems.

References

  1. Eker, Johan; Janneck, Jorn; Lee, Edward A.; Liu, Jie; Liu, Xiaojun; Ludvig, Jozef; Sachs, Sonia; Xiong, Yuhong (January 2003). "Taming heterogeneity - the Ptolemy approach". Proceedings of the IEEE. 91 (1): 127–144. CiteSeerX 10.1.1.4.9905. doi:10.1109/JPROC.2002.805829. Retrieved 2011-02-04.