A segment of a system variable in computing shows a homogeneous status of system dynamics over a time period. Here, a homogeneous status of a variable is a state which can be described by a set of coefficients of a formula. For example, homogeneous statuses include a constant status ('ON' for a switch) and a linear status (a speed of 60 miles, or 96 km, per hour). Mathematically, a segment is a function mapping from a set of times, which can be defined by a real interval, to the value set $Z$ of the variable [Zeigler76], [ZPK00], [Hwang13]. A trajectory of a system variable is a concatenated sequence of segments. We call a trajectory constant (respectively linear) if its concatenated segments are constant (respectively linear).
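As an informal illustration, here is a minimal Python sketch of segments and trajectories; the `Segment` class and its polynomial-coefficient encoding are illustrative choices of ours, not notation from the cited references:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """A segment: a function over a real time interval [t0, t1],
    described by the coefficients of a polynomial formula."""
    t0: float
    t1: float
    coeffs: tuple  # (c0, c1, ...) encodes c0 + c1*t + ...

    def value(self, t: float) -> float:
        """Evaluate the segment's formula at a time t inside [t0, t1]."""
        assert self.t0 <= t <= self.t1
        return sum(c * t**k for k, c in enumerate(self.coeffs))

# A constant segment (switch held 'ON', encoded as 1.0) ...
on = Segment(0.0, 5.0, (1.0,))
# ... and a linear segment (position of a car moving at 60 mph).
position = Segment(0.0, 2.0, (0.0, 60.0))

# A trajectory is segments concatenated end to end: drive, then stop.
trajectory = [position, Segment(2.0, 5.0, (120.0,))]
```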
An event segment is a special class of constant segment, with the constraint that the segment is either a timed event or a null segment. Event segments are used to define timed event systems such as DEVS, timed automata, and timed Petri nets.
The time base of the systems concerned is denoted by $\mathbb{T}$, and is defined as the set of non-negative real numbers: $\mathbb{T} = [0, \infty)$.
An event is a label that abstracts a change. Given an event set $Z$, the null event, denoted by $\epsilon \notin Z$, stands for 'no change'.
A timed event is a pair $(z, t)$, where $z \in Z$ and $t \in \mathbb{T}$, denoting that an event $z$ occurs at time $t$.
The null segment over a time interval $[t_l, t_u] \subseteq \mathbb{T}$ is denoted by $\epsilon_{[t_l, t_u]}$, which means that nothing in $Z$ occurs over $[t_l, t_u]$.
A unit event segment is either a null event segment or a timed event.
Given an event set $Z$, the concatenation of two unit event segments $\omega$ over $[t_1, t_2]$ and $\omega'$ over $[t_3, t_4]$ is denoted by $\omega\omega'$; its time interval is $[t_1, t_4]$, and the concatenation implies $t_2 = t_3$.
An event trajectory $(z_1, t_1)(z_2, t_2) \cdots (z_n, t_n)$ over an event set $Z$ and a time interval $[t_l, t_u]$ is the concatenation of the unit event segments $\epsilon_{[t_l, t_1]}, (z_1, t_1), \epsilon_{[t_1, t_2]}, (z_2, t_2), \ldots, (z_n, t_n)$, and $\epsilon_{[t_n, t_u]}$, where $t_l \le t_1 \le t_2 \le \cdots \le t_n \le t_u$.
Mathematically, an event trajectory is a mapping $\omega$ from a time period $[t_l, t_u]$ to the event set $Z \cup \{\epsilon\}$, so we can write it in function form: $\omega : [t_l, t_u] \rightarrow Z \cup \{\epsilon\}$.
The universal timed language $\Omega_{Z, [t_l, t_u]}$ over an event set $Z$ and a time interval $[t_l, t_u]$ is the set of all event trajectories over $Z$ and $[t_l, t_u]$.
A timed language $L$ over an event set $Z$ and a time interval $[t_l, t_u]$ is a set of event trajectories over $Z$ and $[t_l, t_u]$, i.e., $L \subseteq \Omega_{Z, [t_l, t_u]}$.
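The definitions above translate directly into code. The following Python sketch models unit event segments and views an event trajectory as a map from a time interval to $Z \cup \{\epsilon\}$; all class and function names are illustrative, not notation from the cited references:

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class TimedEvent:
    """A timed event (z, t): event z from the event set Z occurs at time t."""
    z: str
    t: float

@dataclass
class NullSegment:
    """The null segment over [tl, tu]: no event in Z occurs on this interval."""
    tl: float
    tu: float

UnitEventSegment = Union[TimedEvent, NullSegment]

def as_function(trajectory, t):
    """View an event trajectory as a map from [tl, tu] to Z union {epsilon}."""
    for seg in trajectory:
        if isinstance(seg, TimedEvent) and seg.t == t:
            return seg.z
    return None  # None stands for the null event epsilon

# Event trajectory over Z = {'a', 'b'} and [0, 10]:
# epsilon over [0,2], ('a',2), epsilon over [2,7], ('b',7), epsilon over [7,10].
omega = [NullSegment(0, 2), TimedEvent('a', 2), NullSegment(2, 7),
         TimedEvent('b', 7), NullSegment(7, 10)]
print(as_function(omega, 2.0))  # 'a'
print(as_function(omega, 5.0))  # None, i.e. the null event epsilon
```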
In mathematics, computable numbers are the real numbers that can be computed to within any desired precision by a finite, terminating algorithm. They are also known as the recursive numbers, the effective numbers, the computable reals, or the recursive reals.
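For instance, $\sqrt{2}$ is computable: for each requested precision, a finite bisection terminates with an approximation within that precision. A minimal Python sketch (the function name is ours):

```python
from fractions import Fraction

def sqrt2(n):
    """Approximate the computable number sqrt(2) to within 2**-n by
    bisection: a finite, terminating algorithm for each requested n."""
    lo, hi = Fraction(1), Fraction(2)
    while hi - lo > Fraction(1, 2**n):
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid  # sqrt(2) lies in the upper half
        else:
            hi = mid  # sqrt(2) lies in the lower half
    return lo

print(float(sqrt2(20)))  # 1.41421..., within 2**-20 of sqrt(2)
```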
In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of different possible outcomes for an experiment. It is a mathematical description of a random phenomenon in terms of its sample space and the probabilities of events.
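A minimal discrete example in Python, using a fair die; the names `sample_space` and `pmf` are illustrative:

```python
# The distribution of one fair six-sided die: the sample space and the
# probability assigned to each outcome.
sample_space = [1, 2, 3, 4, 5, 6]
pmf = {outcome: 1/6 for outcome in sample_space}

# Probability of the event "roll is even" = sum over its outcomes.
p_even = sum(pmf[x] for x in sample_space if x % 2 == 0)
print(p_even)  # 0.5
```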
In mathematics, the Lucas–Lehmer test (LLT) is a primality test for Mersenne numbers. The test was originally developed by Édouard Lucas in 1876 and subsequently improved by Derrick Henry Lehmer in the 1930s.
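A compact Python rendering of the standard recurrence ($s_0 = 4$, $s_{k+1} = s_k^2 - 2 \bmod M_p$; $M_p$ is prime iff $s_{p-2} \equiv 0$):

```python
def lucas_lehmer(p):
    """Lucas-Lehmer primality test for the Mersenne number M_p = 2**p - 1.
    Assumes p is an odd prime."""
    m = 2**p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m  # the Lucas-Lehmer recurrence, reduced mod M_p
    return s == 0

print([p for p in (3, 5, 7, 11, 13, 17, 19) if lucas_lehmer(p)])
# [3, 5, 7, 13, 17, 19] -- M_11 = 2047 = 23 * 89 is composite
```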
In probability theory, the Borel–Kolmogorov paradox is a paradox relating to conditional probability with respect to an event of probability zero. It is named after Émile Borel and Andrey Kolmogorov.
The Arzelà–Ascoli theorem is a fundamental result of mathematical analysis giving necessary and sufficient conditions to decide whether every sequence of a given family of real-valued continuous functions defined on a closed and bounded interval has a uniformly convergent subsequence. The main condition is the equicontinuity of the family of functions. The theorem is the basis of many proofs in mathematics, including that of the Peano existence theorem in the theory of ordinary differential equations, Montel's theorem in complex analysis, and the Peter–Weyl theorem in harmonic analysis and various results concerning compactness of integral operators.
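For reference, one common statement of the theorem (the phrasing here is ours):

```latex
% Theorem (Arzela--Ascoli): a family F of real-valued continuous
% functions on [a, b] has the property that every sequence in F admits
% a uniformly convergent subsequence if and only if
% (i) F is uniformly bounded and (ii) F is equicontinuous:
\sup_{f \in F} \sup_{x \in [a, b]} |f(x)| < \infty,
\qquad
\forall \varepsilon > 0 \ \exists \delta > 0 : \; |x - y| < \delta
\implies |f(x) - f(y)| < \varepsilon \ \ \forall f \in F.
```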
In econometrics, the autoregressive conditional heteroskedasticity (ARCH) model is a statistical model for time series data that describes the variance of the current error term or innovation as a function of the actual sizes of the previous time periods' error terms; often the variance is related to the squares of the previous innovations. The ARCH model is appropriate when the error variance in a time series follows an autoregressive (AR) model; if an autoregressive moving average (ARMA) model is assumed for the error variance, the model is a generalized autoregressive conditional heteroskedasticity (GARCH) model.
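A minimal simulation of an ARCH(1) process in Python; the parameter values `a0`, `a1` and the function name are illustrative:

```python
import random

def simulate_arch1(n, a0=0.2, a1=0.5, seed=0):
    """Simulate an ARCH(1) process: e_t = sigma_t * z_t with
    sigma_t^2 = a0 + a1 * e_{t-1}^2 and z_t standard normal."""
    rng = random.Random(seed)
    e_prev, out = 0.0, []
    for _ in range(n):
        sigma2 = a0 + a1 * e_prev ** 2  # variance driven by last innovation
        e_prev = (sigma2 ** 0.5) * rng.gauss(0, 1)
        out.append(e_prev)
    return out

series = simulate_arch1(1000)  # exhibits the familiar volatility clustering
```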
In mathematics, more specifically in dynamical systems, the method of averaging exploits systems containing a separation of time scales: a fast oscillation versus a slow drift. It suggests that we perform an averaging over a given amount of time in order to iron out the fast oscillations and observe the qualitative behavior of the resulting dynamics. The approximate solution holds for a finite time, inversely proportional to the parameter denoting the slow time scale. There is thus a customary trade-off between how good the approximate solution is and for how long it stays close to the original solution.
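In the standard periodic setting this reads as follows (a common textbook formulation, not tied to a particular source):

```latex
% Standard periodic form: slow drift x driven by fast T-periodic forcing,
\dot{x} = \varepsilon f(x, t), \qquad 0 < \varepsilon \ll 1, \qquad
f(\cdot, t + T) = f(\cdot, t).
% Averaged system: replace f by its time average over one period,
\dot{y} = \varepsilon \bar{f}(y), \qquad
\bar{f}(y) = \frac{1}{T} \int_0^T f(y, s)\, ds,
% for which the averaging theorem gives, for some constant C > 0,
\| x(t) - y(t) \| = O(\varepsilon) \quad \text{for } 0 \le t \le C/\varepsilon.
```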
In mathematics, specifically in symplectic geometry, the symplectic cut is a geometric modification on symplectic manifolds. Its effect is to decompose a given manifold into two pieces. There is an inverse operation, the symplectic sum, that glues two manifolds together into one. The symplectic cut can also be viewed as a generalization of symplectic blow up. The cut was introduced in 1995 by Eugene Lerman, who used it to study the symplectic quotient and other operations on manifolds.
In mathematics, Bochner spaces are a generalization of the concept of $L^p$ spaces to functions whose values lie in a Banach space which is not necessarily the space $\mathbb{R}$ or $\mathbb{C}$ of real or complex numbers.
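Concretely, for $1 \le p < \infty$ the Bochner norm replaces the absolute value in the usual $L^p$ norm with the Banach-space norm (a standard definition, written out here for reference):

```latex
% For a measure space (Omega, mu), a Banach space X, and 1 <= p < infty,
% L^p(mu; X) consists of the strongly measurable f : Omega -> X with
\| f \|_{L^p(\mu; X)} =
\left( \int_\Omega \| f(\omega) \|_X^p \, d\mu(\omega) \right)^{1/p} < \infty.
```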
In mathematics, the Melnikov method is a tool to identify the existence of chaos in a class of dynamical systems under periodic perturbation.
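In one common planar formulation (assuming a system $\dot{q} = f(q) + \varepsilon g(q, t)$ with $T$-periodic $g$ and an unperturbed homoclinic orbit $q_0(t)$), the Melnikov function is:

```latex
% The Melnikov function measures the splitting of the stable and
% unstable manifolds along the unperturbed homoclinic orbit q_0(t):
M(t_0) = \int_{-\infty}^{\infty} f(q_0(t)) \wedge g(q_0(t), t + t_0)\, dt,
% with a \wedge b = a_1 b_2 - a_2 b_1. Simple zeros of M indicate
% transverse homoclinic intersections and hence chaotic dynamics.
```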
The behavior of a given DEVS model is a set of sequences of timed events, including null events, called event segments, which make the model move from one state to another within a set of legal states. To define it this way, the concepts of a set of illegal states as well as a set of legal states need to be introduced.
The General System has been described in [Zeigler76] and [ZPK00] from the standpoint of defining (1) the time base, (2) the admissible input segments, (3) the system states, (4) the state trajectory with an admissible input segment, and (5) the output for a given state.
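A minimal Python sketch of these five standpoints as an abstract interface; the class and method names are ours, not the notation of [Zeigler76] or [ZPK00]:

```python
from abc import ABC, abstractmethod

class GeneralSystem(ABC):
    """Sketch of the five standpoints above, as an abstract interface."""

    time_base = "non-negative reals"  # (1) the time base

    @abstractmethod
    def is_admissible(self, input_segment) -> bool:
        """(2) Decide whether an input segment is admissible."""

    @abstractmethod
    def states(self):
        """(3) The set of system states."""

    @abstractmethod
    def trajectory(self, state, input_segment):
        """(4) The state trajectory under an admissible input segment."""

    @abstractmethod
    def output(self, state):
        """(5) The output for a given state."""
```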
In theoretical computer science, DEVS is closed under coupling [Zeigler84], [ZPK00]. In other words, given a coupled DEVS model $N$, its behavior is described as an atomic DEVS model $M$. For a given coupled DEVS $N$, once we have an equivalent atomic DEVS $M$, the behavior of $N$ can be referred to the behavior of the atomic DEVS $M$, which is based on the Timed Event System.
In probability theory, conditional probability is a measure of the probability of an event occurring given that another event has already occurred. If the event of interest is A and the event B is known or assumed to have occurred, "the conditional probability of A given B", or "the probability of A under the condition B", is usually written as $P(A \mid B)$ or occasionally $P_B(A)$. This can also be understood as the fraction of the probability of B that intersects with A: $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$.
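A small Python check of this definition on two fair dice; the events A and B are illustrative:

```python
from itertools import product

# Two fair dice: A = "sum is 8", B = "first die shows 5".
outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely pairs
A = {o for o in outcomes if sum(o) == 8}
B = {o for o in outcomes if o[0] == 5}

p = lambda E: len(E) / len(outcomes)
p_A_given_B = p(A & B) / p(B)  # P(A|B) = P(A intersect B) / P(B)
print(p_A_given_B)  # 1/6: only (5, 3) works among the six outcomes in B
```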
The narrow escape problem is a ubiquitous problem in biology, biophysics and cellular biology.
In functional analysis, the Fréchet–Kolmogorov theorem gives a necessary and sufficient condition for a set of functions to be relatively compact in an Lp space. It can be thought of as an Lp version of the Arzelà–Ascoli theorem, from which it can be deduced. The theorem is named after Maurice René Fréchet and Andrey Kolmogorov.
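One common statement for $L^p(\mathbb{R}^n)$, $1 \le p < \infty$ (the phrasing here is ours): a bounded set $B \subset L^p(\mathbb{R}^n)$ is relatively compact if and only if

```latex
% translations act uniformly continuously on B, and no mass escapes
% to infinity, uniformly over f in B:
\lim_{|h| \to 0} \sup_{f \in B} \| f(\cdot + h) - f \|_{L^p} = 0,
\qquad
\lim_{r \to \infty} \sup_{f \in B} \| f \|_{L^p(\{ |x| > r \})} = 0.
```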
Buchholz's psi-functions are a hierarchy of single-argument ordinal functions introduced by the German mathematician Wilfried Buchholz in 1986. These functions are a simplified version of the $\theta$-functions, but nevertheless have the same strength as those. Later on, this approach was extended by Jäger and Schütte.
In the theory of stochastic processes, a subdiscipline of probability theory, filtrations are totally ordered collections of subsets that are used to model the information that is available at a given point and therefore play an important role in the formalization of random processes.
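Formally (a standard definition, stated here for reference), a filtration indexed by time is an increasing family of sub-sigma-algebras:

```latex
% A filtration on a probability space (Omega, F, P), indexed by time t:
\mathcal{F}_s \subseteq \mathcal{F}_t \subseteq \mathcal{F}
\qquad \text{whenever } s \le t,
% where each F_t collects the events observable up to time t.
```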
Hybrid stochastic simulations are a sub-class of stochastic simulations. These simulations combine existing stochastic simulations with other stochastic simulations or algorithms. Generally, they are used for physics and physics-related research. The goal of a hybrid stochastic simulation varies with context; however, they typically aim either to improve accuracy or to reduce computational complexity. The first hybrid stochastic simulation was developed in 1985.
In mathematics, calculus on Euclidean space is a generalization of calculus of functions in one or several variables to calculus of functions on Euclidean space as well as on a finite-dimensional real vector space. This calculus is also known as advanced calculus, especially in the United States. It is similar to multivariable calculus but is somewhat more sophisticated in that it uses linear algebra more extensively and covers some concepts from differential geometry such as differential forms and Stokes' formula in terms of differential forms. This extensive use of linear algebra also allows a natural generalization of multivariable calculus to calculus on Banach spaces or topological vector spaces.