In mathematics and theoretical computer science, a pattern is an unavoidable pattern if it is unavoidable on any finite alphabet.
Like a word, a pattern (also called term) is a sequence of symbols over some alphabet.
The minimum multiplicity of the pattern $p$ is $m(p) = \min_{x \in \Sigma(p)} |p|_x$, where $\Sigma(p)$ is the set of symbols occurring in $p$ and $|p|_x$ is the number of occurrences of symbol $x$ in pattern $p$. In other words, it is the number of occurrences in $p$ of the least frequently occurring symbol in $p$.
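A one-function sketch of this quantity, with the pattern given as a plain Python string of symbols (the function name is illustrative):

```python
from collections import Counter

def min_multiplicity(pattern: str) -> int:
    """Number of occurrences of the least frequent symbol in the pattern."""
    counts = Counter(pattern)       # occurrences of each symbol
    return min(counts.values())     # the minimum multiplicity m(p)

# Example: in "xyxxy", x occurs 3 times and y occurs 2 times, so m(p) = 2.
print(min_multiplicity("xyxxy"))    # -> 2
```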
Given finite alphabets $E$ and $A$, a word $w \in A^*$ is an instance of the pattern $p \in E^*$ if there exists a non-erasing semigroup morphism $f : E^* \to A^*$ such that $f(p) = w$, where $A^*$ denotes the Kleene star of $A$. Non-erasing means that $f(x) \neq \varepsilon$ for all $x \in E$, where $\varepsilon$ denotes the empty string.
A word $w$ is said to match, or encounter, a pattern $p$ if a factor (also called subword or substring) of $w$ is an instance of $p$. Otherwise, $w$ is said to avoid $p$, or to be $p$-free. This definition can be generalized to the case of an infinite $w$, based on a generalized definition of "substring".
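A brute-force sketch of these two definitions, assuming words and patterns are represented as Python strings (function names are illustrative). `is_instance` searches over all ways to split the word into non-empty blocks, one per pattern position, requiring equal pattern symbols to receive equal blocks; `matches` then tests every factor:

```python
def is_instance(word: str, pattern: str) -> bool:
    """True if word = f(pattern) for some non-erasing morphism f."""
    def solve(i: int, j: int, image: dict) -> bool:
        # i indexes the pattern, j indexes the word
        if i == len(pattern):
            return j == len(word)
        x = pattern[i]
        if x in image:                         # symbol already bound: its block must follow
            block = image[x]
            return word.startswith(block, j) and solve(i + 1, j + len(block), image)
        for k in range(j + 1, len(word) + 1):  # try every non-empty block for x
            if solve(i + 1, k, {**image, x: word[j:k]}):
                return True
        return False
    return solve(0, 0, {})

def matches(word: str, pattern: str) -> bool:
    """True if some factor of word is an instance of pattern."""
    return any(is_instance(word[i:j], pattern)
               for i in range(len(word))
               for j in range(i + 1, len(word) + 1))

print(matches("abcabc", "xx"))    # True: "abcabc" itself is an instance of xx
print(matches("abcbabc", "xyx"))  # True
print(matches("abc", "xx"))       # False: "abc" avoids xx (it is square-free)
```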
A pattern $p$ is unavoidable on a finite alphabet $A$ if every sufficiently long word over $A$ must match $p$; formally: if $\exists n \in \mathbb{N}\ \forall w \in A^* : (|w| \geq n \Rightarrow w \text{ matches } p)$. Otherwise, $p$ is avoidable on $A$, which implies there exist infinitely many words over the alphabet $A$ that avoid $p$.
By Kőnig's lemma, pattern $p$ is avoidable on $A$ if and only if there exists an infinite word over $A$ that avoids $p$. [1]
Given a pattern $p$ and an alphabet $A$, a $p$-free word $w \in A^*$ is a maximal $p$-free word over $A$ if $wa$ and $aw$ match $p$ for all $a \in A$. For example, $aba$ is a maximal $xx$-free (square-free) word over $\{a, b\}$, since $abaa$, $abab$, $aaba$ and $baba$ all contain squares.
A pattern $p$ is an unavoidable pattern (also called a blocking term) if $p$ is unavoidable on every finite alphabet.
If a pattern is said to be unavoidable without reference to a specific alphabet, it is understood to be unavoidable on every finite alphabet. Conversely, if a pattern is said to be avoidable without reference to a specific alphabet, it is understood to be avoidable on some finite alphabet.
A pattern $p$ is $k$-avoidable if $p$ is avoidable on an alphabet $A$ of size $k$. Otherwise, $p$ is $k$-unavoidable, which means $p$ is unavoidable on every alphabet of size $k$. [2]
If pattern $p$ is $k$-avoidable, then $p$ is $g$-avoidable for all $g \geq k$, since any word over an alphabet of size $k$ is also a word over an alphabet of size $g$.
Given a finite set $S$ of avoidable patterns, there exists an infinite word $w$ such that $w$ avoids all patterns of $S$. [1] Let $\mu(S)$ denote the size of the smallest alphabet over which some infinite word avoids all patterns of $S$.
The avoidability index of a pattern $p$ is the smallest $k$ such that $p$ is $k$-avoidable, and $\infty$ if $p$ is unavoidable. [1]
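For example, the square pattern $xx$ is 2-unavoidable but 3-avoidable (Thue), so its avoidability index is 3. A small brute-force sketch of both facts (plain Python; names are illustrative):

```python
from itertools import product

def has_square(w: str) -> bool:
    """True if w matches the pattern xx, i.e. contains a factor uu with u non-empty."""
    return any(w[i:i + L] == w[i + L:i + 2 * L]
               for L in range(1, len(w) // 2 + 1)
               for i in range(len(w) - 2 * L + 1))

# xx is 2-unavoidable: every binary word of length 4 contains a square.
assert all(has_square("".join(w)) for w in product("ab", repeat=4))

# xx is 3-avoidable: a depth-first search extends a square-free ternary word.
def extend(w, target):
    if len(w) == target:
        return w
    for c in "abc":
        if not has_square(w + c):   # re-check the whole extended word for simplicity
            result = extend(w + c, target)
            if result is not None:
                return result
    return None

print(extend("", 50))   # a square-free word of length 50 over {a, b, c}
# Hence the avoidability index of xx is 3.
```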
Given an alphabet $\{x_1, x_2, \ldots, x_n, \ldots\}$, the Zimin words (patterns) are defined recursively by $Z_1 = x_1$ and $Z_{n+1} = Z_n x_{n+1} Z_n$ for $n \in \mathbb{Z}^{+}$.
All Zimin words are unavoidable. [4]
A word is unavoidable if and only if it is a factor of a Zimin word. [4]
Given a finite alphabet $A$ with $|A| = q$, let $f(q, n)$ denote the smallest $m$ such that every word $w \in A^m$ matches the Zimin word $Z_n$; a number of bounds on $f$ are known. [5]
$Z_n$ is the longest unavoidable pattern that can be constructed over the alphabet $\{x_1, x_2, \ldots, x_n\}$, since $|Z_n| = 2^n - 1$.
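A minimal sketch of the recursion, representing the variables $x_1, x_2, \ldots$ by the integers $1, 2, \ldots$ (plain Python):

```python
def zimin(n):
    """Zimin word Z_n over the symbols 1..n, via Z_1 = 1 and Z_{k+1} = Z_k (k+1) Z_k."""
    z = [1]
    for k in range(2, n + 1):
        z = z + [k] + z
    return z

for n in range(1, 5):
    z = zimin(n)
    print(n, z, len(z))   # lengths are 2**n - 1: 1, 3, 7, 15
```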
Given a pattern $p$ over some alphabet $E$, we say that $x \in E$ is free for $p$ if there exist subsets $B_1$, $B_2$ of $E$ such that the following hold:
1. $x \in B_1$ and $x \in B_2$;
2. for every factor $uv$ of $p$ of length two, $u \in B_1$ if and only if $v \notin B_2$.
For example, let $p = xyx$; then $x$ is free for $p$, since the subsets $B_1 = B_2 = \{x\}$ satisfy the conditions above.
A pattern $p$ reduces to a pattern $q$ if there exists a symbol $x$ such that $x$ is free for $p$, and $q$ can be obtained by removing all occurrences of $x$ from $p$. Denote this relation by $p \rightarrow q$.
For example, let $p = xyx$; then $p$ can reduce to $q = y$, since $x$ is free for $p$.
A word (pattern) $p$ is said to be locked if $p$ has no free letter; hence $p$ cannot be reduced. [6]
Given patterns $p$, $q$ and $r$, if $p$ reduces to $q$ and $q$ reduces to $r$, then $p$ reduces to $r$; that is, reduction is transitive. Denote this iterated reduction relation by $p \stackrel{*}{\rightarrow} r$.
A pattern $p$ is unavoidable if and only if $p$ reduces to a word of length one; hence there exists a symbol $x$ such that $p \stackrel{*}{\rightarrow} x$ and $|x| = 1$. [7] [4]
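This characterization yields a decision procedure for unavoidability. The sketch below (plain Python; names are illustrative) tests freeness through an equivalent reformulation via the pattern's adjacency graph — the bipartite graph with a left and a right copy of each symbol and an edge $u_L v_R$ for every length-two factor $uv$ of $p$, in which $x$ is free exactly when no path joins $x_L$ to $x_R$ — and then searches over all sequences of free-letter deletions:

```python
from collections import deque
from functools import lru_cache

def is_free(p: str, x: str) -> bool:
    """x is free for p: no path from the left copy of x to its right copy
    in the bipartite adjacency graph of p (edge u_L -- v_R for each factor uv)."""
    edges = {(p[i], p[i + 1]) for i in range(len(p) - 1)}
    seen, queue = {("L", x)}, deque([("L", x)])
    while queue:
        side, a = queue.popleft()
        if side == "L":
            nbrs = {("R", v) for (u, v) in edges if u == a}
        else:
            nbrs = {("L", u) for (u, v) in edges if v == a}
        for node in nbrs - seen:
            seen.add(node)
            queue.append(node)
    return ("R", x) not in seen

@lru_cache(maxsize=None)
def is_unavoidable(p: str) -> bool:
    """p is unavoidable iff some sequence of free-letter deletions reaches length one."""
    if len(set(p)) <= 1:
        return len(p) <= 1          # a single symbol is unavoidable; "xx", "xxx", ... are locked
    return any(is_unavoidable(p.replace(x, ""))
               for x in set(p) if is_free(p, x))

print(is_unavoidable("xyx"))      # True  (this is the Zimin word Z_2)
print(is_unavoidable("xx"))       # False (squares are avoidable)
print(is_unavoidable("xyzx"))     # True
```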
Given a simple graph $G = (V, E)$, an edge coloring $c$ of $G$ matches a pattern $p$ if there exists a simple path $P = (e_1, e_2, \ldots, e_n)$ in $G$ such that the sequence of colors $c(e_1)c(e_2)\cdots c(e_n)$ matches $p$. Otherwise, $c$ is said to avoid $p$, or to be $p$-free.
Similarly, a vertex coloring $c$ of $G$ matches a pattern $p$ if there exists a simple path $P = (v_1, v_2, \ldots, v_n)$ in $G$ such that the sequence of colors $c(v_1)c(v_2)\cdots c(v_n)$ matches $p$.
The pattern chromatic number $\pi_p(G)$ is the minimum number of distinct colors needed for a $p$-free vertex coloring of the graph $G$.
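A brute-force sketch of this definition for small graphs, specialized to the square pattern $xx$ for brevity (plain Python; the graph is an adjacency list, the square test is the same as in the earlier sketch, and names are illustrative). It recovers the fact that a path on four vertices already needs three colors for a square-free (nonrepetitive) vertex coloring:

```python
from itertools import product

def simple_paths(adj):
    """All simple paths (as vertex tuples) in an undirected graph given as adjacency lists."""
    paths = []
    def grow(path):
        paths.append(tuple(path))
        for u in adj[path[-1]]:
            if u not in path:
                grow(path + [u])
    for v in adj:
        grow([v])
    return paths

def has_square(word):
    """word matches the pattern xx, i.e. contains a factor uu with u non-empty."""
    return any(word[i:i + L] == word[i + L:i + 2 * L]
               for L in range(1, len(word) // 2 + 1)
               for i in range(len(word) - 2 * L + 1))

def pattern_chromatic_number_xx(adj):
    """Smallest k admitting a vertex coloring with no path colored as a square."""
    vertices, paths = list(adj), simple_paths(adj)
    for k in range(1, len(vertices) + 1):
        for colors in product(range(k), repeat=len(vertices)):
            coloring = dict(zip(vertices, colors))
            if not any(has_square(tuple(coloring[v] for v in path)) for path in paths):
                return k
    return len(vertices)

# Path graph on 4 vertices: 1 - 2 - 3 - 4
P4 = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(pattern_chromatic_number_xx(P4))   # 3: no binary word of length 4 is square-free
```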
Let $\pi_p(\Delta) = \max\{\pi_p(G) : G \in \mathcal{G}_\Delta\}$, where $\mathcal{G}_\Delta$ is the set of all simple graphs with maximum degree at most $\Delta$.
Similarly, $\pi'_p(G)$ and $\pi'_p(\Delta)$ are defined for edge colorings.
A pattern $p$ is avoidable on graphs if $\pi_p(\Delta)$ is bounded by a constant $c_\Delta$, where $c_\Delta$ depends only on $\Delta$.
There exists an absolute constant $c$ such that $\pi_p(\Delta) \leq c\Delta^2$ for all patterns $p$ with minimum multiplicity $m(p) \geq 2$. [8]
Given a pattern $p$, let $k$ represent the number of distinct symbols of $p$. If $|p| \geq 2^k$, then $p$ is avoidable on graphs.
Given a pattern $p$ such that $|p|_x$ is even for every symbol $x$ occurring in $p$, the pattern chromatic number $\pi_p(K_n)$ grows linearly with $n$, where $K_n$ is the complete graph on $n$ vertices. [8]
Given a pattern $p$ and an arbitrary tree $T$, let $S$ be the set of all avoidable subpatterns of $p$ and their reflections. Then $\pi_p(T)$ is bounded in terms of $\mu(S)$. [8]
Given a pattern $p$ and a tree $T$ with maximum degree $\Delta$, let $S$ be the set of all avoidable subpatterns of $p$ and their reflections; then $\pi'_p(T)$ is bounded in terms of $\Delta$ and $\mu(S)$. [8]