A Boolean network consists of a discrete set of Boolean variables, each of which has a Boolean function (possibly different for each variable) assigned to it; the function takes inputs from a subset of those variables, and its output determines the state of the variable it is assigned to. This set of functions in effect determines a topology (connectivity) on the set of variables, which then become nodes in a network. Usually, the dynamics of the system is taken as a discrete time series in which the state of the entire network at time t+1 is determined by evaluating each variable's function on the state of the network at time t. This may be done synchronously or asynchronously. [1]
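As an illustration (not from the cited sources), here is a minimal Python sketch of one synchronous update step for a three-node network; the particular wiring and Boolean functions are arbitrary choices.

```python
# Minimal sketch of one synchronous update step for a three-node Boolean network.
# The wiring and the Boolean functions are arbitrary, illustrative choices.

# Each node's function reads the *previous* network state and returns the node's next value.
functions = {
    0: lambda s: s[1] and s[2],   # node 0 becomes the AND of nodes 1 and 2
    1: lambda s: not s[0],        # node 1 becomes the negation of node 0
    2: lambda s: s[0] or s[1],    # node 2 becomes the OR of nodes 0 and 1
}

def synchronous_step(state):
    """All nodes evaluate their functions on the same old state, then switch together."""
    return tuple(functions[i](state) for i in range(len(state)))

state = (True, False, True)
for t in range(5):
    print(t, state)
    state = synchronous_step(state)
```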
Boolean networks have been used in biology to model regulatory networks. Although Boolean networks are a crude simplification of genetic reality, where genes are not simple binary switches, there are several cases in which they correctly capture the pattern of expressed and suppressed genes. [2] [3] The seemingly mathematically simple (synchronous) model was only fully understood in the mid-2000s. [4]
A Boolean network is a particular kind of sequential dynamical system, where time and states are discrete, i.e. both the set of variables and the set of states in the time series each have a bijection onto an integer series.
A random Boolean network (RBN) is one that is randomly selected from the set of all possible Boolean networks of a particular size, N. One can then study statistically how the expected properties of such networks depend on various statistical properties of the ensemble of all possible networks. For example, one may study how the behavior of an RBN changes as the average connectivity is changed.
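A minimal sketch of how such an ensemble member might be generated: each of the N nodes is wired to K inputs chosen at random and given a random truth table. The helper names and the bias parameter p (the probability of an "off" output, as used further below) are illustrative choices, not a reference implementation.

```python
import random

def random_boolean_network(N, K, p=0.5, seed=None):
    """Sketch of an N-K random Boolean network: each node gets K randomly chosen
    inputs and a random truth table whose entries are 'off' (False) with probability p."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(N), K) for _ in range(N)]
    tables = [[rng.random() >= p for _ in range(2 ** K)] for _ in range(N)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update using the wiring and truth tables above."""
    next_state = []
    for ins, table in zip(inputs, tables):
        index = 0
        for source in ins:                    # encode the K input bits as an integer
            index = (index << 1) | state[source]
        next_state.append(table[index])
    return tuple(next_state)

inputs, tables = random_boolean_network(N=10, K=2, seed=1)
state = tuple(random.Random(2).random() < 0.5 for _ in range(10))
for t in range(5):
    state = step(state, inputs, tables)
print(state)
```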
The first Boolean networks were proposed by Stuart A. Kauffman in 1969 as random models of genetic regulatory networks, [5] but their mathematical understanding only started in the 2000s. [6] [7]
Since a Boolean network has only 2^N possible states, a trajectory will sooner or later reach a previously visited state, and thus, since the dynamics are deterministic, the trajectory will fall into a steady state or cycle called an attractor (though in the broader field of dynamical systems a cycle is only an attractor if perturbations from it lead back to it). If the attractor has only a single state it is called a point attractor, and if the attractor consists of more than one state it is called a cycle attractor. The set of states that lead to an attractor is called the basin of the attractor. States that occur only at the beginning of trajectories (no trajectories lead to them) are called garden-of-Eden states, [8] and the dynamics of the network flow from these states towards attractors. The time it takes to reach an attractor is called the transient time. [4]
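Because the dynamics are deterministic and the state space is finite, the attractor reached from a given initial state can be found by iterating until a state repeats. A sketch, reusing step, inputs, tables, and state from the previous example:

```python
def find_attractor(state, inputs, tables):
    """Follow the deterministic trajectory until a state repeats.
    Returns (transient_time, attractor_states)."""
    seen = {}                                   # state -> time of first visit
    t = 0
    while state not in seen:
        seen[state] = t
        state = step(state, inputs, tables)
        t += 1
    first_visit = seen[state]                   # the repeated state closes the cycle
    trajectory = sorted(seen, key=seen.get)
    attractor = trajectory[first_visit:]        # states on the cycle
    return first_visit, attractor               # transient time, attractor

transient, attractor = find_attractor(state, inputs, tables)
print("transient time:", transient, "attractor length:", len(attractor))
```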
With growing computer power and increasing understanding of the seemingly simple model, different authors gave different estimates for the mean number and length of the attractors; a brief summary of key publications follows. [9]
Author | Year | Mean attractor length | Mean attractor number | Comment |
---|---|---|---|---|
Kauffman [5] | 1969 | ∼ √N | ∼ √N | |
Bastolla/Parisi [10] | 1998 | faster than any power law in N | faster than any power law in N | first numerical evidence |
Bilke/Sjunnesson [11] | 2002 | | linear with system size, ∼ N | |
Socolar/Kauffman [12] | 2003 | | faster than linear, ∼ N^x with x > 1 | |
Samuelsson/Troein [13] | 2003 | | superpolynomial growth in N | mathematical proof |
Mihaljev/Drossel [14] | 2005 | faster than any power law in N | faster than any power law in N | |
In dynamical systems theory, the structure and length of the attractors of a network correspond to the dynamic phase of the network. The stability of Boolean networks depends on the connections of their nodes. A Boolean network can exhibit stable, critical or chaotic behavior. This phenomenon is governed by a critical value of the average number of connections of nodes (K_c), and can be characterized by the Hamming distance as distance measure. In the unstable regime, the distance between two initially close states on average grows exponentially in time, while in the stable regime it decreases exponentially. Here, "initially close states" means that the Hamming distance is small compared with the number of nodes (N) in the network.
For the N-K model [15] the network is stable if K < K_c, critical if K = K_c, and unstable if K > K_c.
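As a rough illustration of the two regimes (reusing the random_boolean_network and step helpers sketched above, with all the caveats of a small quenched simulation), one can flip a single node in a copy of the initial state and follow the normalized Hamming distance between the two trajectories:

```python
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def divergence(N, K, steps=20, seed=0):
    """Flip one node in a copy of a random initial state and track the normalized
    Hamming distance between the two trajectories on the same quenched network."""
    inputs, tables = random_boolean_network(N, K, p=0.5, seed=seed)
    rng = random.Random(seed + 1)
    a = tuple(rng.random() < 0.5 for _ in range(N))
    b = (not a[0],) + a[1:]                     # perturb a single node
    distances = []
    for _ in range(steps):
        distances.append(hamming(a, b) / N)
        a = step(a, inputs, tables)
        b = step(b, inputs, tables)
    return distances

# With p = 0.5 the critical connectivity is K_c = 2: for large N the distance
# tends to shrink for K = 1 (stable) and to grow for K = 4 (chaotic).
print(divergence(N=500, K=1)[-1], divergence(N=500, K=4)[-1])
```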
The state of a given node is updated according to its truth table, whose outputs are randomly populated. p_i denotes the probability of assigning an off output to a given series of input signals.
If p_i = p is constant for every node, the transition between the stable and chaotic range depends on p. According to Bernard Derrida and Yves Pomeau [16], the critical value of the average number of connections is K_c = 1/[2p(1 − p)]. For example, for unbiased functions with p = 1/2, this gives the classic critical connectivity K_c = 2.
If K_i is not constant and there is no correlation between the in-degrees and out-degrees, the condition of stability is determined by the mean in-degree ⟨K^in⟩. [17] [18] [19] The network is stable if ⟨K^in⟩ < K_c, critical if ⟨K^in⟩ = K_c, and unstable if ⟨K^in⟩ > K_c.
The conditions of stability are the same in the case of networks with scale-free topology, where the in- and out-degree distribution is a power-law distribution, P(K) ∝ K^(−γ), and ⟨K^in⟩ = ⟨K^out⟩, since every out-link from a node is an in-link to another. [20]
Sensitivity shows the probability that the output of the Boolean function of a given node changes if its input changes. For random Boolean networks, q_i = 2 p_i (1 − p_i). In the general case, the stability of the network is governed by the largest eigenvalue λ_Q of the matrix Q, where Q_ij = q_i A_ij and A is the adjacency matrix of the network. [21] The network is stable if λ_Q < 1, critical if λ_Q = 1, and unstable if λ_Q > 1.
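A small sketch of this criterion, using an arbitrary example adjacency matrix and bias values (the convention Q_ij = q_i A_ij follows the description above; the edge-direction convention of A is an assumption for illustration):

```python
import numpy as np

# Illustrative adjacency matrix A and node biases p (arbitrary example values).
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [0, 1, 0]])
p = np.array([0.5, 0.3, 0.5])          # bias of each node's truth table
q = 2 * p * (1 - p)                    # expected sensitivity of each node

Q = q[:, None] * A                     # Q_ij = q_i * A_ij
lam = max(abs(np.linalg.eigvals(Q)))   # largest eigenvalue (in modulus)
print("largest eigenvalue:", lam)
print("stable" if lam < 1 else "critical" if lam == 1 else "unstable")
```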
One theme is to study different underlying graph topologies.
Classical Boolean networks (sometimes called CRBNs, i.e. Classical Random Boolean Networks) are updated synchronously. Motivated by the fact that genes do not usually change their state simultaneously, [24] different alternatives have been introduced. A common classification [25] is the following (a small code sketch contrasting two of these schemes follows the list):

- Classical Random Boolean Networks (CRBNs): all nodes are updated synchronously at each time step.
- Asynchronous Random Boolean Networks (ARBNs): at each time step a single node, picked at random, is updated.
- Deterministic Asynchronous Random Boolean Networks (DARBNs): each node has its own period and phase, so nodes are updated asynchronously but deterministically.
- Generalized Asynchronous Random Boolean Networks (GARBNs): at each time step a randomly chosen subset of nodes is updated synchronously.
- Deterministic Generalized Asynchronous Random Boolean Networks (DGARBNs): as GARBNs, but the subset updated at each step is determined by the nodes' periods and phases.
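For illustration, a fully asynchronous scheme (one randomly chosen node updated per step) can be contrasted with the synchronous update from the earlier sketches; this reuses step, inputs, tables, and state defined above and is only a sketch of one of the schemes listed, not a reference implementation.

```python
def asynchronous_step(state, inputs, tables, rng):
    """Update a single randomly chosen node, leaving all other nodes unchanged
    (a sketch of the fully asynchronous scheme; other schemes differ in which
    nodes are chosen and whether the choice is deterministic)."""
    node = rng.randrange(len(state))
    index = 0
    for source in inputs[node]:
        index = (index << 1) | state[source]
    new_state = list(state)
    new_state[node] = tables[node][index]
    return tuple(new_state)

rng = random.Random(3)
s_sync = s_async = state
for _ in range(10):
    s_sync = step(s_sync, inputs, tables)                      # all nodes at once
    s_async = asynchronous_step(s_async, inputs, tables, rng)  # one node at a time
print(s_sync, s_async)
```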