Attack tolerance

In the context of complex networks, attack tolerance is the network's robustness, meaning its ability to maintain overall connectivity and diameter as nodes are removed. Several graph metrics have been proposed to predict network robustness; among them, algebraic connectivity has been found to be the best indicator of robustness. [1]

Attack types

If an attack were to be mounted on a network, it would target not random nodes but those most significant to the network. Different ranking methods are used to determine a node's priority within the network.

Average node degree

This form of attack prioritizes the most connected nodes as the most important ones. It takes into account the network (represented by a graph $G$) changing over time, by analyzing the network as a series of snapshots (indexed by $t$); we denote the snapshot at time $t$ by $G_t$. The average of the degree $d_t(i)$ of a node $i$ within a given snapshot $G_t$, throughout a time interval (a sequence of snapshots) $[t_1, t_n]$, is given by:

$$\bar{d}_{[t_1,t_n]}(i) = \frac{1}{n} \sum_{t=t_1}^{t_n} d_t(i)$$

where $d_t(i)$ is taken to be 0 if node $i$ is absent from snapshot $G_t$.
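
As a rough illustration (a minimal sketch, not code from the cited work), assuming each snapshot is given as a networkx graph, the ranking could be computed as:

```python
import networkx as nx

def average_node_degree(snapshots):
    """Mean degree of each node over a list of snapshot graphs
    (a node absent from a snapshot contributes degree 0 there)."""
    n = len(snapshots)
    totals = {}
    for G in snapshots:
        for node in G.nodes:
            totals[node] = totals.get(node, 0) + G.degree(node)
    return {node: total / n for node, total in totals.items()}
```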

Node persistence

This form of attack prioritizes nodes that occur most frequently over a period of time. The equation below calculates the frequency with which a node $i$ occurs in a time interval $[t_1, t_n]$; the indicator $\delta_t(i)$ equals 1 when the node is present during the snapshot $G_t$ and 0 when it is not:

$$p_{[t_1,t_n]}(i) = \frac{1}{n} \sum_{t=t_1}^{t_n} \delta_t(i)$$

where

$$\delta_t(i) = \begin{cases} 1 & \text{if } i \in G_t \\ 0 & \text{otherwise} \end{cases}$$
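
Under the same assumed snapshot representation, a minimal sketch of this ranking:

```python
def node_persistence(snapshots):
    """Fraction of snapshots in which each node appears
    (the average of delta_t(i) over the interval)."""
    n = len(snapshots)
    counts = {}
    for G in snapshots:
        for node in G.nodes:
            counts[node] = counts.get(node, 0) + 1
    return {node: count / n for node, count in counts.items()}
```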

Temporal closeness

This form of attack prioritizes nodes by the summation of temporal distances from one node to all other nodes over a period of time. The equation below scores a node $i$ by averaging the sum of all its temporal distances over the interval $[t_1, t_n]$: [2]

$$c_{[t_1,t_n]}(i) = \frac{1}{n} \sum_{j \neq i} \tau_{[t_1,t_n]}(i, j)$$

where $\tau_{[t_1,t_n]}(i, j)$ is the temporal distance from node $i$ to node $j$ over the interval.
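
A faithful sketch is harder here, since temporal distances are defined over time-respecting paths across snapshots; the simplified stand-in below uses per-snapshot shortest-path lengths instead, which only approximates the measure in [2]:

```python
def temporal_closeness(snapshots, i):
    """Simplified score for node i: sum of shortest-path distances
    from i to all other nodes within each snapshot, averaged over
    the interval. True temporal distances follow time-respecting
    paths across snapshots, which this approximation ignores."""
    n = len(snapshots)
    total = 0.0
    for G in snapshots:
        if i not in G:
            continue
        lengths = nx.single_source_shortest_path_length(G, i)
        total += sum(d for j, d in lengths.items() if j != i)
    return total / n
```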

Network model tolerances

Not all networks are the same, so it is no surprise that attacks on different networks have different results. The common method for measuring change in a network under attack uses two quantities: the average size ⟨s⟩ of all the isolated clusters (those other than the largest one) and the fraction S of the nodes contained in the largest cluster. [3] When no nodes have been attacked, both S and ⟨s⟩ equal 1.
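
Both quantities can be read off a damaged graph directly; a minimal sketch (here n0 denotes the original node count, and the default of 1 for an intact network follows the convention above):

```python
def fragmentation(G, n0):
    """Return (S, s_avg): the fraction of the n0 original nodes in the
    largest cluster, and the mean size of the remaining isolated
    clusters (defined as 1 when no isolated clusters exist)."""
    sizes = sorted((len(c) for c in nx.connected_components(G)), reverse=True)
    S = sizes[0] / n0 if sizes else 0.0
    isolated = sizes[1:]
    s_avg = sum(isolated) / len(isolated) if isolated else 1.0
    return S, s_avg
```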

Erdős–Rényi model

In the ER model, the network generated is homogeneous, meaning each node has approximately the same number of links; such a network is referred to as an exponential network. Comparing the connectivity of the ER model under random failures versus directed attacks shows that an exponential network reacts the same way to a random failure as it does to a directed attack. This is due to the homogeneity of the network: because all nodes have roughly the same degree, it does not matter whether a node is selected at random or specifically targeted, and attacking one node should cause no more damage than attacking another. As the number of attacks goes up and more nodes are removed, S decreases non-linearly and behaves as if a threshold exists at a fraction of removed nodes f ≈ 0.28, at which point S goes to zero. The average size of the isolated clusters behaves in the opposite way, increasing to ⟨s⟩ ≈ 2 as f approaches the same threshold, then decreasing back to 1 beyond it. This model was tested over a large range of network sizes and shown to maintain the same pattern. [3]
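
The following sketch reproduces the flavour of this experiment (parameters are illustrative, not those of the cited study, and it reuses fragmentation from above; note that ranking degrees once before removal is a simplification, as careful studies recompute degrees after each removal):

```python
import random

def remove_fraction(G, f, targeted=False):
    """Remove a fraction f of nodes: uniformly at random, or in
    order of decreasing degree (degrees ranked once up front)."""
    G = G.copy()
    k = int(f * G.number_of_nodes())
    if targeted:
        victims = [v for v, d in sorted(G.degree, key=lambda x: -x[1])[:k]]
    else:
        victims = random.sample(list(G.nodes), k)
    G.remove_nodes_from(victims)
    return G

n = 1000
er = nx.erdos_renyi_graph(n, 4 / n)  # homogeneous graph, mean degree ~4
for targeted in (False, True):
    S, s_avg = fragmentation(remove_fraction(er, 0.25, targeted), n)
    print(targeted, S, s_avg)        # S is similar in both cases
```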

Scale-free model

In the scale-free model, the network is defined by a degree distribution that follows a power law, [4] which means that, unlike in the exponential network, nodes have no characteristic number of links. This makes the scale-free model more vulnerable to deliberate attack, because some nodes are far more important than others, and if those nodes are targeted the network breaks down. However, this inhomogeneous network has its strengths when it comes to random failures. Because of the power law, the system contains many more nodes with very few links than highly connected ones, so a randomly chosen node is far more likely to be one of the small ones. Severing these low-degree nodes does not affect the network as a whole, and the structure of the network stays approximately the same. When the scale-free model undergoes random failures, S decreases slowly with no threshold-like behavior and ⟨s⟩ remains approximately 1, indicating that the network is being broken apart node by node rather than into large clusters. However, when the scale-free model undergoes deliberate attack, the system behaves similarly to an exponential one, except that it breaks down much faster: as the number of attacks increases, S decreases with a threshold close to f ≈ 0.05, and ⟨s⟩ increases up to the same threshold before decreasing back to 1. The speed at which this type of network breaks down shows the vulnerability of commonly used networks such as the Internet. [5]
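
Running the same experiment on a scale-free graph (reusing the helpers sketched above; parameters again illustrative) shows the asymmetry:

```python
ba = nx.barabasi_albert_graph(n, 2)  # scale-free graph, mean degree ~4
for targeted in (False, True):
    S, s_avg = fragmentation(remove_fraction(ba, 0.05, targeted), n)
    print(targeted, S, s_avg)  # random failure: S stays near 1;
                               # hub attack: S drops sharply
```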

References

  1. Alenazi, Mohammed; Sterbenz, James (2015). "Comprehensive comparison and accuracy of graph metrics in predicting network resilience". 2015 11th International Conference on the Design of Reliable Communication Networks (DRCN). pp. 157–164. doi:10.1109/DRCN.2015.7149007. ISBN 978-1-4799-7795-6. S2CID 14060719.
  2. Sur, Souvik; Ganguly, Niloy; Mukherjee, Animesh (2015). "Attack tolerance of correlated time-varying social networks with well-defined communities". Physica A: Statistical Mechanics and Its Applications. 420: 98–107. Bibcode:2015PhyA..420...98S. doi:10.1016/j.physa.2014.08.074.
  3. Albert, Réka; Jeong, Hawoong; Barabási, Albert-László (2000). "The Internet's Achilles' Heel: Error and attack tolerance of complex networks". Nature. 406 (6794): 378–382. arXiv:cond-mat/0008064. doi:10.1038/35019019. PMID 10935628. S2CID 1545338.
  4. Barabási, Albert-László (2014). Network Science.
  5. Sorokin, Alexey; Murphey, Robert; Thai, My; Pardalos, Panos (2012). Dynamics of Information Systems: Mathematical Foundations. Springer New York. ISBN 978-1-4614-3905-9.