Network theory in risk assessment

A network is an abstract structure capturing only the basics of connection patterns and little else. Because it is a generalized pattern, tools developed for analyzing, modeling and understanding networks can theoretically be implemented across disciplines. As long as a system can be represented by a network, there is an extensive set of well-developed mathematical, computational, and statistical tools that, if understood, can be applied to the analysis of the system of interest.

Figure 1: A bow-tie diagram of components in a directed network

Tools currently employed in risk assessment are often sufficient, but model complexity and limits on computational power can prevent risk assessors from including more causal connections and from accounting for more Black Swan event outcomes. By applying network theory tools to risk assessment, these computational limitations may be overcome, resulting in broader coverage of events with a narrowed range of uncertainties. [1]

Decision-making processes are not incorporated into routine risk assessments; however, they play a critical role in such processes. [2] It is therefore very important for risk assessors to minimize confirmation bias by carrying out their analysis and publishing their results with minimal involvement of external factors such as politics, media, and advocates. In reality, however, it is nearly impossible to break the iron triangle among politicians, scientists (in this case, risk assessors), and advocates and media. [3] Risk assessors need to be sensitive to the difference between risk studies and risk perceptions. [4] [5] One way to bring the two closer is to provide decision-makers with data they can easily rely on and understand. Employing networks in the risk analysis process can make causal relationships visible and identify the heavily weighted or important contributors to the probability of the critical event. [6]

Bow-tie diagrams, cause-and-effect diagrams, Bayesian networks (directed acyclic networks) and fault trees are a few examples of how network theory can be applied in risk assessment. [7]
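
As a toy illustration of the fault-tree side of this toolbox, the sketch below evaluates a top-event probability from AND and OR gates. It assumes independent basic events, and the event names and probabilities are hypothetical, not drawn from any cited assessment.

```python
def and_gate(probs):
    """P(all inputs occur), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """P(at least one input occurs), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical basic events feeding two AND gates under a top-level OR gate:
p_top = or_gate([
    and_gate([0.01, 0.02]),   # e.g. pump failure AND valve failure
    and_gate([0.005, 0.1]),   # e.g. sensor fault AND operator error
])
print(round(p_top, 6))
```

The same two helper functions compose into arbitrarily deep gate trees, which is the essence of fault-tree quantification.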

In epidemiology risk assessments (Figures 7 and 9), once a network model is constructed, we can visually identify, then quantify and evaluate, the potential exposure or infection risk of people connected to the well-connected patients (Patients 1, 6, 35, 130 and 127 in Figure 7) or to high-traffic places (Hotel M in Figure 9). In ecological risk assessments (Figure 8), a network model lets us identify the keystone species and determine how widely the impacts of the potential hazards under investigation will extend.

Risk assessment key components

Figure 2: Risk analysis, evaluation, assessment, and management

Risk assessment is a method for dealing with uncertainty. For it to be beneficial to the overall risk management and decision-making process, it must be able to capture extreme and catastrophic events. Risk assessment involves two parts: risk analysis and risk evaluation, although the term “risk assessment” is often used interchangeably with “risk analysis”. In general, risk assessment can be divided into these steps: [8]

  1. Plan and prepare the risk analysis.
  2. Define and delimit the system and the scope of the analysis.
  3. Identify hazards and potential hazardous events.
  4. Determine causes and frequency of each hazardous event.
  5. Identify accident scenarios (i.e. event sequences) that may be initiated by each hazardous event.
  6. Select relevant and typical accident scenarios.
    Figure 3: Bow-tie diagram of risk management
  7. Determine the consequences of each accident scenario.
  8. Determine the frequency of each accident scenario.
  9. Assess the uncertainty.
  10. Establish and describe the risk picture.
  11. Report the analysis.
  12. Evaluate the risk against risk acceptance criteria.
  13. Suggest and evaluate potential risk-reducing measures.

Naturally, the number of steps required varies with each assessment, depending on the scope of the analysis and the complexity of the study object. [9] Because there are always varying degrees of uncertainty involved in any risk analysis process, sensitivity and uncertainty analyses are usually carried out to mitigate that uncertainty and thereby improve the overall risk assessment result.
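
Steps 7 and 8 of the list above are often combined into a simple risk number per scenario (frequency × consequence), which supports the ranking in steps 12 and 13. A minimal sketch, with hypothetical scenario names and figures:

```python
# Each scenario carries an estimated frequency (events/year) and a
# consequence severity score; both values here are illustrative only.
scenarios = {
    "gas leak, ignited":   {"frequency": 1e-4, "consequence": 100.0},
    "gas leak, dispersed": {"frequency": 2e-3, "consequence": 2.0},
    "tank overfill":       {"frequency": 1e-3, "consequence": 30.0},
}

# Risk per scenario = frequency x consequence; rank from highest to lowest.
risk = {name: s["frequency"] * s["consequence"] for name, s in scenarios.items()}
ranked = sorted(risk, key=risk.get, reverse=True)
print(ranked[0])
```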

Network theory key components

A network is a simplified representation that reduces a system to an abstract structure. Simply put, it is a collection of points linked together by lines. Each point is known as a “vertex” (plural: “vertices”) or “node”, and each line as an “edge” or “link”. [10] Network modeling and analysis have already been applied in many areas, including computer, physical, biological, ecological, logistical and social science. Through the study of these models, we gain insight into the nature of individual components (i.e. vertices), the connections or interactions between those components (i.e. edges), and the pattern of connections (i.e. the network).

Undoubtedly, modifications of the structure (or pattern) of any given network can have a big effect on the behavior of the system it depicts. For example, connections in a social network affect how people communicate, exchange news, travel, and, less obviously, spread diseases. In order to gain better understanding of how each of these systems functions, some knowledge of the structure of the network is necessary.

Basic terminology

Small-World Effect

The small-world effect is one of the most remarkable network phenomena. It describes the finding that in many (perhaps most) networks the mean path distances between vertices are surprisingly small. [11] It has many implications in various areas of network study. For instance, in a social network, one can consider how fast a rumor (or a contagious disease) spreads through a community. From a mathematical point of view, since path lengths in networks typically scale as log n (where n is the number of network vertices), the mean path length remains small even in large, complex networks.
Another idea that accompanies the small-world effect is called funneling. [12] It was derived from a social network experiment conducted by the experimental psychologist Stanley Milgram in the 1960s. In that experiment he concluded, alongside the small-world effect itself, that in any given social network there were always a few individuals who were especially well connected. These few individuals were therefore responsible for the connections between most members and the rest of the world.
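
The mean path distance that underlies the small-world effect can be computed directly by breadth-first search from every vertex. A toy sketch follows; the edge list is illustrative and far too small to exhibit log n scaling, but the procedure is the same at any size.

```python
from collections import deque

def mean_path_length(adj):
    """Mean shortest-path distance over all connected vertex pairs,
    via breadth-first search from each vertex."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t, d in dist.items():
            if t != s:
                total += d
                pairs += 1
    return total / pairs

# A toy network in which vertex 0 acts as a well-connected "funnel".
adj = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1], 3: [0], 4: [0]}
print(mean_path_length(adj))  # → 1.5
```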

Degree, Hubs, and Paths

Figure 4: A small network with both multiedges and self-edges
The degree of a vertex is the number of edges connected to it. For example, in Figure 4, vertex 3 has a degree of five. Hubs are vertices in a network with a relatively high degree; vertex 3 is again a good example. In a social network, hubs can represent individuals with many acquaintances. In risk assessment, they can represent a hazardous event with multiple triggers (the causal side of a bow-tie diagram). A path in a network is a route from one vertex to another across the network. In the same figure, one path from vertex 1 to vertex 6 is 1→5→3→6.
Figure 5: A disconnected directed network with two components (shaded)
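
The notions of degree, hub, and path can be made concrete on a small adjacency list. The edge set below is illustrative, not a transcription of Figure 4:

```python
# Undirected network as an adjacency list (assumed edge set).
adj = {
    1: [2, 5],
    2: [1, 3],
    3: [2, 4, 5, 6],
    4: [3],
    5: [1, 3],
    6: [3],
}

# Degree = number of edges at each vertex; the hub is the highest-degree vertex.
degree = {v: len(neighbors) for v, neighbors in adj.items()}
hub = max(degree, key=degree.get)

# A candidate path is valid if each consecutive pair is joined by an edge.
path = [1, 5, 3, 6]
is_path = all(b in adj[a] for a, b in zip(path, path[1:]))

print(hub, degree[hub], is_path)  # → 3 4 True
```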

Centrality

Centrality is a measure of how important (or central) certain vertices are in a network. One way to measure it is to count the number of edges connected to a vertex (i.e. its degree); the vertices with the highest degree therefore have high degree centrality.
Degree centrality can have many implications. In a social network, a person with high degree centrality may have more influence over others, more access to information, or more opportunities than those with fewer connections. In a citation network, a paper with high degree centrality may be more influential and thus have a greater impact on its respective area of research. [13]
Figure 6: A connected directed network with two components (shaded)
Eigenvector centrality is an extension of the concept of degree centrality, based on the fact that in many networks not all vertices have the same weight or importance. A vertex's importance in its network increases if it has more connections to important vertices. Eigenvector centrality, therefore, can be viewed as a centrality scoring system that accounts not just for a vertex but for its neighboring vertices as well.
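
Eigenvector centrality can be approximated by power iteration: repeatedly replace each vertex's score with the sum of its neighbors' scores, then renormalize. A minimal sketch on an assumed four-vertex network:

```python
def eigenvector_centrality(adj, iterations=100):
    """Power-iteration approximation of eigenvector centrality on an
    undirected network; scores are normalized to sum to 1."""
    x = {v: 1.0 for v in adj}
    for _ in range(iterations):
        # Each vertex inherits the combined score of its neighbors.
        nxt = {v: sum(x[u] for u in adj[v]) for v in adj}
        norm = sum(nxt.values())
        x = {v: s / norm for v, s in nxt.items()}
    return x

# Assumed network: "a" touches everyone, "d" only touches "a".
adj = {"a": ["b", "c", "d"], "b": ["a", "c"], "c": ["a", "b"], "d": ["a"]}
scores = eigenvector_centrality(adj)
print(max(scores, key=scores.get))  # → a
```

Note that plain power iteration converges on connected, non-bipartite networks such as this one; real networks may need the damped variants used in practice.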

Components

Components are subgroups, or subsets of vertices, in a disconnected network. A disconnected network is one in which at least one pair of vertices has no path connecting them at all. Its opposite is known as a connected network, in which every pair of vertices is connected by at least one path. One can therefore say a connected network has only one component.
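
Components can be extracted with a breadth-first search that restarts from every unvisited vertex. A sketch on an assumed five-vertex network with two components, like the shaded groups in Figure 5:

```python
from collections import deque

def components(adj):
    """Return the connected components of an undirected network
    as a list of vertex sets."""
    seen, comps = set(), []
    for start in adj:
        if start in seen:
            continue
        comp, q = {start}, deque([start])
        seen.add(start)
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    comp.add(v)
                    q.append(v)
        comps.append(comp)
    return comps

# Assumed network: vertices 1-3 form one component, 4-5 another.
adj = {1: [2], 2: [1, 3], 3: [2], 4: [5], 5: [4]}
print(len(components(adj)))  # → 2
```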

Directed Networks

Figure 7. An example of an acyclic directed network in epidemiology, by CDC.
Directed networks are networks in which each edge has a direction from one vertex to another; the edges are therefore known as directed edges. An example of such a network is a link in the reference section of this page, which leads you to another page but not the other way around. In a food web, a prey species eaten by a predator is another example.
Directed networks can be cyclic or acyclic. A cyclic directed network is one with a closed loop of edges; an acyclic directed network contains no such loop. Since a self-edge – an edge connecting a vertex to itself – is considered a cycle, it is absent from any acyclic network.
A Bayesian network is an example of an acyclic directed network.
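
Acyclicity can be checked mechanically with Kahn's algorithm: repeatedly remove vertices with no incoming edges; if any vertex remains unremoved, the network contains a directed cycle. A minimal sketch (vertex and edge lists are illustrative):

```python
from collections import deque

def is_acyclic(vertices, edges):
    """True iff the directed network has no directed cycle (Kahn's algorithm)."""
    indeg = {v: 0 for v in vertices}
    out = {v: [] for v in vertices}
    for u, v in edges:
        out[u].append(v)
        indeg[v] += 1
    q = deque(v for v in vertices if indeg[v] == 0)
    removed = 0
    while q:
        u = q.popleft()
        removed += 1
        for v in out[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    # Vertices left with positive in-degree must sit on a cycle.
    return removed == len(vertices)

print(is_acyclic([1, 2, 3], [(1, 2), (2, 3), (1, 3)]))  # → True (a DAG)
print(is_acyclic([1, 2, 3], [(1, 2), (2, 3), (3, 1)]))  # → False (a cycle)
```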

Weighted Network

In reality, not all edges share the same importance or weight (connections in a social network and keystone species in a food web, for example). A weighted network adds such an element to its connections. It is widely used in genomic and systems biology applications.
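
In a weighted network, the weighted analogue of degree is a vertex's "strength": the sum of the weights on its edges. A sketch with illustrative weights:

```python
# Weighted undirected network: neighbor -> edge weight (values assumed).
weighted_adj = {
    "a": {"b": 3.0, "c": 0.5},
    "b": {"a": 3.0, "c": 1.0},
    "c": {"a": 0.5, "b": 1.0},
}

# Strength = sum of incident edge weights, the weighted counterpart of degree.
strength = {v: sum(w.values()) for v, w in weighted_adj.items()}
print(max(strength, key=strength.get))  # → b
```

Here "b" has fewer heavy-weight options than "a" has edges, yet the largest total strength, which is exactly the distinction weighting adds over plain degree.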

Trees

Trees are undirected networks with no closed loops. A tree can be part of a network yet isolated as a separate component. If all components of a network are trees, the network is called a forest. An administrative body can sometimes be viewed as a forest.
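
Whether an undirected network is a forest can be checked with a union-find cycle test: if an edge ever joins two vertices that are already connected, the network contains a closed loop. A minimal sketch with assumed edge lists:

```python
def is_forest(n_vertices, edges):
    """True iff the undirected network (vertices 0..n-1) has no closed loop."""
    parent = list(range(n_vertices))

    def find(x):
        # Walk to the root of x's set, compressing the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False  # u and v already connected: this edge closes a loop
        parent[ru] = rv
    return True

print(is_forest(5, [(0, 1), (1, 2), (3, 4)]))   # → True (two trees)
print(is_forest(3, [(0, 1), (1, 2), (2, 0)]))   # → False (a triangle)
```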

Other Examples of Network Theory Application

Social network

Early social network studies can be traced back to the end of the nineteenth century. However, the well-documented studies and the foundation of this field are usually attributed to the psychiatrist Jacob Moreno. He published a book entitled Who Shall Survive? in 1934, which laid out the foundation for sociometry (later known as social network analysis).

Another famous contributor to the early development of social network analysis is the experimental psychologist Stanley Milgram. His "small-world" experiments gave rise to concepts such as six degrees of separation and well-connected acquaintances (also known as "sociometric superstars"). This experiment was repeated by Dodds et al. by means of email messages, and the basic results were similar to Milgram's. The estimated true average path length (that is, the number of edges an email message had to pass along from one unique individual to the intended targets in different countries) for the experiment was around five to seven, which does not deviate much from the original six degrees of separation. [14]

Food web

Figure 8. East River Valley Trophic Web

A food web, or food chain, is an example of a directed network describing the prey–predator relationships in a given ecosystem. Vertices in this type of network represent species, and the edges the prey–predator relationships. A collection of species may be represented by a single vertex if all members of that collection prey upon, and are preyed on by, the same organisms. A food web is often acyclic, with a few exceptions such as adults preying on juveniles and parasitism. [15]

Note: in the food web main article, a food web is depicted as cyclic; that depiction is based on the flow of carbon and energy sources in a given ecosystem. The food web described here is based solely on prey–predator roles; organisms active in the carbon and nitrogen cycles (such as decomposers and fixers) are not considered in this description.

Epidemiology

Figure 9. Chain of transmission among guests at Hotel M -- Hong Kong, 2003

Epidemiology is closely related to social networks. Contagious diseases can spread through connection networks such as workplaces, transportation, intimate bodily contact and water systems (see Figures 7 and 9). Though they exist only virtually, computer viruses spreading across internet networks are not much different from their physical counterparts. Understanding each of these network patterns can therefore aid us in predicting the outcomes of epidemics more precisely and in preparing better disease prevention protocols.

The simplest model of infection is the SI (susceptible–infected) model. Most diseases, however, do not behave in such a simple manner, so many modifications to this model have been made, such as the SIR (susceptible–infected–recovered), the SIS (the second S denotes reinfection) and SIRS models. The idea of latency is taken into account in models such as SEIR (where E stands for exposed). The SIR model is also known as the Reed–Frost model. [16]
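
A discrete-time, well-mixed SIR model can be sketched in a few lines. The transmission rate beta and recovery rate gamma below are illustrative, not taken from any cited study:

```python
def sir(s, i, r, beta, gamma, steps):
    """Discrete-time, well-mixed SIR dynamics; returns the (s, i, r) trajectory."""
    history = [(s, i, r)]
    n = s + i + r
    for _ in range(steps):
        new_inf = beta * s * i / n   # susceptibles infected this step
        new_rec = gamma * i          # infected who recover this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        history.append((s, i, r))
    return history

# Illustrative parameters: beta > gamma, so the outbreak grows before burning out.
hist = sir(s=999.0, i=1.0, r=0.0, beta=0.3, gamma=0.1, steps=200)
final_s, final_i, final_r = hist[-1]
print(round(final_r))
```

Swapping in an extra compartment (E for SEIR, or returning R to S for SIRS) is a mechanical change to the same update loop.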

To factor these into an outbreak network model, one must consider the degree distributions of vertices in the giant component of the network (outbreaks in small components are isolated and die out quickly, which prevents them from becoming epidemics). Theoretically, a weighted network can provide more accurate information on the exposure probability of vertices, but more evidence is needed. Pastor-Satorras et al. pioneered much work in this area, beginning with the simplest form (the SI model) applied to networks drawn from the configuration model. [17]
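
The role of the giant component can be seen even in a deterministic SI toy model: an outbreak seeded in a small component stays confined to it. The network below is assumed purely for illustration:

```python
def si_spread(adj, seed, steps):
    """Deterministic SI spread: every infected vertex infects all of its
    neighbors at each step. Returns the final infected set."""
    infected = {seed}
    for _ in range(steps):
        infected |= {v for u in infected for v in adj[u]}
    return infected

# Assumed network: vertices 1-4 form the larger component, 5-6 a small one.
adj = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3], 5: [6], 6: [5]}

print(len(si_spread(adj, seed=5, steps=10)))  # → 2 (confined to the small component)
print(len(si_spread(adj, seed=1, steps=10)))  # → 4 (fills the larger component)
```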

The biology of how an infection causes disease in an individual is complicated and is another type of disease pattern specialists are interested in (a process known as pathogenesis, which involves the immunology of the host and the virulence factors of the pathogen).

Notes

  1. Newman, Mark E. J. Networks: an Introduction. Oxford: Oxford UP, 2010. p.2
  2. National Research Council (NRC). Red Book Paradigm. Risk Assessment in the Federal Government: Understanding the Process. Washington D.C.: National Academy Press, 1983.
  3. Pielke Jr., Roger A. Policy, Politics and Perspective. Nature 416 (2002): 367-68.
  4. Slovic, Paul. Perception of Risk. Science 236 (1987): 280-85.
  5. National Research Council (NRC). Orange Book Paradigm. Understanding Risk: Informing Decisions in a Democratic Society. Washington D.C.: National Academy Press, 1996.
  6. Rausand, Marvin. Risk Assessment: Theory, Methods, and Applications. Hoboken, NJ: John Wiley & Sons, 2011. p.295.
  7. Rausand, Marvin. Risk Assessment: Theory, Methods, and Applications. Hoboken, NJ: John Wiley & Sons, 2011. p.266-302.
  8. Rausand, Marvin. "Chapter 5 Risk Management." Risk Assessment: Theory, Methods, and Applications. Hoboken, NJ: John Wiley & Sons, 2011. p.117-36.
  9. Rausand, Marvin. Risk Assessment: Theory, Methods, and Applications. Hoboken, NJ: John Wiley & Sons, 2011. p.124.
  10. Newman, Mark E. J. Networks: an Introduction. Oxford: Oxford UP, 2010. p.1
  11. Newman, Mark E. J. Networks: an Introduction. Oxford: Oxford UP, 2010. p.241
  12. Newman, Mark E. J. Networks: an Introduction. Oxford: Oxford UP, 2010. p.243
  13. Newman, Mark E. J. Networks: an Introduction. Oxford: Oxford UP, 2010. p.168
  14. Newman, Mark E. J. Networks: an Introduction. Oxford: Oxford UP, 2010. p.54-58
  15. Newman, Mark E. J. “Chapter 5.3 Ecological Networks”. Networks: an Introduction. Oxford: Oxford UP, 2010. p.99-104
  16. http://www.stat.columbia.edu/~regina/research/risk.pdf
  17. Newman, Mark E. J. Networks: an Introduction. Oxford: Oxford UP, 2010. p.657-664
