Metcalfe's law states that the financial value or influence of a telecommunications network is proportional to the square of the number of connected users of the system (n²). The law is named after Robert Metcalfe and was first proposed in 1980, albeit not in terms of users, but rather of "compatible communicating devices" (e.g., fax machines, telephones). [1] It later became associated with users of Ethernet after a September 1993 Forbes article by George Gilder. [2]
Metcalfe's law characterizes many of the network effects of communication technologies and networks such as the Internet, social networking, and the World Wide Web. Former Chairman of the U.S. Federal Communications Commission Reed Hundt said that this law provides the best understanding of the workings of the present-day Internet. [3] Mathematically, Metcalfe's law shows that the number of unique possible connections in an n-node network can be expressed as the triangular number n(n − 1)/2, which is asymptotically proportional to n².
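This count is easy to verify directly. A minimal Python sketch (the function name is illustrative):

```python
# The counting behind Metcalfe's law: in a network of n nodes, the number
# of unique pairwise connections is the triangular number n*(n-1)/2,
# which grows asymptotically like n**2.

def possible_connections(n: int) -> int:
    """Number of unique links among n nodes (triangular number)."""
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(n, possible_connections(n))
# 2 -> 1, 10 -> 45, 100 -> 4950, 1000 -> 499500: roughly n**2 / 2 for large n
```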
The law has often been illustrated using the example of fax machines: a single fax machine on its own is useless, but the value of every fax machine increases with the total number of fax machines in the network, because the total number of people with whom each user may send and receive documents increases. [4] This is a common illustration of the network effect. Thus, in any social network, the greater the number of users of the service, the more valuable the service becomes to the community.
Metcalfe's law was conceived in 1983 in a presentation to the 3Com sales force. [5] It stated that the value V of a network would be proportional to the total number of possible connections, or approximately n².
The original incarnation was careful to delineate between a linear cost (C·n), non-linear growth (n²), and a non-constant proportionality factor called affinity (A). The break-even point at which costs are recouped is given by:

C·n = A · n(n − 1)/2

At some size, the right-hand side of the equation, the value V, exceeds the cost, and A describes the relationship between size and net value added. For large n, net network value is then approximately:

A · n²/2

Metcalfe properly dimensioned A as "value per user". Affinity is also a function of network size, and Metcalfe correctly asserted that A must decline as n grows large. In a 2006 interview, Metcalfe stated: [6]
There may be diseconomies of network scale that eventually drive values down with increasing size. So, if V = A·n², it could be that A (for “affinity,” value per connection) is also a function of n and heads down after some network size, overwhelming n².
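To make the break-even arithmetic concrete, here is a toy Python sketch of the original formulation; the values of C and A are made up for illustration, not taken from Metcalfe:

```python
# Toy illustration of the original break-even formulation: cost grows
# linearly as C*n while value grows as V = A*n*(n-1)/2.
# C and A below are invented values, chosen only for illustration.

C = 100.0   # assumed cost per user
A = 1.0     # assumed affinity (value per connection)

def cost(n: int) -> float:
    return C * n

def value(n: int) -> float:
    return A * n * (n - 1) / 2

# Setting A*n*(n-1)/2 = C*n gives the break-even size n = 2*C/A + 1.
n_breakeven = 2 * C / A + 1
print(n_breakeven)                # 201.0
print(value(201) - cost(201))     # ~0 at break-even
print(value(1000) - cost(1000))   # net value is positive for large n
```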
Network size, and hence value, does not grow unbounded but is constrained by practical limitations such as infrastructure, access to technology, and bounded rationality such as Dunbar's number. User growth n almost always reaches a saturation point: substitutes, competitors, and technical obsolescence constrain the growth of n. Growth of n is typically assumed to follow a sigmoid function such as a logistic curve or Gompertz curve.
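As an illustration of the saturation assumption, the following Python sketch evaluates a logistic adoption curve; K, r, and t0 are arbitrary illustrative parameters, not drawn from any particular network:

```python
# User growth n(t) following a logistic curve toward a carrying
# capacity K: slow at first, fastest near the midpoint t0, then
# flattening as the network saturates.
import math

K = 1_000_000   # assumed saturation level for n
r = 0.5         # assumed growth rate
t0 = 20.0       # assumed midpoint of adoption

def logistic_users(t: float) -> float:
    return K / (1 + math.exp(-r * (t - t0)))

for t in (0, 10, 20, 30, 40):
    print(t, round(logistic_users(t)))
```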
A is also governed by the connectivity or density of the network topology. In an undirected network with m edges, every edge connects two nodes, so there are 2m ends of edges; the mean degree of the network is therefore c = 2m/n.
The maximum possible number of edges in a simple network (i.e. one with no multi-edges or self-edges) is n(n − 1)/2. Therefore the density ρ of a network, the fraction of those possible edges that are actually present, is:

ρ = m / (n(n − 1)/2) = 2m / (n(n − 1)) = c / (n − 1)

which for large networks is approximated by ρ ≈ c/n. [7]
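These formulas can be checked on a concrete graph. A short Python sketch using the networkx library (the random graph and its parameters are arbitrary choices for illustration):

```python
# For a simple undirected graph with n nodes and m edges, mean degree is
# c = 2m/n and density is rho = c/(n-1).
import networkx as nx

G = nx.erdos_renyi_graph(n=1000, p=0.01, seed=42)
n, m = G.number_of_nodes(), G.number_of_edges()

c = 2 * m / n                   # mean degree
rho = m / (n * (n - 1) / 2)     # fraction of possible edges present

print(c, rho, c / (n - 1))      # rho equals c/(n-1) exactly
print(nx.density(G))            # networkx agrees with the formula
```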
Metcalfe's law assumes that the value of each node is of equal benefit. [3] If this is not the case, for example because one fax machine serves 60 workers in a company, the second serves half of that, the third one third, and so on, then the relative value of an additional connection decreases. Likewise, in social networks, if users that join later use the network less than early adopters, then the benefit of each additional user may lessen, making the overall network less efficient if costs per user are fixed.
Within the context of social networks, many, including Metcalfe himself, have proposed modified models in which the value of the network grows as n log n rather than n². [8] [3] David P. Reed and Andrew Odlyzko have also examined how the value of a network scales with its size and proposed alternative laws. Tongia and Wilson examine the related question of the costs to those excluded. [9]
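To see how quickly the two proposals diverge, a brief Python comparison (the normalization at n = 100 is an arbitrary choice for illustration):

```python
# Metcalfe's n**2 versus the modified n*log(n) model, with constants
# normalized so both curves agree at n = 100.
import math

def metcalfe(n: int) -> float:
    return n ** 2

def nlogn(n: int) -> float:
    # scaled so the two models coincide at n = 100
    k = metcalfe(100) / (100 * math.log(100))
    return k * n * math.log(n)

for n in (100, 1_000, 10_000, 100_000):
    print(n, metcalfe(n), round(nlogn(n)))
# the gap between the two models widens rapidly as n grows
```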
For more than 30 years, there was little concrete evidence in support of the law. Finally, in July 2013, Dutch researchers analyzed European Internet-usage patterns over a sufficiently long period and found n log n proportionality for small values of n and n² proportionality for large values of n. [10] A few months later, Metcalfe himself provided further evidence by using Facebook's data over the preceding 10 years to show a good fit for Metcalfe's law. [11]
In 2015, Zhang, Liu, and Xu parameterized the Metcalfe function in data from Tencent and Facebook. Their work showed that Metcalfe's law held for both, with site-specific proportionality constants, despite differences in audience between the two sites (Facebook serving a worldwide audience and Tencent serving only Chinese users). [12] One of the earliest mentions of Metcalfe's law in the context of Bitcoin was in a 2014 Reddit post by Santostasi, who compared the observed generalized Metcalfe behavior for Bitcoin to Zipf's law and to the theoretical Metcalfe result. [13] Metcalfe's law is a critical component of Santostasi's Bitcoin Power Law Theory. [14] In a working paper, Peterson linked time-value-of-money concepts to Metcalfe value using Bitcoin and Facebook as numerical examples, [15] and in 2018 applied Metcalfe's law to Bitcoin, showing that over 70% of the variance in Bitcoin's value was explained by applying Metcalfe's law to increases in Bitcoin network size. [16]
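The following Python sketch illustrates the general style of such tests on synthetic data (the data and constants below are invented; the cited studies used real usage data): regress log value on log n and inspect the fitted exponent.

```python
# Fit log(value) = slope*log(n) + intercept and check whether the
# slope is near 2, as Metcalfe's law predicts.
import numpy as np

rng = np.random.default_rng(0)
n = np.logspace(3, 8, 50)                          # synthetic network sizes
value = 1e-9 * n ** 2 * rng.lognormal(0, 0.1, 50)  # noisy Metcalfe-like value

slope, intercept = np.polyfit(np.log(n), np.log(value), 1)
print(slope)   # close to 2.0, consistent with V proportional to n**2
```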
In a 2024 interview, mathematician Terence Tao emphasized the importance of universality and networking within the mathematics community, citing Metcalfe's law to support this perspective. Tao believes that a larger audience leads to more connections, which ultimately results in positive developments within the community. He stated, "my whole career experience has been sort of the more connections equals just better stuff happening". [17]
In economics, a network effect is the phenomenon by which the value or utility a user derives from a good or service depends on the number of users of compatible products. Network effects are typically positive feedback systems, resulting in users deriving more and more value from a product as more users join the same network. The adoption of a product by an additional user can be broken into two effects: an increase in the value to all other users and also the enhancement of other non-users' motivation for using the product.
Network topology is the arrangement of the elements of a communication network. Network topology can be used to define or describe the arrangement of various types of telecommunication networks, including command and control radio networks, industrial fieldbusses and computer networks.
A scale-free network is a network whose degree distribution follows a power law, at least asymptotically. That is, the fraction P(k) of nodes in the network having k connections to other nodes goes for large values of k as P(k) ~ k^(−γ), where γ is a parameter whose value is typically in the range 2 < γ < 3.
In graph theory, a clustering coefficient is a measure of the degree to which nodes in a graph tend to cluster together. Evidence suggests that in most real-world networks, and in particular social networks, nodes tend to create tightly knit groups characterised by a relatively high density of ties; this likelihood tends to be greater than the average probability of a tie randomly established between two nodes.
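A small Python illustration of this tendency using networkx (using a Watts–Strogatz graph as a stand-in for a social-style network is an assumption made for illustration):

```python
# Social-style graphs tend to have much higher clustering than a random
# graph with a comparable number of edges.
import networkx as nx

social_like = nx.watts_strogatz_graph(n=1000, k=10, p=0.05, seed=1)
random_graph = nx.gnm_random_graph(n=1000, m=social_like.number_of_edges(), seed=1)

print(nx.average_clustering(social_like))    # relatively high
print(nx.average_clustering(random_graph))   # much lower, near k/n
```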
In graph theory and network analysis, indicators of centrality assign numbers or rankings to nodes within a graph corresponding to their network position. Applications include identifying the most influential person(s) in a social network, key infrastructure nodes in the Internet or urban networks, super-spreaders of disease, and brain networks. Centrality concepts were first developed in social network analysis, and many of the terms used to measure centrality reflect their sociological origin.
In the study of graphs and networks, the degree of a node in a network is the number of connections it has to other nodes and the degree distribution is the probability distribution of these degrees over the whole network.
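For example, an empirical degree distribution can be tabulated directly from a graph, as in this Python sketch using networkx (the graph and its parameters are illustrative):

```python
# The fraction P(k) of nodes having each degree k, for a random graph
# whose mean degree is about p*(n-1) = 10.
from collections import Counter
import networkx as nx

G = nx.erdos_renyi_graph(n=10_000, p=0.001, seed=7)
counts = Counter(d for _, d in G.degree())

for k in sorted(counts):
    print(k, counts[k] / G.number_of_nodes())   # P(k), peaked near degree 10
```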
A computer network is a set of computers sharing resources located on or provided by network nodes. Computers use common communication protocols over digital interconnections to communicate with each other. These interconnections are made up of telecommunication network technologies based on physically wired, optical, and wireless radio-frequency methods that may be arranged in a variety of network topologies.
The Barabási–Albert (BA) model is an algorithm for generating random scale-free networks using a preferential attachment mechanism. Several natural and human-made systems, including the Internet, the World Wide Web, citation networks, and some social networks are thought to be approximately scale-free and certainly contain few nodes with unusually high degree as compared to the other nodes of the network. The BA model tries to explain the existence of such nodes in real networks. The algorithm is named for its inventors Albert-László Barabási and Réka Albert.
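A short sketch using networkx's implementation of the model (the parameters are illustrative):

```python
# Generate a Barabasi-Albert network and inspect its hubs: a few nodes
# acquire far more links than the average, as preferential attachment
# predicts.
import networkx as nx

G = nx.barabasi_albert_graph(n=10_000, m=3, seed=0)
degrees = sorted((d for _, d in G.degree()), reverse=True)

print(degrees[:5])                                    # a handful of very high-degree hubs
print(2 * G.number_of_edges() / G.number_of_nodes())  # mean degree is only ~6
```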
The Watts–Strogatz model is a random graph generation model that produces graphs with small-world properties, including short average path lengths and high clustering. It was proposed by Duncan J. Watts and Steven Strogatz in their article published in 1998 in the scientific journal Nature. The model also became known as the (Watts) beta model after Watts used β to formulate it in his popular science book Six Degrees.
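A sketch of the construction using networkx's generator (the sizes and rewiring probabilities are illustrative): even a small rewiring probability p sharply shortens path lengths while clustering stays high.

```python
# A ring lattice (n nodes, each joined to its k nearest neighbours) with
# each edge rewired with probability p; the "connected" variant retries
# until the result is connected so path lengths are well defined.
import networkx as nx

for p in (0.0, 0.1, 1.0):
    G = nx.connected_watts_strogatz_graph(n=1000, k=10, p=p, seed=3)
    print(p, nx.average_clustering(G), nx.average_shortest_path_length(G))
```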
Assortativity, or assortative mixing, is a preference for a network's nodes to attach to others that are similar in some way. Though the specific measure of similarity may vary, network theorists often examine assortativity in terms of a node's degree. The addition of this characteristic to network models more closely approximates the behaviors of many real world networks.
In the mathematical field of graph theory, the Erdős–Rényi model refers to one of two closely related models for generating random graphs or the evolution of a random network. These models are named after Hungarian mathematicians Paul Erdős and Alfréd Rényi, who introduced one of the models in 1959. Edgar Gilbert introduced the other model contemporaneously with and independently of Erdős and Rényi. In the model of Erdős and Rényi, all graphs on a fixed vertex set with a fixed number of edges are equally likely. In the model introduced by Gilbert, also called the Erdős–Rényi–Gilbert model, each edge has a fixed probability of being present or absent, independently of the other edges. These models can be used in the probabilistic method to prove the existence of graphs satisfying various properties, or to provide a rigorous definition of what it means for a property to hold for almost all graphs.
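Both variants are available in networkx; a brief illustrative sketch:

```python
# Gilbert's G(n, p): each edge appears independently with probability p.
# Erdos-Renyi's G(n, m): a graph with exactly m edges, chosen uniformly.
import networkx as nx

G_np = nx.gnp_random_graph(n=1000, p=0.01, seed=5)   # Gilbert / G(n, p)
G_nm = nx.gnm_random_graph(n=1000, m=4950, seed=5)   # Erdos-Renyi / G(n, m)

print(G_np.number_of_edges())   # ~4995 on average; varies with the seed
print(G_nm.number_of_edges())   # exactly 4950 by construction
```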
Triadic closure is a concept in social network theory, first suggested by German sociologist Georg Simmel in his 1908 book Soziologie [Sociology: Investigations on the Forms of Sociation]. Triadic closure is the property among three nodes A, B, and C, that if the connections A-B and A-C exist, there is a tendency for the new connection B-C to be formed. Triadic closure can be used to understand and predict the growth of networks, although it is only one of many mechanisms by which new connections are formed in complex networks.
Network science is an academic field which studies complex networks such as telecommunication networks, computer networks, biological networks, cognitive and semantic networks, and social networks, considering distinct elements or actors represented by nodes and the connections between the elements or actors as links. The field draws on theories and methods including graph theory from mathematics, statistical mechanics from physics, data mining and information visualization from computer science, inferential modeling from statistics, and social structure from sociology. The United States National Research Council defines network science as "the study of network representations of physical, biological, and social phenomena leading to predictive models of these phenomena."
Modularity is a measure of the structure of networks or graphs which measures the strength of division of a network into modules. Networks with high modularity have dense connections between the nodes within modules but sparse connections between nodes in different modules. Modularity is often used in optimization methods for detecting community structure in networks. Biological networks, including animal brains, exhibit a high degree of modularity. However, modularity maximization is not statistically consistent, and finds communities in its own null model, i.e. fully random graphs, and therefore it cannot be used to find statistically significant community structures in empirical networks. Furthermore, it has been shown that modularity suffers a resolution limit and, therefore, it is unable to detect small communities.
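An illustrative Python sketch using networkx (the planted-partition graph and its parameters are arbitrary choices):

```python
# A graph with two planted communities scores high modularity both for
# the detected partition and for the planted split itself.
import networkx as nx
from networkx.algorithms import community

# two dense blocks of 50 nodes joined by sparse inter-block edges
G = nx.planted_partition_graph(l=2, k=50, p_in=0.3, p_out=0.01, seed=2)

detected = community.greedy_modularity_communities(G)
print(community.modularity(G, detected))            # high: clear modules

halves = [set(range(0, 50)), set(range(50, 100))]   # the planted split
print(community.modularity(G, halves))              # also high
```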
In economics, Beckstrom's law is a model or theorem formulated by Rod Beckstrom. It purports to answer "the decades-old question of 'how valuable is a network'", and states in summary that "The value of a network equals the net value added to each user’s transactions conducted through that network, summed over all users."
In graph theory, the Katz centrality or alpha centrality of a node is a measure of centrality in a network. It was introduced by Leo Katz in 1953 and is used to measure the relative degree of influence of an actor within a social network. Unlike typical centrality measures which consider only the shortest path between a pair of actors, Katz centrality measures influence by taking into account the total number of walks between a pair of actors.
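A brief sketch using networkx's implementation (the graph and the value of alpha are illustrative; alpha must stay below the reciprocal of the largest adjacency eigenvalue for the underlying series to converge):

```python
# Katz centrality weights walks of all lengths, discounting longer
# walks by powers of alpha.
import networkx as nx

G = nx.karate_club_graph()
scores = nx.katz_centrality(G, alpha=0.05, beta=1.0)

top = sorted(scores, key=scores.get, reverse=True)[:3]
print(top)   # the most influential actors by discounted walk count
```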
In graph theory, betweenness centrality is a measure of centrality in a graph based on shortest paths. For every pair of vertices in a connected graph, there exists at least one shortest path between the vertices such that either the number of edges that the path passes through or the sum of the weights of the edges is minimized. The betweenness centrality for each vertex is the number of these shortest paths that pass through the vertex.
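A small illustration with networkx, using a barbell graph whose bridge concentrates shortest paths (the example graph is an arbitrary choice):

```python
# In a barbell graph (two cliques joined by a short path), the nodes on
# and adjacent to the bridge carry all shortest paths between the
# cliques, so they dominate the betweenness ranking.
import networkx as nx

G = nx.barbell_graph(m1=5, m2=2)
bc = nx.betweenness_centrality(G)

for node, score in sorted(bc.items(), key=lambda kv: -kv[1])[:4]:
    print(node, round(score, 3))
```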
A hyperbolic geometric graph (HGG) or hyperbolic geometric network (HGN) is a special type of spatial network where (1) latent coordinates of nodes are sprinkled according to a probability density function into a hyperbolic space of constant negative curvature and (2) an edge between two nodes is present if they are close according to a function of the metric (typically either a Heaviside step function resulting in deterministic connections between vertices closer than a certain threshold distance, or a decaying function of hyperbolic distance yielding the connection probability). A HGG generalizes a random geometric graph (RGG) whose embedding space is Euclidean.
In network science, a hub is a node with a number of links that greatly exceeds the average. Emergence of hubs is a consequence of a scale-free property of networks. While hubs cannot be observed in a random network, they are expected to emerge in scale-free networks. The appearance of hubs in scale-free networks is associated with power-law distribution. Hubs have a significant impact on the network topology. Hubs can be found in many real networks, such as the brain or the Internet.
In network science, the configuration model is a method for generating random networks from a given degree sequence. It is widely used as a reference model for real-life social networks, because it allows the modeler to incorporate arbitrary degree distributions.
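A sketch with networkx (using the karate-club graph's degree sequence as the input is an arbitrary illustrative choice):

```python
# Build a random multigraph realizing a given degree sequence, then
# simplify it for use as a null/reference model.
import networkx as nx

real = nx.karate_club_graph()
degree_sequence = [d for _, d in real.degree()]

CM = nx.configuration_model(degree_sequence, seed=9)
CM = nx.Graph(CM)                            # collapse parallel edges
CM.remove_edges_from(nx.selfloop_edges(CM))  # drop self-loops

print(real.number_of_edges(), CM.number_of_edges())  # close, not equal
```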