Friction of distance is a core principle of geography which states that movement incurs a cost, in the form of physical effort, energy, time, and/or the expenditure of other resources, and that these costs are proportional to the distance traveled. This cost thus acts as a resistance against movement, analogous (but not directly related) to the effect of friction against movement in classical mechanics. [1] The resulting preference for minimizing distance and its cost underlies a vast array of geographic patterns, from economic agglomeration to wildlife migration, as well as many of the theories and techniques of spatial analysis, such as Tobler's first law of geography, network routing, and cost distance analysis. To a large degree, friction of distance is the primary reason why geography is relevant to so many aspects of the world, although its importance (and perhaps the importance of geography itself) has been decreasing with the development of transportation and communication technologies. [2] [3]
It is not known who first coined the term "friction of distance," but the effect of distance-based costs on geographic activity and geographic patterns has been a core element of academic geography since its initial rise in the 19th century. Von Thünen's isolated state model of rural land use (1826), possibly the earliest geographic theory, directly incorporated the cost of transporting different agricultural products as one of the determinants of how far from a town each type of good could be produced profitably. [4] The industrial location theory of Alfred Weber (1909) and the central place theory of Walter Christaller (1933) [5] were likewise essentially optimizations of space to minimize travel costs.
By the 1920s, social scientists began to incorporate principles of physics (more precisely, some of its mathematical formalizations), such as gravity, specifically the inverse square law found in Newton's law of universal gravitation. [6] Geographers quickly identified a number of situations in which the interaction between places, whether migration between cities or the distribution of residents willing to patronize a shop, exhibited this distance decay, due to the advantages of minimizing distance traveled. Gravity models and other distance optimization models became widespread during the quantitative revolution of the 1950s and the subsequent rise of spatial analysis. Gerald Carrothers (1956) was one of the first to explicitly use the analogy of "friction" to conceptualize the effect of distance, suggesting that these distance optimizations needed to acknowledge that the effect varies according to localized factors. [6] Ian McHarg, in Design with Nature (1969), was among those who developed the multifaceted nature of distance costs, although he did not initially employ mathematical or computational methods to optimize them. [7]
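As an illustration of the form such models take, the following is a minimal sketch of a simple gravity model of spatial interaction, in which predicted interaction between two places is proportional to the product of their populations divided by distance raised to a friction exponent. The populations, distances, scaling constant, and exponent here are hypothetical; empirical studies calibrate them to the phenomenon being modeled.

```python
# A minimal sketch of a gravity model of spatial interaction.
# All values below are hypothetical; beta = 2.0 mirrors the inverse
# square law, but in practice beta is calibrated empirically.

def gravity_interaction(pop_a, pop_b, distance_km, k=1.0, beta=2.0):
    """Predicted interaction (e.g., trips, migrants) between two places."""
    return k * (pop_a * pop_b) / distance_km ** beta

# Interaction falls off rapidly with distance:
print(gravity_interaction(500_000, 200_000, 50))   # nearby pair
print(gravity_interaction(500_000, 200_000, 500))  # 10x the distance, 1/100 the interaction
```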
In the era of geographic information systems, starting in the 1970s, many of the existing proximity models and new algorithms were automated as analysis tools, making them significantly easier for a wider range of professionals to use. These tools have tended to focus on problems that can be solved deterministically, such as buffers, cost distance analysis, interpolation, and network routing. Other problems that involve the friction of distance are much more difficult (i.e., NP-hard), such as the traveling salesman problem and cluster analysis, and automated tools to solve them (usually using heuristic algorithms such as k-means clustering) are less widely available, or only recently available, in GIS software.
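For instance, the following is a minimal sketch of the kind of k-means heuristic such tools expose for cluster analysis, assuming the scikit-learn library is available; the point coordinates are hypothetical.

```python
# A k-means clustering sketch: group hypothetical point locations into
# two spatially compact clusters. Assumes scikit-learn is installed.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical point locations (e.g., incident sites) as (x, y) pairs.
points = np.array([[1.0, 1.2], [0.8, 1.0], [1.1, 0.9],
                   [5.0, 5.1], [5.2, 4.8], [4.9, 5.3]])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(kmeans.labels_)           # heuristic cluster assignment per point
print(kmeans.cluster_centers_)  # the clusters' central locations
```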
As an illustration, picture a hiker standing on the side of an isolated wooded mountain who wishes to travel to the other side. There is an essentially infinite number of paths she could take to get there. Traveling directly over the peak is "expensive," in that every ten meters spent climbing requires significant effort. Traveling ten meters cross-country through the woods requires significantly more time and effort than traveling ten meters along a developed trail or through open meadow. Taking a level route along a road around the mountain has a much lower cost (in both effort and time) for every ten meters, but the total cost accumulates over a much longer distance. In each case, the amount of time and/or effort required to travel ten meters is a measurement of the friction of distance. Determining the optimal route requires balancing these costs, and can be solved using the technique of cost distance analysis.
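A back-of-the-envelope sketch of this trade-off, using hypothetical friction values (minutes per ten meters) and route lengths, shows how the total cost of each route is its unit friction accumulated over its length:

```python
# Total cost = unit friction x number of ten-meter segments, accumulated
# along each candidate route. All values are hypothetical.
routes = {
    "over the peak": {"minutes_per_10m": 3.0, "length_m": 2000},
    "cross-country": {"minutes_per_10m": 1.5, "length_m": 2500},
    "road around":   {"minutes_per_10m": 0.5, "length_m": 7000},
}

for name, r in routes.items():
    total_minutes = r["minutes_per_10m"] * (r["length_m"] / 10)
    print(f"{name}: {total_minutes:.0f} minutes")
# -> over the peak: 600, cross-country: 375, road around: 350
```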
In another, very common example, a person wants to drive from his home to the nearest hospital. Of the many (but finitely many) possible routes through the road network, the one with the shortest distance passes through residential neighborhoods with low speed limits and frequent stops. An alternative route follows a bypass highway around the neighborhoods; it covers a significantly longer distance, but with much higher speed limits and infrequent stops. This alternative thus has a much lower unit friction of distance (in this case, time), but that friction accumulates over a greater distance, requiring calculation to determine the optimal route (the one taking the least total travel time), perhaps using the network analysis algorithms commonly found in web maps such as Google Maps.
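The following is a minimal sketch of such a calculation: Dijkstra's algorithm, one of the standard network routing algorithms, applied to a small hypothetical road graph whose edge weights are travel times in minutes. The place names and times are invented for illustration.

```python
# Shortest-time routing on a hypothetical road graph with Dijkstra's
# algorithm. Edge weights are travel times in minutes, so the "short"
# residential route can lose to the longer but faster bypass.
import heapq

def dijkstra(graph, start, goal):
    """Return (total_minutes, path) for the least-time route."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        minutes, node, path = heapq.heappop(queue)
        if node == goal:
            return minutes, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (minutes + cost, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical network: home -> hospital via slow streets or a fast bypass.
graph = {
    "home":       [("streets", 2), ("bypass_on", 3)],
    "streets":    [("hospital", 14)],    # short distance, low speed limits
    "bypass_on":  [("bypass_off", 6)],   # long distance, high speed limits
    "bypass_off": [("hospital", 2)],
}
print(dijkstra(graph, "home", "hospital"))  # -> (11.0, the bypass route)
```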
The costs that are proportional to distance can take a number of forms, including travel time, the expenditure of energy or money, and less tangible costs such as social and environmental impacts; each may or may not be relevant in a given geographic situation.
Some of these costs are easily quantifiable and measurable, such as transit time, fuel consumption, and construction costs, and thus naturally lend themselves to optimization algorithms. That said, there may be a significant amount of uncertainty in predicting them, due to variability over time (e.g., travel time through a road network depending on changing traffic volume) or variability between individual situations (e.g., how fast a person wishes to drive). Other costs are much more difficult to measure due to their qualitative or subjective nature, such as political protest or ecological impact; these typically require the creation of "pseudo-measures," in the form of indices or scales, to operationalize them. [3]
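As a sketch of how such a pseudo-measure might be constructed, the following combines hypothetical ordinal scores (1 = low impact, 5 = high) into a weighted friction index per route segment; the criteria, weights, and scores are all assumptions standing in for expert judgment or survey data.

```python
# A "pseudo-measure" sketch: qualitative costs scored on ordinal scales
# and combined into a single friction index per segment. The criteria,
# weights, and scores below are hypothetical.
weights = {"ecological_impact": 0.6, "community_opposition": 0.4}

segments = {
    "wetland crossing": {"ecological_impact": 5, "community_opposition": 2},
    "field edge":       {"ecological_impact": 2, "community_opposition": 1},
}

for name, scores in segments.items():
    index = sum(weights[c] * scores[c] for c in weights)
    print(f"{name}: friction index {index:.1f}")
```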
All of these costs are fields, in that they are spatially intensive (a "density" of cost per unit distance) and vary over space. The cost field (often called a cost surface) may be a continuous, smooth function, or it may have abrupt changes. This variability of cost occurs both in unconstrained (two- or three-dimensional) space and in constrained networks, such as roads and cable telecommunications.
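A minimal sketch of such a cost field, stored as a raster grid (a cost surface) in which each cell holds a cost density, might look like the following; the grid values, cell size, and paths are hypothetical.

```python
# A cost surface as a raster: each cell stores cost per meter of travel,
# and a path's total cost is accumulated cell by cell.
import numpy as np

cell_size_m = 10.0
cost_surface = np.array([   # hypothetical cost density per meter,
    [1.0, 1.0, 4.0, 4.0],   # e.g., 1.0 = open meadow, 2.0 = woods,
    [1.0, 2.0, 4.0, 2.0],   # 4.0 = steep slope
    [1.0, 2.0, 2.0, 1.0],
])

def path_cost(cells):
    """Accumulate cost along a path given as a list of (row, col) cells."""
    return sum(cost_surface[r, c] * cell_size_m for r, c in cells)

straight = [(0, 0), (0, 1), (0, 2), (0, 3)]                  # crosses steep cells
detour   = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (2, 3)]  # longer but cheaper
print(path_cost(straight), path_cost(detour))                # -> 100.0 80.0
```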
A large number of geographic theories, spatial analysis techniques, and GIS applications are directly based on the practical effects of friction of distance, including distance decay and gravity models, cost distance analysis, network routing, and facility location modeling.
Historically, the friction of distance was very high for most types of movement, making long-distance movement and interaction relatively slow and rare (but not non-existent). The result was a strongly localized human geography, manifested in aspects as varied as language and economy. One of the most profound effects of the technological advances since 1800, including the railroad, the automobile, and the telephone, has been to drastically reduce the costs of moving people, goods, and information over long distances. This led to widespread diffusion and integration, ultimately resulting in many of the aspects of globalization. [12] The geographic effect of this diminishing friction of distance is called time-space convergence or cost-space convergence. [13]
Of these technologies, telecommunications, especially the Internet, has perhaps had the most profound effect. Although there are still distance-based costs of transmitting information, such as laying cable and generating electromagnetic signal energy (traditionally manifesting in forms such as long-distance telephone charges), these are now so small for any meaningful unit of information that they are no longer managed on a distance basis, but are instead bundled into fixed service costs that do not depend on distance. [14] For example, some portion of the fee for mobile telephone service covers the higher costs of long-distance service, but the customer does not see it, and thus does not make communication decisions based on distance. The rise of free shipping has had similar causes and effects on retail trade.
It has been argued that the virtual elimination of the friction of distance in many aspects of society has resulted in the "death of geography," in which relative location is no longer relevant to many tasks in which it formerly played a crucial role. [15] It is now possible to conduct many interactions over global distances almost as easily as over local distances, including retail trade, business-to-business services, and some types of remote work; these services could thus, in theory, be provided from anywhere at equal cost. The COVID-19 pandemic has tested and accelerated many of these trends. [16]
Conversely, others have seen a strengthening of the geographic effects of other aspects of life, or perhaps an increasing focus on them as traditional distance-based aspects have become less relevant. [17] These include the lifestyle amenities of a place, such as local natural landscapes or urban nightlife, which must be experienced in person (thus requiring physical travel and thus entailing the friction of distance). Also, many people prefer in-person interaction for activities that could technically be conducted remotely, such as business meetings, education, tourism, and shopping, which should keep distance-based effects relevant for the foreseeable future. [18] These contrasting "frictional" and "frictionless" trends have necessitated a more nuanced analysis of geography than either the traditional blanket statement that location always matters or the recent claim that location does not matter at all. [19]
Regional science is a field of the social sciences concerned with analytical approaches to problems that are specifically urban, rural, or regional. Topics in regional science include, but are not limited to, location theory or spatial economics, location modeling, transportation, migration analysis, land use and urban development, interindustry analysis, environmental and ecological analysis, resource management, urban and regional policy analysis, geographical information systems, and spatial data analysis. In the broadest sense, any social science analysis that has a spatial dimension is embraced by regional scientists.
Mathematical optimization or mathematical programming is the selection of a best element, with regard to some criterion, from some set of available alternatives. It is generally divided into two subfields: discrete optimization and continuous optimization. Optimization problems arise in all quantitative disciplines from computer science and engineering to operations research and economics, and the development of solution methods has been of interest in mathematics for centuries.
Economic geography is the subfield of human geography that studies economic activity and the factors affecting it. It can also be considered a subfield of, or a method in, economics. There are four branches of economic geography.
Urban economics is broadly the economic study of urban areas; as such, it involves using the tools of economics to analyze urban issues such as crime, education, public transit, housing, and local government finance. More specifically, it is a branch of microeconomics that studies the urban spatial structure and the location of households and firms.
Alfred Weber was a German economist, geographer, sociologist and theoretician of culture whose work was influential in the development of modern economic geography.
Johann Heinrich von Thünen, sometimes spelled Thuenen, was a prominent nineteenth-century economist and a native of Mecklenburg-Strelitz, now in northern Germany.
Central place theory is an urban geographical theory that seeks to explain the number, size, and range of market services in a commercial system or human settlements in a residential system. It was introduced in 1933 to explain the spatial distribution of cities across the landscape. The theory was first analyzed by German geographer Walter Christaller, who asserted that settlements simply functioned as 'central places' providing economic services to surrounding areas. Christaller explained that a large number of small settlements will be situated relatively close to one another for efficiency, because people do not want to travel far for everyday needs, such as getting bread from a bakery, but will travel further for more expensive, infrequent, or specialized goods and services, which are located in larger settlements that are farther apart.
A transport network, or transportation network, is a network or graph in geographic space, describing an infrastructure that permits and constrains movement or flow. Examples include, but are not limited to, road networks, railways, air routes, pipelines, aqueducts, and power lines. The digital representation of these networks, and the methods for their analysis, are a core part of spatial analysis, geographic information systems, public utilities, and transport engineering. Network analysis is an application of the theories and algorithms of graph theory and is a form of proximity analysis.
A spatial network is a graph in which the vertices or edges are spatial elements associated with geometric objects, i.e., the nodes are located in a space equipped with a certain metric. The simplest mathematical realization of a spatial network is a lattice or a random geometric graph, in which nodes are distributed uniformly at random over a two-dimensional plane and a pair of nodes is connected if the Euclidean distance between them is smaller than a given neighborhood radius. Transportation and mobility networks, the Internet, mobile phone networks, power grids, social and contact networks, and biological neural networks are all examples where the underlying space is relevant and where the graph's topology alone does not contain all the information. Characterizing and understanding the structure, resilience, and evolution of spatial networks is crucial for many different fields, ranging from urbanism to epidemiology.
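A minimal sketch of the random geometric graph construction described above, with an arbitrary node count and neighborhood radius chosen for illustration:

```python
# Random geometric graph: nodes scattered uniformly in the unit square,
# with an edge wherever the Euclidean distance between a pair of nodes
# falls below a neighborhood radius. n and radius are arbitrary choices.
import math
import random

random.seed(42)
n, radius = 20, 0.3
nodes = [(random.random(), random.random()) for _ in range(n)]

edges = [(i, j)
         for i in range(n) for j in range(i + 1, n)
         if math.dist(nodes[i], nodes[j]) < radius]
print(f"{n} nodes, {len(edges)} edges")
```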
The First Law of Geography, according to Waldo Tobler, is "everything is related to everything else, but near things are more related than distant things." This first law is the foundation of the fundamental concepts of spatial dependence and spatial autocorrelation and is utilized specifically for the inverse distance weighting method for spatial interpolation and to support the regionalized variable theory for kriging. The first law of geography is the fundamental assumption used in all spatial analysis.
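A minimal sketch of the inverse distance weighting method mentioned above: the estimate at an unsampled location is a weighted average of the known values, with weights that decay as a power of distance, so that near samples count for more than distant ones. The sample points, values, and power are hypothetical.

```python
# Inverse distance weighting (IDW) interpolation over hypothetical samples.
import math

samples = [((0.0, 0.0), 10.0), ((4.0, 0.0), 20.0), ((0.0, 3.0), 14.0)]

def idw(x, y, power=2.0):
    """Estimate the value at (x, y) as a distance-weighted average."""
    num = den = 0.0
    for (sx, sy), value in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:
            return value          # exactly on a sample point
        w = 1.0 / d ** power      # weight decays with distance
        num += w * value
        den += w
    return num / den

print(idw(1.0, 1.0))  # dominated by the nearby 10.0 sample
```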
Spatial analysis is any of the formal techniques that study entities using their topological, geometric, or geographic properties. Spatial analysis includes a variety of techniques using different analytic approaches, especially spatial statistics. It may be applied in fields as diverse as astronomy, with its studies of the placement of galaxies in the cosmos, or to chip fabrication engineering, with its use of "place and route" algorithms to build complex wiring structures. In a more restricted sense, spatial analysis is geospatial analysis, the technique applied to structures at the human scale, most notably in the analysis of geographic data. It may also be applied to genomics, as in transcriptomics data.
Location theory has become an integral part of economic geography, regional science, and spatial economics. It addresses questions of which economic activities are located where, and why. Like microeconomic theory generally, location theory assumes that agents act in their own self-interest: firms choose locations that maximize their profits, and individuals choose locations that maximize their utility.
Distance decay is a geographical term which describes the effect of distance on cultural or spatial interactions. The distance decay effect states that the interaction between two locales declines as the distance between them increases. Once the distance is outside of the two locales' activity space, their interactions begin to decrease. It is thus an assertion that the mathematics of the inverse square law in physics can be applied to many geographic phenomena, and is one of the ways in which physics principles such as gravity are often applied metaphorically to geographic situations.
The study of facility location problems (FLP), also known as location analysis, is a branch of operations research and computational geometry concerned with the optimal placement of facilities to minimize transportation costs, while considering factors such as the avoidance of placing hazardous materials near housing and the locations of competitors' facilities. The techniques also apply to cluster analysis.
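A brute-force sketch of the simplest case, a 1-median problem: choose, from a few candidate sites, the one that minimizes total straight-line distance to a set of demand points. All coordinates are hypothetical, and real formulations add capacities, fixed costs, and exclusion constraints.

```python
# Brute-force 1-median facility location over hypothetical coordinates.
import math

demand = [(1, 1), (2, 5), (6, 2), (7, 7)]   # demand points to be served
candidates = [(2, 2), (5, 4), (6, 6)]       # candidate facility sites

def total_distance(site):
    """Sum of straight-line distances from a site to all demand points."""
    return sum(math.dist(site, d) for d in demand)

best = min(candidates, key=total_distance)
print(best, round(total_distance(best), 2))
```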
Hydrological optimization applies mathematical optimization techniques to water-related problems. These problems may concern surface water, groundwater, or a combination of the two. The work is interdisciplinary, and may be done by hydrologists, civil engineers, environmental engineers, and operations researchers.
Proximity analysis is a class of spatial analysis tools and algorithms that employ geographic distance as a central principle. Distance is fundamental to geographic inquiry and spatial analysis, due to principles such as the friction of distance, Tobler's first law of geography, and spatial autocorrelation, which are incorporated into analytical tools. Proximity methods are thus used in a variety of applications, especially those that involve movement and interaction.
CrimeStat is a Windows-based crime mapping program that conducts spatial and statistical analysis and is designed to interface with a geographic information system (GIS). The program is developed by Ned Levine & Associates under the direction of Ned Levine, with funding from the National Institute of Justice (NIJ), an agency of the United States Department of Justice. The program and its manual are distributed for free by NIJ.
Geospatial topology is the study and application of qualitative spatial relationships between geographic features, or between representations of such features in geographic information, such as in geographic information systems (GIS). For example, the fact that two regions overlap, or that one contains the other, are examples of topological relationships. It is thus the application of the mathematics of topology to GIS, and is distinct from, but complementary to, the many aspects of geographic information that are based on quantitative spatial measurements through coordinate geometry. Topology appears in many aspects of geographic information science and GIS practice, including the discovery of inherent relationships through spatial query, vector overlay and map algebra; the enforcement of expected relationships as validation rules stored in geospatial data; and the use of stored topological relationships in applications such as network analysis. Spatial topology is the generalization of geospatial topology for non-geographic domains, e.g., CAD software.
In spatial analysis and geographic information systems, cost distance analysis or cost path analysis is a method for determining one or more optimal routes of travel through unconstrained (two-dimensional) space. The optimal solution is the one that minimizes the total cost of the route, based on a field of cost density that varies over space due to local factors. It is thus based on the fundamental geographic principle of friction of distance. It is an optimization problem with multiple deterministic solution algorithms, implemented in most GIS software.
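A minimal sketch of one such deterministic solution: Dijkstra's algorithm run over a raster cost surface, accumulating cost as it expands outward from the start cell. The 4-connected moves, unit cell size, and average-of-endpoints step cost are simplifying assumptions; GIS implementations also handle diagonal moves and real cell sizes.

```python
# Cost distance over a raster: Dijkstra's algorithm accumulating
# (cost density x step length) between adjacent cells. Grid hypothetical.
import heapq

cost = [
    [1, 1, 5, 5],
    [1, 3, 5, 1],
    [1, 3, 1, 1],
]
rows, cols = len(cost), len(cost[0])

def cheapest(start, goal):
    """Return the minimum accumulated cost from start to goal cell."""
    dist = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        d, (r, c) = heapq.heappop(queue)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                step = (cost[r][c] + cost[nr][nc]) / 2  # avg cost of the move
                nd = d + step
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(queue, (nd, (nr, nc)))
    return float("inf")

print(cheapest((0, 0), (2, 3)))  # -> 7.0, the cost of the optimal route
```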