The First Law of Geography, according to Waldo Tobler, is "everything is related to everything else, but near things are more related than distant things." [1] This first law is the foundation of the fundamental concepts of spatial dependence and spatial autocorrelation; it underpins the inverse distance weighting method for spatial interpolation and supports the regionalized variable theory behind kriging. [2] The first law of geography is the fundamental assumption used in all spatial analysis. [3]
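As an illustration, inverse distance weighting makes the first law operational by giving nearby samples more influence on an estimate than distant ones. The following minimal Python sketch is not drawn from any particular GIS package; the sample coordinates, elevation values, and power parameter are illustrative assumptions.

```python
import numpy as np

def idw_interpolate(xy_known, values, xy_query, power=2.0):
    """Inverse distance weighting: nearer samples get larger weights,
    a direct application of Tobler's first law."""
    xy_known = np.asarray(xy_known, dtype=float)
    values = np.asarray(values, dtype=float)
    xy_query = np.asarray(xy_query, dtype=float)
    # Pairwise distances between query points and known sample points.
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    # A query point that coincides with a sample takes that sample's value.
    exact = d < 1e-12
    w = 1.0 / np.maximum(d, 1e-12) ** power
    est = (w * values).sum(axis=1) / w.sum(axis=1)
    est[exact.any(axis=1)] = values[exact.argmax(axis=1)[exact.any(axis=1)]]
    return est

# Hypothetical elevation samples and one query location near (0, 0):
samples = [(0, 0), (1, 0), (0, 1)]
elev = [100.0, 120.0, 110.0]
print(idw_interpolate(samples, elev, [(0.2, 0.2)]))  # weighted toward 100.0
```

Raising the power parameter makes the decay of influence with distance steeper, so estimates hew more closely to the nearest sample.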
Tobler first presented his seminal idea during a meeting of the International Geographical Union's Commission on Quantitative Methods held in 1969, and published it in 1970 in his paper "A Computer Movie Simulating Urban Growth in the Detroit Region". [1] In this paper Tobler created a model of population growth in Detroit, and invoked the first law while discussing the variables included in the model. He was probably not entirely serious when he originally invoked it; rather, he was explaining limitations imposed by the computers of the 1970s. [1] [3] He certainly did not think it would become as prominent in geography as it is today. [3] Though simple in its presentation, this idea is profound. Without it, "the full range of conditions anywhere on the Earth's surface could be packed within any small area. There would be no regions of approximately homogeneous conditions to be described by giving attributes to area objects. Topographic surfaces would vary chaotically, with infinite slopes everywhere, and the contours of such surfaces would be infinitely dense and contorted. Spatial analysis, and indeed life itself, would be impossible." [4]
While Tobler was the first to present the concept as the first law of geography, it existed in some form before him. In 1935, R.A. Fisher wrote of "the widely verified fact that patches in close proximity are commonly more alike, as judged by the yield of crops, than those which are further apart." [5] [6] Tobler was made aware of this by a peer reviewer, and appears to have arrived at the first law independently. [5]
Tobler's law was proposed towards the end of the quantitative revolution in geography, which saw a shift towards systematic and scientific methods. This paradigm moved the discipline from a descriptive, idiographic geography to an empirical, law-making, nomothetic geography. [7] [8] This law-making approach was conducive to the acceptance of Tobler's law, which can be seen as a direct product of the quantitative revolution. [9]
In 2003, the Association of American Geographers held a panel titled "On Tobler's First Law of Geography," with panelists selected to represent diverse geographic interests and philosophical perspectives on Tobler's first law. [10] In 2004, the peer-reviewed journal Annals of the Association of American Geographers included a section titled "Methods, Models, and GIS Forum: On Tobler's First Law of Geography" containing several peer-reviewed papers from members of the 2003 panel. [11] Notably, this section included a paper by Tobler titled "On the First Law of Geography: A Reply," which contained his response to the 2003 panel and insight into the first law. [5] In this publication, Tobler discussed his less well-known second law, which complements the first:
"The phenomenon external to an area of interest affects what goes on inside."
— Waldo Tobler
The theory rests on the concept of the friction of distance, "where distance itself hinders interaction between places. The farther two places are apart, the greater the hindrance", [12] or cost. For example, one is less likely to travel across town to purchase a sandwich than to walk to the corner store for the same sandwich. In this example, the hindrance, or cost, can readily be counted in time (both the amount of time and the value of that time), transportation costs, and personal energy expenditure, all of which are added to the purchase price and thus result in high friction. The friction of distance and the resulting increase in cost combine to produce the distance decay effect.
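This cost structure is often formalized with a gravity-style model in which interaction falls off as a power of distance. The Python sketch below is a generic illustration of such distance decay, not a model taken from the cited sources; the place sizes, distances, and decay exponent are hypothetical.

```python
def gravity_interaction(size_a, size_b, distance, beta=2.0, k=1.0):
    """Gravity-model interaction: larger places interact more, and
    interaction falls off with distance (the distance decay effect)."""
    return k * size_a * size_b / distance ** beta

# The corner store (0.2 km away) vs. a shop across town (5 km away):
# with beta = 2, the nearby option is (5 / 0.2)^2 = 625x more attractive.
for d in (0.2, 5.0):
    print(d, gravity_interaction(1.0, 1.0, d))
```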
Some have disputed the usefulness and validity of Tobler's first law, [5] [13] and some dispute the entire concept of laws in geography and the social sciences. These criticisms have been addressed by Tobler and others. [5]
As noted above, an anonymous reviewer pointed out that Tobler's first law is remarkably close to a phrase in a 1935 book by R.A. Fisher; [5] [6] Tobler nonetheless appears to have arrived at the first law independently. [5]
Some view Tobler's first law as limited and have proposed amendments. One such amendment, proposed by Robert T. Walker, combines Tobler's first law with von Thünen's concept of accessibility to offer an explanation for the description provided by Tobler. [9] The resulting law, referred to by Walker as "the Tobler-von Thünen law," is:
"Everything is related to everything else, but near things are more related than distant things, as a consequence of accessibility."
— Robert T. Walker [9]
A cartogram is a thematic map of a set of features, in which their geographic size is altered to be directly proportional to a selected variable, such as travel time, population, or Gross National Product. Geographic space itself is thus warped, sometimes extremely, in order to visualize the distribution of the variable. It is one of the most abstract types of map; in fact, some forms may more properly be called diagrams. They are primarily used to display emphasis and for analysis as nomographs.
Waldo Rudolph Tobler was an American-Swiss geographer and cartographer, regarded as one of the most influential geographers and cartographers of the late 20th and early 21st centuries. He is best known for coining what has come to be referred to as Tobler's first law of geography, and he also coined Tobler's second law of geography.
The quantitative revolution (QR) was a paradigm shift that sought to develop a more rigorous and systematic methodology for the discipline of geography. It came as a response to the inadequacy of regional geography to explain general spatial dynamics. The main claim for the quantitative revolution is that it led to a shift from a descriptive (idiographic) geography to an empirical law-making (nomothetic) geography. The quantitative revolution occurred during the 1950s and 1960s and marked a rapid change in the method behind geographical research, from regional geography into a spatial science.
Spatial analysis is any of the formal techniques that study entities using their topological, geometric, or geographic properties. It includes a variety of techniques using different analytic approaches, especially spatial statistics. It may be applied in fields as diverse as astronomy, with its studies of the placement of galaxies in the cosmos, or chip fabrication engineering, with its use of "place and route" algorithms to build complex wiring structures. In a more restricted sense, spatial analysis is geospatial analysis, applied to structures at the human scale, most notably in the analysis of geographic data. It may also be applied to genomics, as in transcriptomics data.
Friction of distance is a core principle of geography that states that movement incurs some form of cost, in the form of physical effort, energy, time, and/or the expenditure of other resources, and that these costs are proportional to the distance traveled. This cost is thus a resistance against movement, analogous to the effect of friction against movement in classical mechanics. The subsequent preference for minimizing distance and its cost underlies a vast array of geographic patterns from economic agglomeration to wildlife migration, as well as many of the theories and techniques of spatial analysis, such as Tobler's first law of geography, network routing, and cost distance analysis. To a large degree, friction of distance is the primary reason why geography is relevant to many aspects of the world, although its importance has been decreasing with the development of transportation and communication technologies.
In the context of spatial analysis, geographic information systems, and geographic information science, a field is a property that fills space and varies over space, such as temperature or density. The term has been adopted from physics and mathematics because of the similarity to physical fields (vector or scalar) such as the electromagnetic field or gravitational field. Synonymous terms include spatially dependent variable (geostatistics), statistical surface (thematic mapping), and intensive property (physics and chemistry), and crossbreeding between these disciplines is common. The simplest formal model for a field is a function that yields a single value for each point in space, i.e., t = f(x, y, z).
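To make the function model concrete, a field can be sketched as an ordinary function of coordinates. The functional form below is purely illustrative, not a standard model of temperature.

```python
import math

def temperature(x, y, z=0.0):
    """A toy scalar field t = f(x, y, z): one value at every point in space.
    A warm spot at the origin, cooling with horizontal distance and altitude."""
    return 15.0 + 10.0 * math.exp(-(x**2 + y**2) / 50.0) - 6.5 * z

# Sampling the field at two locations shows it varies continuously over space.
print(temperature(0.0, 0.0))   # warm at the origin
print(temperature(10.0, 0.0))  # cooler farther away
```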
Spatial heterogeneity is a property generally ascribed to a landscape or to a population. It refers to the uneven distribution of various concentrations of each species within an area. A landscape with spatial heterogeneity has a mix of concentrations of multiple species of plants or animals (biological), or of terrain formations (geological), or environmental characteristics filling its area. A population showing spatial heterogeneity is one where various concentrations of individuals of this species are unevenly distributed across an area; nearly synonymous with "patchily distributed."
Geography is the study of the lands, features, inhabitants, and phenomena of Earth. Geography is an all-encompassing discipline that seeks an understanding of Earth and its human and natural complexities—not merely where objects are, but also how they have changed and come to be. While geography is specific to Earth, many concepts can be applied more broadly to other celestial bodies in the field of planetary science. Geography has been called "a bridge between natural science and social science disciplines."
A boundary problem in analysis is a phenomenon in which geographical patterns are differentiated by the shape and arrangement of boundaries drawn for administrative or measurement purposes. The boundary problem occurs because of the loss of neighbors in analyses that depend on the values of those neighbors. While geographic phenomena are measured and analyzed within a specific unit, identical spatial data can appear either dispersed or clustered depending on the boundary placed around the data. In analyses of point data, dispersion is evaluated as dependent on the boundary; in analyses of areal data, statistics should be interpreted with respect to the boundary.
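The effect on point data can be illustrated with the Clark-Evans nearest neighbour index, whose expected value under randomness depends on the study-area size. In the sketch below, the five points and both boundary choices are hypothetical: the same pattern reads as dispersed under a tight boundary and clustered under a loose one.

```python
import numpy as np

def nearest_neighbor_index(points, area):
    """Clark-Evans nearest neighbour index R: R < 1 suggests clustering,
    R ~ 1 randomness, R > 1 dispersion. R depends on the chosen area."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)          # ignore each point's distance to itself
    observed = d.min(axis=1).mean()      # mean nearest-neighbour distance
    expected = 0.5 / np.sqrt(len(pts) / area)  # expectation under randomness
    return observed / expected

# Identical points, two different study-area boundaries.
pts = [(1, 1), (1, 2), (2, 1), (2, 2), (1.5, 1.5)]
print(nearest_neighbor_index(pts, area=1.0))    # tight boundary: R > 1, "dispersed"
print(nearest_neighbor_index(pts, area=100.0))  # loose boundary: R < 1, "clustered"
```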
Quantitative geography is a subfield and methodological approach to geography that develops, tests, and uses scientific, mathematical, and statistical methods to analyze and model geographic phenomena and patterns. It aims to explain and predict the distribution and dynamics of human and physical geography through the collection and analysis of quantifiable data. The approach quantitative geographers take is generally in line with the scientific method, where a falsifiable hypothesis is generated and then tested through observational studies. This has received criticism, and in recent years quantitative geography has moved to include systematic model creation and understanding the limits of its models. This approach is used to study a wide range of topics, including population demographics, urbanization, environmental patterns, and the spatial distribution of economic activity. The methods of quantitative geography are often contrasted with those employed by qualitative geography, which is more focused on observing and recording characteristics of geographic place. However, there is increasing interest in using combinations of both qualitative and quantitative methods through mixed-methods research to better understand and contextualize geographic phenomena.
In spatial analysis and geographic information systems, cost distance analysis or cost path analysis is a method for determining one or more optimal routes of travel through unconstrained (two-dimensional) space. The optimal solution is the one that minimizes the total cost of the route, based on a field of cost density that varies over space due to local factors. It is thus based on the fundamental geographic principle of friction of distance. It is an optimization problem with multiple deterministic algorithmic solutions, implemented in most GIS software.
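One common deterministic solution is Dijkstra's algorithm run over the cost raster, charging each move the mean cost density of the two cells it connects. The sketch below is a simplified 4-connected version under those assumptions, not the implementation of any particular GIS package; the raster values are illustrative.

```python
import heapq

def cost_distance(cost, start):
    """Accumulated-cost surface from `start` over a cost raster, computed
    with Dijkstra's algorithm on 4-connected cells. Each move is charged
    the mean of the two cells' cost densities."""
    rows, cols = len(cost), len(cost[0])
    INF = float("inf")
    acc = [[INF] * cols for _ in range(rows)]
    acc[start[0]][start[1]] = 0.0
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > acc[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + (cost[r][c] + cost[nr][nc]) / 2.0
                if nd < acc[nr][nc]:
                    acc[nr][nc] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return acc

# Hypothetical cost raster: the expensive center cell deflects optimal routes.
raster = [[1, 1, 1],
          [1, 9, 1],
          [1, 1, 1]]
for row in cost_distance(raster, (0, 0)):
    print(row)
```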
Giuseppe Arbia is an Italian statistician. He is known for his contributions to the field of spatial statistics and spatial econometrics. In 2006 together with Jean Paelinck he founded the Spatial Econometrics Association, which he has been chairing ever since.
The second law of geography, according to Waldo Tobler, is "the phenomenon external to a geographic area of interest affects what goes on inside," an extension of his first. He first published it in 1999 in reply to a paper titled "Linear pycnophylactic reallocation comment on a paper by D. Martin," and again in his response to criticism of his first law, "On the First Law of Geography: A Reply." Much of this criticism centered on the question of whether laws are meaningful in geography or any of the social sciences. In that reply, Tobler proposed his second law while recognizing that others had proposed other concepts to fill the role of a second law; he asserted that the phenomenon is common enough to warrant the title. Unlike Tobler's first law of geography, which is relatively well accepted among geographers, there are several contenders for the title of second law of geography. Tobler's second law is less well known but still has profound implications for geography and spatial analysis.
Arbia’s law of geography states, "Everything is related to everything else, but things observed at a coarse spatial resolution are more related than things observed at a finer resolution." Originally proposed as the 2nd law of geography, this is one of several laws competing for that title. Because of this, Arbia's law is sometimes referred to as the second law of geography, or Arbia's second law of geography.
Technical geography is the branch of geography that involves using, studying, and creating tools to obtain, analyze, interpret, understand, and communicate spatial information.
The uncertain geographic context problem, or UGCoP, is a source of statistical bias that can significantly impact the results of spatial analysis when dealing with aggregate data. The UGCoP is closely related to the modifiable areal unit problem (MAUP) and, like the MAUP, arises from how the land is divided into areal units. It is caused by the difficulty, or impossibility, of knowing how the phenomena under investigation interact between enumeration units, and with areas outside the study area, over time. It is particularly important to consider the UGCoP within the discipline of time geography, where the phenomena under investigation can move between spatial enumeration units during the study period. Examples of research that must consider the UGCoP include studies of food access and human mobility.
Qualitative geography is a subfield and methodological approach to geography focusing on nominal data, descriptive information, and the subjective and interpretive aspects of how humans experience and perceive the world. Often, it is concerned with understanding the lived experiences of individuals and groups and the social, cultural, and political contexts in which those experiences occur. Thus, qualitative geography is traditionally placed under the branch of human geography; however, technical geographers are increasingly directing their methods toward interpreting, visualizing, and understanding qualitative datasets, and physical geographers employ nominal qualitative data as well as quantitative data. Furthermore, there is increased interest in applying approaches and methods that are generally viewed as more qualitative in nature to physical geography, such as in critical physical geography. While qualitative geography is often viewed as the opposite of quantitative geography, the two sets of techniques are increasingly used to complement each other. Qualitative research can be employed in the scientific process to start the observation process, determine variables to include in research, validate results, and contextualize the results of quantitative research through mixed-methods approaches.
George Frederick Jenks (1916–1996) was an American geographer known for his significant contributions to cartography and geographic information systems (GIS). With a career spanning over three decades, Jenks played a vital role in advancing map-making technologies, was instrumental in enhancing the visualization of spatial data, and played foundational roles in developing modern cartographic curricula. The Jenks natural breaks optimization, based on his work, is still widely used in the creation of thematic maps, such as choropleth maps.
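The Jenks natural breaks method partitions sorted values into classes so as to minimize within-class squared deviation from the class means. The following compact dynamic-programming sketch illustrates the idea; production implementations in GIS software are more optimized, and the sample data are illustrative.

```python
def jenks_breaks(values, k):
    """Partition sorted values into k classes minimizing the within-class
    sum of squared deviations (the criterion behind Jenks natural breaks).
    Returns the class boundaries, including the minimum and maximum."""
    v = sorted(values)
    n = len(v)
    # Prefix sums let us get the within-class sum of squares in O(1).
    prefix, prefix_sq = [0.0], [0.0]
    for x in v:
        prefix.append(prefix[-1] + x)
        prefix_sq.append(prefix_sq[-1] + x * x)
    def ssd(i, j):  # sum of squared deviations of v[i..j] from its mean
        m = j - i + 1
        s = prefix[j + 1] - prefix[i]
        return (prefix_sq[j + 1] - prefix_sq[i]) - s * s / m
    INF = float("inf")
    # best[c][j]: minimal cost of splitting v[0..j] into c classes;
    # cut[c][j]: index where the last of those c classes begins.
    best = [[INF] * n for _ in range(k + 1)]
    cut = [[0] * n for _ in range(k + 1)]
    for j in range(n):
        best[1][j] = ssd(0, j)
    for c in range(2, k + 1):
        for j in range(c - 1, n):
            for i in range(c - 1, j + 1):
                cost = best[c - 1][i - 1] + ssd(i, j)
                if cost < best[c][j]:
                    best[c][j] = cost
                    cut[c][j] = i
    # Walk the cut table backwards to recover the break values.
    breaks = [v[-1]]
    j, c = n - 1, k
    while c > 1:
        i = cut[c][j]
        breaks.append(v[i - 1])
        j, c = i - 1, c - 1
    breaks.append(v[0])
    return breaks[::-1]

print(jenks_breaks([1, 2, 3, 10, 11, 12, 50, 51], 3))  # -> [1, 3, 12, 51]
```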
Waldo Tobler's publications span 1957 to 2012, with his most productive year being 1973. Despite retiring in 1994, he remained involved with research for the remainder of his life. Most of his publications are peer-reviewed journal articles, with no single-issue textbooks or monographs, and the quantity of his publications is noted as unremarkable compared to modern geographers. Nevertheless, many of his works are foundational to modern geography and cartography and are still frequently cited, including the first paper on using computers in cartography, the establishment of analytical cartography, and the coining of Tobler's first and second laws of geography. His work covered a wide range of topics, and many of his papers are considered "cartographic classics" that serve as required reading for both graduate and undergraduate students.