Forest-fire model

[Animation: evolution of the forest with p/f = 100; 960 frames at 25 frames per second (38.4 seconds)]

In applied mathematics, a forest-fire model is any of a number of dynamical systems displaying self-organized criticality. Note, however, that according to Pruessner et al. (2002, 2004) the forest-fire model does not behave critically on very large, i.e. physically relevant, scales. Early versions go back to Henley (1989) and Drossel and Schwabl (1992). The model is defined as a cellular automaton on a grid with L^d cells, where L is the side length of the grid and d is its dimension. A cell can be empty, occupied by a tree, or burning. The model of Drossel and Schwabl (1992) is defined by four rules which are executed simultaneously:

  1. A burning cell turns into an empty cell
  2. A tree will burn if at least one neighbor is burning
  3. A tree ignites with probability f even if no neighbor is burning
  4. An empty space fills with a tree with probability p
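The four rules can be sketched as one synchronous update step, in which the new state of every cell is computed from the old grid before any cell is overwritten. The following Python sketch is illustrative, not part of the model's definition: the state names and the choice of periodic boundary conditions are assumptions made here for a self-contained example.

```python
import random

# Cell states (names chosen for this sketch)
EMPTY, TREE, BURNING = 0, 1, 2

def step(grid, p, f):
    """One synchronous Drossel-Schwabl update on an L x L grid.

    Uses the von Neumann neighborhood with periodic boundaries
    (an illustrative choice; other boundary conditions are possible).
    """
    L = len(grid)
    new = [row[:] for row in grid]  # read old grid, write new grid
    for i in range(L):
        for j in range(L):
            cell = grid[i][j]
            if cell == BURNING:
                new[i][j] = EMPTY                       # rule 1
            elif cell == TREE:
                neighbors = (grid[(i - 1) % L][j], grid[(i + 1) % L][j],
                             grid[i][(j - 1) % L], grid[i][(j + 1) % L])
                if BURNING in neighbors:                # rule 2
                    new[i][j] = BURNING
                elif random.random() < f:               # rule 3 (lightning)
                    new[i][j] = BURNING
            elif random.random() < p:                   # rule 4 (growth)
                new[i][j] = TREE
    return new
```

Because the update reads only the old grid, a burning cell ignites its tree neighbors and turns empty in the same step, matching the simultaneous execution of the rules.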

The controlling parameter of the model is p/f, which gives the average number of trees planted between two lightning strikes (see Schenk et al. (1996) and Grassberger (1993)). In order to exhibit a fractal frequency-size distribution of clusters, a double separation of time scales is necessary:

f ≪ p ≪ 1/Tsmax,

where Tsmax is the burn time of the largest cluster. The scaling behavior is not simple, however (Grassberger 1993, 2002; Pruessner et al. 2002, 2004).

A cluster is defined as a coherent set of cells, all of which have the same state. Cells are coherent if they can reach each other via nearest neighbor relations. In most cases, the von Neumann neighborhood (four adjacent cells) is considered.
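Cluster identification as defined above amounts to connected-component labeling, which can be done with a flood fill over the von Neumann neighborhood. The sketch below assumes a square grid with open (non-periodic) boundaries; the function name and return format are choices made for this example.

```python
from collections import deque

def clusters(grid, state):
    """Return the clusters of cells in `state`: coherent sets of cells
    connected via nearest-neighbor (von Neumann) relations.

    Each cluster is returned as a list of (row, col) coordinates.
    """
    L = len(grid)
    seen = [[False] * L for _ in range(L)]
    found = []
    for i in range(L):
        for j in range(L):
            if grid[i][j] != state or seen[i][j]:
                continue
            # Breadth-first flood fill from an unvisited seed cell
            queue, cluster = deque([(i, j)]), []
            seen[i][j] = True
            while queue:
                r, c = queue.popleft()
                cluster.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < L and 0 <= nc < L
                            and not seen[nr][nc] and grid[nr][nc] == state):
                        seen[nr][nc] = True
                        queue.append((nr, nc))
            found.append(cluster)
    return found
```

The frequency-size distribution discussed above is then obtained from the histogram of cluster sizes, `len(c)` for each cluster `c`.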

The first condition (f ≪ p, i.e. lightning is rare compared to growth) allows large structures to develop, while the second condition (p ≪ 1/Tsmax, i.e. burning is fast compared to growth) keeps trees from popping up alongside a cluster while it burns.

In landscape ecology, the forest fire model is used to illustrate the role of the fuel mosaic in the wildfire regime. The importance of the fuel mosaic on wildfire spread is debated. Parsimonious models such as the forest fire model can help to explore the role of the fuel mosaic and its limitations in explaining observed patterns.

