Fractal analysis is the assessment of the fractal characteristics of data. It consists of several methods for assigning a fractal dimension and other fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including topography, [1] natural geometric objects, ecology and aquatic sciences, [2] sound, market fluctuations, [3] [4] [5] heart rates, [6] the frequency domain of electroencephalography signals, [7] [8] digital images, [9] molecular motion, and data science. Fractal analysis is now widely used in all areas of science. [10] An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. [11] Fractal analysis is valuable both for expanding our knowledge of the structure and function of various systems and as a potential tool for mathematically assessing novel areas of study. Fractal calculus has also been formulated as a generalization of ordinary calculus. [12]
Fractals have fractional dimensions, which are a measure of complexity indicating the degree to which objects fill the available space. [11] [13] The fractal dimension measures the change in "size" of a fractal set with changing observational scale, and is not limited to integer values. [2] This is possible because a smaller section of the fractal resembles the whole, showing the same statistical properties at different scales. [11] This characteristic is termed scale invariance, and can be further categorized as self-similarity or self-affinity, the latter being scaled anisotropically (i.e., depending on direction). [2] Whether the view of the fractal is expanding or contracting, the structure remains the same and appears equivalently complex. [11] [13] Fractal analysis uses these underlying properties to help in the understanding and characterization of complex systems. Its use can also be extended to systems that lack a single characteristic time scale or pattern. [14]
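This scale dependence can be made concrete with the box-counting (Minkowski–Bouligand) dimension, one common formulation of the fractal dimension: if N(ε) boxes of side length ε are needed to cover a set, the dimension D is the exponent of the resulting power law:

```latex
N(\varepsilon) \propto \varepsilon^{-D},
\qquad
D = \lim_{\varepsilon \to 0} \frac{\log N(\varepsilon)}{\log (1/\varepsilon)}
```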
Further information on the origins: Fractal geometry
There are various types of fractal analysis, including box counting, lacunarity analysis, mass methods, and multifractal analysis. [1] [3] [11] A common feature of all types of fractal analysis is the need for benchmark patterns against which to assess outputs. [15] These can be acquired with various types of fractal-generating software capable of producing benchmark patterns suitable for this purpose, which generally differs from software designed to render fractal art. Other types include detrended fluctuation analysis and the Hurst absolute value method, which estimate the Hurst exponent. [16] Using more than one approach is recommended in order to compare results and increase the robustness of the findings.
Unlike theoretical fractal curves, which can be easily measured and whose underlying mathematical properties can be calculated, natural systems are sources of heterogeneity and generate complex space-time structures that may demonstrate only partial self-similarity. [17] [18] [19] Because fractals can characterize the natural complexity of such systems, fractal analysis makes it possible to analyze and recognize when features of complex ecological systems have been altered. [20] Thus, fractal analysis can help to quantify patterns in nature and to identify deviations from these natural sequences. It helps to improve our overall understanding of ecosystems and to reveal some of the underlying structural mechanisms of nature. [13] [21] [22] For example, it was found that the xylem of an individual tree follows the same fractal architecture as the spatial distribution of trees in the forest, scaling so consistently that the branching pattern of individual trees could be used mathematically to determine the structure of the forest stand. [23] [24] The use of fractal analysis for understanding structure and spatial and temporal complexity in biological systems has already been well studied, and its use continues to increase in ecological research. [25] [26] [27] [28] Despite its extensive use, it still receives some criticism. [29] [30]
Patterns in animal behaviour exhibit fractal properties on spatial and temporal scales. [16] Fractal analysis helps in understanding the behaviour of animals and how they interact with their environments on multiple scales in space and time. [2] Various animal movement signatures in their respective environments have been found to demonstrate spatially non-linear fractal patterns. [31] [32] This has generated ecological interpretations such as the Lévy Flight Foraging hypothesis, which has proven to be a more accurate description of animal movement for some species. [33] [34] [35]
Spatial patterns and animal behaviour sequences in fractal time have an optimal complexity range, which can be thought of as a homeostatic state within which the complexity of the sequence should regularly fall. An increase or a loss in complexity, with behaviour patterns becoming either more stereotypical or conversely more random, indicates that there has been an alteration in the functionality of the individual. [14] [36] Using fractal analysis, it is possible to examine the sequential complexity of animal movement behaviour and to determine whether individuals are deviating from their optimal range, suggesting a change in condition. [37] [38] For example, it has been used to assess welfare in domestic hens, [20] stress in bottlenose dolphins in response to human disturbance, [39] and parasitic infection in Japanese macaques [38] and sheep. [37] This research is furthering the field of behavioural ecology by simplifying and quantifying very complex relationships. [40] In animal welfare and conservation, fractal analysis makes it possible to identify potential sources of stress on animal behaviour, stressors that may not always be discernible through classical behaviour research. [20] [41] [42]
This approach is more objective than classical behaviour measurements, such as frequency-based observations that are limited to counts of behaviours, and it is able to probe the underlying reasons for the behaviour. [36] Another important advantage of fractal analysis is the ability to monitor the health of wild and free-ranging animal populations in their natural habitats without invasive measurements.
Applications of fractal analysis include the areas, concepts and methods summarized below. [43]
Chaos theory is an interdisciplinary area of scientific study and branch of mathematics. It focuses on underlying patterns and deterministic laws of dynamical systems that are highly sensitive to initial conditions. These were once thought to have completely random states of disorder and irregularities. Chaos theory states that within the apparent randomness of chaotic complex systems, there are underlying patterns, interconnection, constant feedback loops, repetition, self-similarity, fractals and self-organization. The butterfly effect, an underlying principle of chaos, describes how a small change in one state of a deterministic nonlinear system can result in large differences in a later state. A metaphor for this behavior is that a butterfly flapping its wings in Brazil can cause a tornado in Texas.
In mathematics, a fractal is a geometric shape containing detailed structure at arbitrarily small scales, usually having a fractal dimension strictly exceeding the topological dimension. Many fractals appear similar at various scales, as illustrated in successive magnifications of the Mandelbrot set. This exhibition of similar patterns at increasingly smaller scales is called self-similarity, also known as expanding symmetry or unfolding symmetry; if this replication is exactly the same at every scale, as in the Menger sponge, the shape is called affine self-similar. Fractal geometry lies within the mathematical branch of measure theory.
In mathematics, a self-similar object is exactly or approximately similar to a part of itself. Many objects in the real world, such as coastlines, are statistically self-similar: parts of them show the same statistical properties at many scales. Self-similarity is a typical property of fractals. Scale invariance is an exact form of self-similarity where at any magnification there is a smaller piece of the object that is similar to the whole. For instance, a side of the Koch snowflake is both symmetrical and scale-invariant; it can be continually magnified 3x without changing shape. The non-trivial similarity evident in fractals is distinguished by their fine structure, or detail on arbitrarily small scales. As a counterexample, whereas any portion of a straight line may resemble the whole, further detail is not revealed.
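For an exactly self-similar set composed of N copies of itself, each reduced by a factor s, this scaling yields the similarity dimension. Each side of the Koch snowflake (a Koch curve) consists of N = 4 copies of itself at one third the scale (s = 3), so:

```latex
D = \frac{\log N}{\log s} = \frac{\log 4}{\log 3} \approx 1.26
```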
A complex system is a system composed of many components which may interact with each other. Examples of complex systems are Earth's global climate, organisms, the human brain, infrastructure such as power grids, transportation or communication systems, complex software and electronic systems, social and economic organizations, an ecosystem, a living cell, and, ultimately, for some authors, the entire universe.
In mathematics, a fractal dimension is a statistical index of complexity that compares how detail in a pattern changes with the scale at which it is measured. It is also a measure of the space-filling capacity of a pattern, and it tells how a fractal scales differently, in a fractal (non-integer) dimension.
A Lévy flight is a random walk in which the step-lengths have a stable distribution, a probability distribution that is heavy-tailed. When defined as a walk in a space of dimension greater than one, the steps made are in isotropic random directions. Later researchers have extended the use of the term "Lévy flight" to also include cases where the random walk takes place on a discrete grid rather than on a continuous space.
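As a minimal sketch of the idea (not taken from the cited literature), the following Python fragment simulates a two-dimensional Lévy flight with Pareto-distributed step lengths and isotropic random directions; the exponent alpha and minimum step x_min are illustrative assumptions:

```python
import numpy as np

def levy_flight_2d(n_steps, alpha=1.5, x_min=1.0, rng=None):
    """Simulate a simple 2-D Levy flight: heavy-tailed (Pareto) step lengths,
    isotropic (uniformly random) step directions.  alpha and x_min are
    illustrative parameters, not values from the article."""
    rng = np.random.default_rng() if rng is None else rng
    # Pareto-distributed step lengths with minimum x_min
    lengths = x_min * (1.0 + rng.pareto(alpha, n_steps))
    angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    steps = np.column_stack((lengths * np.cos(angles),
                             lengths * np.sin(angles)))
    # Cumulative sum of steps gives the walker's trajectory
    return np.vstack((np.zeros(2), np.cumsum(steps, axis=0)))

path = levy_flight_2d(10_000)
print(path.shape)  # (10001, 2): starting point plus 10,000 steps
```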
The metabolic theory of ecology (MTE) is the ecological component of the more general Metabolic Scaling Theory and Kleiber's law. It posits that the metabolic rate of organisms is the fundamental biological rate that governs most observed patterns in ecology. MTE is part of a larger set of theory known as metabolic scaling theory that attempts to provide a unified theory for the importance of metabolism in driving pattern and process in biology from the level of cells all the way to the biosphere.
Self-organized criticality (SOC) is a property of dynamical systems that have a critical point as an attractor. Their macroscopic behavior thus displays the spatial or temporal scale-invariance characteristic of the critical point of a phase transition, but without the need to tune control parameters to a precise value, because the system, effectively, tunes itself as it evolves towards criticality.
The rangeomorphs are a form taxon of frondose Ediacaran fossils that are united by a similarity to Rangea. Some researchers, such as Pflug and Narbonne, suggest that a natural taxon Rangeomorpha may include all similar-looking fossils. Rangeomorphs appear to have had an effective reproductive strategy, based on analysis of the distribution pattern of Fractofusus, which consisted of sending out a waterborne asexual propagule to a distant area, and then spreading rapidly from there, just as plants today spread by stolons or runners.
In radiography, X-ray microtomography uses X-rays to create cross-sections of a physical object that can be used to recreate a virtual model without destroying the original object. It is similar to tomography and X-ray computed tomography. The prefix micro- is used to indicate that the pixel sizes of the cross-sections are in the micrometre range. These pixel sizes have also resulted in creation of its synonyms high-resolution X-ray tomography, micro-computed tomography, and similar terms. Sometimes the terms high-resolution computed tomography (HRCT) and micro-CT are differentiated, but in other cases the term high-resolution micro-CT is used. Virtually all tomography today is computed tomography.
The dynamic energy budget (DEB) theory is a formal metabolic theory which provides a single quantitative framework to dynamically describe the aspects of metabolism of all living organisms at the individual level, based on assumptions about energy uptake, storage, and utilization of various substances. The DEB theory adheres to stringent thermodynamic principles, is motivated by universally observed patterns, is non-species specific, and links different levels of biological organization as prescribed by the implications of energetics. Models based on the DEB theory have been successfully applied to over 1000 species, with real-life applications ranging from conservation and aquaculture to general ecology and ecotoxicology. The theory is contributing to the theoretical underpinning of the emerging field of metabolic ecology.
A multifractal system is a generalization of a fractal system in which a single exponent is not enough to describe its dynamics; instead, a continuous spectrum of exponents is needed.
The Hurst exponent is used as a measure of long-term memory of time series. It relates to the autocorrelations of the time series, and the rate at which these decrease as the lag between pairs of values increases. Studies involving the Hurst exponent were originally developed in hydrology for the practical matter of determining optimum dam sizing for the Nile river's volatile rain and drought conditions that had been observed over a long period of time. The name "Hurst exponent", or "Hurst coefficient", derives from Harold Edwin Hurst (1880–1978), who was the lead researcher in these studies; the use of the standard notation H for the coefficient also relates to his name.
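A minimal sketch of rescaled-range (R/S) analysis, the classical way of estimating H, is shown below; the choice of window sizes and the averaging scheme are illustrative assumptions rather than a reference implementation:

```python
import numpy as np

def hurst_rs(series, window_sizes=None):
    """Estimate the Hurst exponent of a 1-D series by rescaled-range (R/S)
    analysis: H is the slope of log(R/S) versus log(window size)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    if window_sizes is None:
        window_sizes = np.unique(np.logspace(1, np.log10(n // 2), 20).astype(int))
    log_w, log_rs = [], []
    for w in window_sizes:
        rs_vals = []
        for start in range(0, n - w + 1, w):       # non-overlapping windows
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())       # cumulative deviations
            r = dev.max() - dev.min()               # range of deviations
            s = seg.std()                           # standard deviation
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_w.append(np.log(w))
            log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_w, log_rs, 1)
    return slope

# Uncorrelated white noise should give H close to 0.5
print(hurst_rs(np.random.default_rng(0).standard_normal(10_000)))
```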
In stochastic processes, chaos theory and time series analysis, detrended fluctuation analysis (DFA) is a method for determining the statistical self-affinity of a signal. It is useful for analysing time series that appear to be long-memory processes or 1/f noise.
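A minimal DFA sketch in Python, assuming linear (first-order) detrending and an illustrative choice of window sizes, might look as follows; the scaling exponent is read off as the slope of log F(s) against log s:

```python
import numpy as np

def dfa_exponent(series, scales=None, order=1):
    """Minimal detrended fluctuation analysis: F(s) is the RMS fluctuation of
    the detrended profile in windows of size s; the exponent is the slope of
    log F(s) versus log s."""
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())               # integrated (profile) series
    n = len(profile)
    if scales is None:
        scales = np.unique(np.logspace(np.log10(4), np.log10(n // 4), 20).astype(int))
    flucts = []
    for s in scales:
        f2 = []
        for i in range(n // s):                     # non-overlapping windows
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)  # local trend
            f2.append(np.mean((seg - trend) ** 2))  # detrended variance
        flucts.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

# Uncorrelated noise should give an exponent near 0.5
print(dfa_exponent(np.random.default_rng(0).standard_normal(5_000)))
```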
In physical cosmology, fractal cosmology is a set of minority cosmological theories which state that the distribution of matter in the Universe, or the structure of the universe itself, is a fractal across a wide range of scales. More generally, it relates to the usage or appearance of fractals in the study of the universe and matter. A central issue in this field is the fractal dimension of the universe or of matter distribution within it, when measured at very large or very small scales.
An ecological network is a representation of the biotic interactions in an ecosystem, in which species (nodes) are connected by pairwise interactions (links). These interactions can be trophic or symbiotic. Ecological networks are used to describe and compare the structures of real ecosystems, while network models are used to investigate the effects of network structure on properties such as ecosystem stability.
Lacunarity, from the Latin lacuna, meaning "gap" or "lake", is a specialized term in geometry referring to a measure of how patterns, especially fractals, fill space, where patterns having more or larger gaps generally have higher lacunarity. Beyond being an intuitive measure of gappiness, lacunarity can quantify additional features of patterns such as "rotational invariance" and, more generally, heterogeneity. For example, when three fractal patterns are rotated 90°, two fairly homogeneous patterns may not appear to change, while a third, more heterogeneous pattern does change and has correspondingly higher lacunarity. The earliest reference to the term in geometry is usually attributed to Benoit Mandelbrot, who, in 1983 or perhaps as early as 1977, introduced it as, in essence, an adjunct to fractal analysis. Lacunarity analysis is now used to characterize patterns in a wide variety of fields and has application in multifractal analysis in particular.
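One widely used formulation is the gliding-box algorithm, in which lacunarity at a given box size is the second moment of the box "mass" divided by the squared first moment. A minimal Python sketch, with an illustrative random test pattern, might look like this:

```python
import numpy as np

def gliding_box_lacunarity(image, box_size):
    """Gliding-box lacunarity of a binary 2-D array for one box size:
    Lambda(r) = <M^2> / <M>^2, where M is the box mass (number of occupied
    pixels) over all positions of an r x r gliding box."""
    img = np.asarray(image, dtype=float)
    r = box_size
    masses = []
    for i in range(img.shape[0] - r + 1):
        for j in range(img.shape[1] - r + 1):
            masses.append(img[i:i + r, j:j + r].sum())
    masses = np.array(masses)
    mean = masses.mean()
    return np.nan if mean == 0 else (masses ** 2).mean() / mean ** 2

# Illustrative example: a random binary pattern, not a benchmark fractal
rng = np.random.default_rng(0)
pattern = (rng.random((64, 64)) < 0.3).astype(int)
print(gliding_box_lacunarity(pattern, box_size=8))
```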
A social network is a social structure consisting of a set of social actors, sets of dyadic ties, and other social interactions between actors. The social network perspective provides a set of methods for analyzing the structure of whole social entities as well as a variety of theories explaining the patterns observed in these structures. The study of these structures uses social network analysis to identify local and global patterns, locate influential entities, and examine network dynamics. For instance, social network analysis has been used in studying the spread of misinformation on social media platforms or analyzing the influence of key figures in social networks.
Box counting is a method of gathering data for analyzing complex patterns by breaking a dataset, object, image, etc. into smaller and smaller pieces, typically "box"-shaped, and analyzing the pieces at each smaller scale. The essence of the process has been compared to zooming in or out using optical or computer based methods to examine how observations of detail change with scale. In box counting, however, rather than changing the magnification or resolution of a lens, the investigator changes the size of the element used to inspect the object or pattern. Computer based box counting algorithms have been applied to patterns in 1-, 2-, and 3-dimensional spaces. The technique is usually implemented in software for use on patterns extracted from digital media, although the fundamental method can be used to investigate some patterns physically. The technique arose out of and is used in fractal analysis. It also has application in related fields such as lacunarity and multifractal analysis.
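A minimal Python sketch of the procedure for a binary image is given below; the set of box sizes and the test pattern are illustrative assumptions, and the estimated dimension is the slope of log N(s) against log(1/s):

```python
import numpy as np

def box_counting_dimension(image, box_sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension of a binary 2-D array: count the
    occupied boxes N(s) at each box size s, then take the slope of
    log N(s) versus log(1/s)."""
    img = np.asarray(image, dtype=bool)
    counts = []
    for s in box_sizes:
        # Trim so the image tiles exactly into s x s boxes
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        boxes = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())  # boxes containing any pixel
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Illustrative check: a filled square should give a dimension close to 2
filled = np.ones((128, 128), dtype=bool)
print(box_counting_dimension(filled))
```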
Head/tail breaks is a clustering algorithm for data with a heavy-tailed distribution, such as power laws and lognormal distributions. A heavy-tailed distribution can be described simply as a scaling pattern of far more small things than large ones, or alternatively as numerous smallest, a very few largest, and some in between. The classification is done by dividing things into large and small around the arithmetic mean, and then recursively repeating the division for the large things (the head) until the notion of far more small things than large ones no longer holds, or until only more or less similar things remain, as shown in the sketch below. Head/tail breaks is used not just for classification but also for visualization of big data, by keeping the head, since the head is self-similar to the whole. Head/tail breaks can be applied not only to vector data such as points, lines and polygons, but also to raster data such as digital elevation models (DEMs).
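A minimal Python sketch of this recursive mean-splitting is given below; the 40% head-size threshold used as the stopping rule is an illustrative assumption, not part of the definition above:

```python
def head_tail_breaks(values, head_limit=0.4):
    """Minimal head/tail breaks: repeatedly split the data around its
    arithmetic mean, keeping the 'head' (values above the mean), until the
    head is no longer a small minority.  head_limit is an illustrative
    stopping threshold."""
    breaks = []
    data = list(values)
    while len(data) > 1:
        mean = sum(data) / len(data)
        head = [v for v in data if v > mean]
        if not head or len(head) / len(data) > head_limit:
            break
        breaks.append(mean)          # each mean becomes a class boundary
        data = head                  # continue splitting the head only
    return breaks

# Example on a heavy-tailed (power-law-like) list of values
sample = [1] * 50 + [2] * 25 + [5] * 12 + [20] * 6 + [100] * 2
print(head_tail_breaks(sample))     # class boundaries at successive means
```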