Grey relational analysis (GRA) was developed by Deng Julong of Huazhong University of Science and Technology. It is one of the most widely used models of grey system theory. GRA uses a specific concept of information. It defines situations with no information as black, and those with perfect information as white. However, neither of these idealized situations ever occurs in real-world problems. In fact, situations between these extremes, which contain partial information, are described as being grey, hazy or fuzzy. A variant of the GRA model, the Taguchi-based GRA model, is a popular optimization method in manufacturing engineering.
Let $X_0 = (x_0(1), x_0(2), \ldots, x_0(n))$ be an ideal (reference) data set and let $X_i = (x_i(1), x_i(2), \ldots, x_i(n))$, $i = 1, 2, \ldots, m$, be the alternative data sets of the same length. The Grey Relational Grade (GRG) between the two data sets is given by [1]

$$\Gamma_{0i} = \sum_{j=1}^{n} w_j\, \gamma_{0i}(j),$$

where the Grey Relational Coefficient (GRC) is

$$\gamma_{0i}(j) = \frac{\Delta_{\min} + \xi_j\, \Delta_{\max}}{\Delta_{0i}(j) + \xi_j\, \Delta_{\max}}, \qquad \Delta_{0i}(j) = \left| x_0(j) - x_i(j) \right|, \quad \Delta_{\min} = \min_{i,j} \Delta_{0i}(j), \quad \Delta_{\max} = \max_{i,j} \Delta_{0i}(j).$$

Here $w_j$ is the weight of the $j$-th element of the data sets, which is needed when the GRA method is used to solve multiple-criteria decision-making problems, and $\xi_j \in (0, 1]$ denotes the Dynamic Distinguishing Coefficient. The GRA model defined in this way is called the Dynamic Grey Relational Analysis (Dynamic GRA) model; it is the generalized form of Deng's GRA model, which is recovered when the distinguishing coefficient is held constant (a fixed value such as $\xi = 0.5$ is common).
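A minimal sketch of how the coefficients and grades above might be computed, assuming the series have already been normalized to a comparable scale and using a constant distinguishing coefficient in place of the dynamic one; the function and variable names are illustrative, not part of any standard library:

```python
import numpy as np

def grey_relational_grade(reference, alternatives, weights=None, xi=0.5):
    """Compute Grey Relational Coefficients and Grades.

    reference    : 1-D array, the ideal series x0 (assumed already normalized)
    alternatives : 2-D array, one row per alternative series Xi
    weights      : per-element weights w_j (uniform if omitted)
    xi           : distinguishing coefficient (held constant in this sketch)
    """
    reference = np.asarray(reference, dtype=float)
    alternatives = np.atleast_2d(np.asarray(alternatives, dtype=float))
    n = reference.size
    if weights is None:
        weights = np.full(n, 1.0 / n)          # equal weights summing to 1

    delta = np.abs(alternatives - reference)   # absolute differences Δ_0i(j)
    d_min, d_max = delta.min(), delta.max()    # global extremes over all i, j

    grc = (d_min + xi * d_max) / (delta + xi * d_max)  # coefficients γ_0i(j)
    grg = grc @ weights                                # grades Γ_0i
    return grc, grg

# Example: rank three alternatives against an ideal reference series.
grc, grg = grey_relational_grade([1.0, 1.0, 1.0],
                                 [[0.9, 0.7, 1.0],
                                  [0.6, 0.8, 0.5],
                                  [1.0, 0.9, 0.8]])
print(grg)  # a larger grade means the series is closer to the ideal one
```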
GRA is an important part of grey system theory, pioneered by Deng Julong in 1982. [2] A grey system is one in which part of the information is known and part is unknown. Formally, grey system theory describes uncertainty by interval-valued unknowns called grey numbers, with the width of the interval reflecting how precise the knowledge is. [3] With this definition, information quantity and quality form a continuum from a total lack of information to complete information – from black through grey to white. Since uncertainty always exists, one is always somewhere in the middle, somewhere between the extremes, somewhere in the grey area. Grey analysis then yields a set of statements about possible system solutions. At one extreme, no solution can be defined for a system with no information. At the other extreme, a system with perfect information has a unique solution. In the middle, grey systems will give a variety of available solutions. Grey relational analysis does not attempt to find the best solution, but provides techniques for determining a good solution, an appropriate solution for real-world problems. The theory inspired many noted scholars and business leaders like Jeffrey Yi-Lin Forrest, Liu Sifeng, Ren Zhengfei and Joseph L. Badaracco, a professor at Harvard Business School.
The theory has been applied in various fields of engineering and management. Initially, the grey method was applied to the study of air pollution [4] and was subsequently used to investigate a nonlinear, multidimensional model of the impact of socio-economic activities on urban air pollution. [5] It has also been used to study the research output and growth of countries. [6]
Many universities, associations and societies around the world promote grey system theory, e.g., the International Association of Grey Systems and Decision Sciences (IAGSUA), Chinese Grey System Association (CGSA), Grey Systems Society of China (GSSC), Grey Systems Society of Pakistan (GSSP), Polish Scientific Society of Grey Systems (PSGS), Grey Systems Committee (IEEE Systems, Man, and Cybernetics Society), Centre for Computational Intelligence (De Montfort University), etc. [7] [8] [9] [10] [11]
There are several journals dedicated to grey systems research and studies, e.g., "The Journal of Grey System" (UK), [12] [13] "Grey Systems: Theory and Application" (Emerald Group Publishing), [14] "International Journal of Grey Systems" (USA), [15] "Journal of Grey System" (Taiwan), [16] "The Grey Journal", [17] "Journal of Intelligent and Fuzzy Systems", [18] "Kybernetes", etc.
Quantum mechanics is a fundamental theory in physics that describes the behavior of nature at and below the scale of atoms. It is the foundation of all quantum physics, which includes quantum chemistry, quantum field theory, quantum technology, and quantum information science.
The uncertainty principle, also known as Heisenberg's indeterminacy principle, is a fundamental concept in quantum mechanics. It states that there is a limit to the precision with which certain pairs of physical properties, such as position and momentum, can be simultaneously known. In other words, the more accurately one property is measured, the less accurately the other property can be known.
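For the canonical position–momentum pair, the limit can be stated quantitatively (the Kennard bound) as

$$\sigma_x\, \sigma_p \;\ge\; \frac{\hbar}{2},$$

where $\sigma_x$ and $\sigma_p$ are the standard deviations of position and momentum measurements and $\hbar$ is the reduced Planck constant.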
In mathematical analysis, the Dirac delta function, also known as the unit impulse, is a generalized function on the real numbers, whose value is zero everywhere except at zero, and whose integral over the entire real line is equal to one. Since there is no function having this property, to model the delta "function" rigorously involves the use of limits or, as is common in mathematics, measure theory and the theory of distributions.
In physics, engineering and mathematics, the Fourier transform (FT) is an integral transform that takes as input a function and outputs another function that describes the extent to which various frequencies are present in the original function. The output of the transform is a complex-valued function of frequency. The term Fourier transform refers to both this complex-valued function and the mathematical operation. When a distinction needs to be made, the Fourier transform is sometimes called the frequency domain representation of the original function. The Fourier transform is analogous to decomposing the sound of a musical chord into the intensities of its constituent pitches.
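Under one common convention, the transform of an integrable function $f$ is defined as

$$\hat{f}(\xi) = \int_{-\infty}^{\infty} f(x)\, e^{-i 2\pi \xi x}\, dx,$$

where $\xi$ denotes frequency; other sign and normalization conventions are also in use.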
A logistic function or logistic curve is a common S-shaped curve with the equation

$$f(x) = \frac{L}{1 + e^{-k(x - x_0)}},$$

where $L$ is the curve's supremum (carrying capacity), $k$ is the logistic growth rate, and $x_0$ is the value of $x$ at the curve's midpoint.
In statistics and control theory, Kalman filtering, also known as linear quadratic estimation (LQE), is an algorithm that uses a series of measurements observed over time, including statistical noise and other inaccuracies, and produces estimates of unknown variables that tend to be more accurate than those based on a single measurement alone, by estimating a joint probability distribution over the variables for each timeframe. The filter is named after Rudolf E. Kálmán, who was one of the primary developers of its theory.
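A minimal sketch of the predict/update cycle for a one-dimensional constant-signal model, with illustrative variable names and made-up noise parameters; it shows the general shape of the algorithm rather than any particular library's API:

```python
def kalman_1d(measurements, process_var=1e-4, meas_var=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a constant-state model (x_k = x_{k-1} + noise)."""
    x, p = x0, p0                    # state estimate and its variance
    estimates = []
    for z in measurements:
        # Predict: the state is assumed constant, so only the variance grows.
        p = p + process_var
        # Update: blend the prediction with the new measurement z.
        k = p / (p + meas_var)       # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings of a quantity whose true value is 1.0
print(kalman_1d([1.1, 0.9, 1.05, 0.98, 1.02]))
```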
In the field of mathematical optimization, stochastic programming is a framework for modeling optimization problems that involve uncertainty. A stochastic program is an optimization problem in which some or all problem parameters are uncertain, but follow known probability distributions. This framework contrasts with deterministic optimization, in which all problem parameters are assumed to be known exactly. The goal of stochastic programming is to find a decision which both optimizes some criteria chosen by the decision maker, and appropriately accounts for the uncertainty of the problem parameters. Because many real-world decisions involve uncertainty, stochastic programming has found applications in a broad range of areas ranging from finance to transportation to energy optimization.
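A commonly studied special case is the two-stage linear program with recourse, which in one standard form reads

$$\min_{x}\; c^{\mathsf T} x + \mathbb{E}_{\xi}\!\left[Q(x,\xi)\right] \quad \text{subject to} \quad Ax = b,\; x \ge 0,$$

where the recourse function $Q(x,\xi) = \min_{y \ge 0} \left\{ q(\xi)^{\mathsf T} y : W(\xi)\, y = h(\xi) - T(\xi)\, x \right\}$ gives the optimal cost of the second-stage decision $y$ once the uncertain data $\xi$ have been observed.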
Multiple-criteria decision-making (MCDM) or multiple-criteria decision analysis (MCDA) is a sub-discipline of operations research that explicitly evaluates multiple conflicting criteria in decision making. It is also known as multiple attribute utility theory, multiple attribute value theory, multiple attribute preference theory, and multi-objective decision analysis.
Particle filters, or sequential Monte Carlo methods, are a set of Monte Carlo algorithms used to find approximate solutions to filtering problems for nonlinear state-space systems, such as those arising in signal processing and Bayesian statistical inference. The filtering problem consists of estimating the internal states in dynamical systems when partial observations are made and random perturbations are present in the sensors as well as in the dynamical system. The objective is to compute the posterior distributions of the states of a Markov process, given the noisy and partial observations. The term "particle filters" was first coined in 1996 by Pierre Del Moral in reference to mean-field interacting particle methods used in fluid mechanics since the beginning of the 1960s. The term "Sequential Monte Carlo" was coined by Jun S. Liu and Rong Chen in 1998.
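A compact sketch of the bootstrap (sequential importance resampling) variant for a toy one-dimensional random-walk state with Gaussian observation noise; the model and parameter values are illustrative assumptions, not a reference implementation:

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500,
                              process_std=0.5, obs_std=1.0, rng=None):
    """Bootstrap particle filter for a 1-D random-walk state model."""
    rng = np.random.default_rng() if rng is None else rng
    particles = rng.normal(0.0, 1.0, n_particles)    # initial particle cloud
    means = []
    for z in observations:
        # Propagate each particle through the state-transition model.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Weight particles by the likelihood of the observation z.
        weights = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # Posterior-mean estimate of the state at this time step.
        means.append(np.sum(weights * particles))
        # Resample to avoid weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return means

print(bootstrap_particle_filter([0.2, 0.5, 1.1, 0.9, 1.4]))
```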
In probability theory and statistics, the generalized extreme value (GEV) distribution is a family of continuous probability distributions developed within extreme value theory to combine the Gumbel, Fréchet and Weibull families, also known as the type I, II and III extreme value distributions. By the extreme value theorem the GEV distribution is the only possible limit distribution of properly normalized maxima of a sequence of independent and identically distributed random variables. Note that a limit distribution needs to exist, which requires regularity conditions on the tail of the distribution. Despite this, the GEV distribution is often used as an approximation to model the maxima of long (finite) sequences of random variables.
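In the unified parameterization, the cumulative distribution function can be written (for arguments where the bracket is positive) as

$$F(x;\mu,\sigma,\xi) = \exp\!\left\{-\left[1 + \xi\,\frac{x-\mu}{\sigma}\right]^{-1/\xi}\right\}, \qquad \sigma > 0,$$

with the Gumbel case recovered in the limit $\xi \to 0$, while $\xi > 0$ and $\xi < 0$ give the Fréchet and (reversed) Weibull cases respectively.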
An eikonal equation is a non-linear first-order partial differential equation that is encountered in problems of wave propagation.
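In one classical form, the equation reads

$$|\nabla u(x)| = \frac{1}{f(x)}, \qquad x \in \Omega,$$

where $u$ is the unknown (for example, the travel time of a wavefront), $f$ is a positive speed function, and suitable boundary data are prescribed on $\partial\Omega$.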
A Belinski–Khalatnikov–Lifshitz (BKL) singularity is a model of the dynamic evolution of the universe near the initial gravitational singularity, described by an anisotropic, chaotic solution of the Einstein field equation of gravitation. According to this model, the universe is chaotically oscillating around a gravitational singularity in which time and space become equal to zero or, equivalently, the spacetime curvature becomes infinitely big. This singularity is physically real in the sense that it is a necessary property of the solution, and will appear also in the exact solution of those equations. The singularity is not artificially created by the assumptions and simplifications made by the other special solutions such as the Friedmann–Lemaître–Robertson–Walker, quasi-isotropic, and Kasner solutions.
In probability theory, heavy-tailed distributions are probability distributions whose tails are not exponentially bounded: that is, they have heavier tails than the exponential distribution. In many applications it is the right tail of the distribution that is of interest, but a distribution may have a heavy left tail, or both tails may be heavy.
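Formally, a random variable $X$ has a heavy right tail if

$$\lim_{x \to \infty} e^{\lambda x}\, \Pr[X > x] = \infty \quad \text{for all } \lambda > 0,$$

i.e. the tail probability decays more slowly than any exponential.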
In statistics, the generalized Pareto distribution (GPD) is a family of continuous probability distributions. It is often used to model the tails of another distribution. It is specified by three parameters: location $\mu$, scale $\sigma$, and shape $\xi$. Sometimes it is specified by only scale and shape, and sometimes only by its shape parameter. Some references give the shape parameter as $\kappa = -\xi$.
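With these parameters, the cumulative distribution function can be written as

$$F_{(\mu,\sigma,\xi)}(x) = \begin{cases} 1 - \left(1 + \dfrac{\xi (x-\mu)}{\sigma}\right)^{-1/\xi}, & \xi \neq 0,\\[1ex] 1 - \exp\!\left(-\dfrac{x-\mu}{\sigma}\right), & \xi = 0, \end{cases}$$

valid for $x \ge \mu$ when $\xi \ge 0$, and for $\mu \le x \le \mu - \sigma/\xi$ when $\xi < 0$.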
Reaction–diffusion systems are mathematical models that correspond to several physical phenomena. The most common is the change in space and time of the concentration of one or more chemical substances: local chemical reactions in which the substances are transformed into each other, and diffusion which causes the substances to spread out over a surface in space.
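In general form, such a system is a semi-linear parabolic partial differential equation,

$$\partial_t \mathbf{q} = \underline{\underline{D}}\, \nabla^2 \mathbf{q} + \mathbf{R}(\mathbf{q}),$$

where $\mathbf{q}(\mathbf{x},t)$ is the vector of concentrations, $\underline{\underline{D}}$ is a diagonal matrix of diffusion coefficients, and $\mathbf{R}$ accounts for the local reactions.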
Peridynamics is a non-local formulation of continuum mechanics that is oriented toward deformations with discontinuities, especially fractures. Originally, bond-based peridynamics was introduced, in which the internal interaction forces between a material point and all the other points with which it can interact are modeled as a central force field. This type of force field can be imagined as a mesh of bonds connecting each point of the body with every other interacting point within a certain distance, which depends on a material property called the peridynamic horizon. Later, to overcome the limitations that the bond-based framework places on the material's Poisson's ratio, state-based peridynamics was formulated. Its characteristic feature is that the force exchanged between a point and another one is influenced by the deformation state of all the other bonds within its interaction zone.
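In the bond-based formulation, the equation of motion at a material point $\mathbf{x}$ is usually written as

$$\rho(\mathbf{x})\, \ddot{\mathbf{u}}(\mathbf{x},t) = \int_{H_{\mathbf{x}}} \mathbf{f}\big(\mathbf{u}(\mathbf{x}',t) - \mathbf{u}(\mathbf{x},t),\, \mathbf{x}' - \mathbf{x}\big)\, dV_{\mathbf{x}'} + \mathbf{b}(\mathbf{x},t),$$

where $\rho$ is the mass density, $\mathbf{u}$ the displacement field, $\mathbf{f}$ the pairwise bond force density, $H_{\mathbf{x}}$ the neighbourhood of $\mathbf{x}$ with radius equal to the horizon, and $\mathbf{b}$ a prescribed body force density.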
In fluid dynamics, a cnoidal wave is a nonlinear and exact periodic wave solution of the Korteweg–de Vries equation. These solutions are expressed in terms of the Jacobi elliptic function cn, which is why they are termed cnoidal waves. They are used to describe surface gravity waves of fairly long wavelength, as compared to the water depth.
Deng Julong was a professor at Huazhong University of Science and Technology, Wuhan, China. He is acknowledged as the founder of grey system theory, first proposed in 1982 with the publication of his paper “Control problems of grey systems” in the international journal Systems & Control Letters, edited at the time by Roger W. Brockett, a professor at Harvard University. This theory underlies the theory of grey relational analysis. His theory inspired many noted scholars like Jeffrey Yi-Lin Forrest, Liu Sifeng, Wang Jianwei, and Keith W. Hipel, recipient of the Izaak Walton Killam Memorial Prize.
Liu Sifeng is a Chinese systems engineer. He was the director of the Institute for Grey Systems Studies at Nanjing University of Aeronautics and Astronautics, Nanjing, China. He is best known for his work on grey system theory.
Ordinal priority approach (OPA) is a multiple-criteria decision analysis method that aids in solving the group decision-making problems based on preference relations.